Transformers

GPT is a Transformer-based architecture and training procedure for natural language processing tasks. Training follows a two-stage procedure: first, a language modeling objective is used on unlabeled data to learn the initial parameters of a neural network model; these parameters are then adapted to a target task using the corresponding supervised objective.

Source: Improving Language Understanding by Generative Pre-Training
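To make the two-stage procedure concrete, here is a minimal sketch in PyTorch. Everything in it is illustrative rather than the paper's setup: TinyGPT, the random tensors standing in for an unlabeled corpus and a labeled dataset, and all hyperparameters are placeholders; positional encodings are omitted; and an encoder with a causal mask stands in for the paper's Transformer decoder.

```python
import torch
import torch.nn as nn

class TinyGPT(nn.Module):
    """Toy causal Transformer: one shared body, two task-specific heads."""
    def __init__(self, vocab_size=100, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.body = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)   # used in stage 1
        self.clf_head = nn.Linear(d_model, n_classes)   # used in stage 2

    def forward(self, tokens):
        # Causal mask so each position attends only to its past.
        T = tokens.size(1)
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        return self.body(self.embed(tokens), mask=mask)

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

# Stage 1: language-modeling objective on unlabeled sequences
# (random ids here stand in for a real corpus): predict token t+1 from tokens <= t.
unlabeled = torch.randint(0, 100, (8, 16))
h = model(unlabeled[:, :-1])
lm_loss = ce(model.lm_head(h).reshape(-1, 100), unlabeled[:, 1:].reshape(-1))
lm_loss.backward()
opt.step()
opt.zero_grad()

# Stage 2: adapt the pretrained parameters to a target task with its
# supervised objective, e.g. classifying from the final position's state.
labeled_x = torch.randint(0, 100, (8, 16))
labeled_y = torch.randint(0, 2, (8,))
h = model(labeled_x)
task_loss = ce(model.clf_head(h[:, -1]), labeled_y)
task_loss.backward()
opt.step()
opt.zero_grad()
```

The key design point the sketch illustrates is that both stages update the same body parameters; only the output head changes between pre-training and fine-tuning.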

Tasks

Task                    Papers   Share
Language Modelling          74   8.79%
Large Language Model        43   5.11%
Question Answering          35   4.16%
Text Generation             28   3.33%
Retrieval                   24   2.85%
Prompt Engineering          23   2.73%
Sentence                    18   2.14%
Fairness                    17   2.02%
RAG                         15   1.78%
