GPT-4 is a transformer-based model pre-trained to predict the next token in a document.
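The next-token prediction objective can be illustrated with a minimal sketch: the model emits a vector of vocabulary scores (logits) at each position, and training minimizes the cross-entropy between those scores and the actual next token. This is a generic toy illustration of the objective, not GPT-4's actual implementation; the function name and toy numbers are invented for the example.

```python
import math

def next_token_loss(logits, token_ids):
    """Average cross-entropy of predicting each next token.

    logits[t] scores the whole vocabulary for position t + 1;
    token_ids holds the ground-truth token indices.
    """
    total = 0.0
    # Position t's logits predict token_ids[t + 1] (teacher forcing).
    for t in range(len(token_ids) - 1):
        scores = logits[t]
        # Log-sum-exp with max-subtraction for numerical stability.
        m = max(scores)
        log_z = m + math.log(sum(math.exp(s - m) for s in scores))
        total += log_z - scores[token_ids[t + 1]]
    return total / (len(token_ids) - 1)

# Toy example: vocabulary of 3 tokens, sequence of 3 tokens.
logits = [[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]]
loss = next_token_loss(logits, [0, 0, 1])
```

A lower loss means the model assigns higher probability to the tokens that actually follow; pre-training drives this average down over a large text corpus.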
Source: GPT-4 Technical Report
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 80 | 11.82% |
| Large Language Model | 48 | 7.09% |
| Question Answering | 40 | 5.91% |
| Retrieval | 27 | 3.99% |
| In-Context Learning | 25 | 3.69% |
| Code Generation | 18 | 2.66% |
| Benchmarking | 16 | 2.36% |
| Sentence | 14 | 2.07% |
| Decision Making | 13 | 1.92% |