RoBERTa is an extension of BERT with changes to the pretraining procedure. The modifications include:

- training the model longer, with bigger batches, over more data;
- removing the next sentence prediction objective;
- training on longer sequences;
- dynamically changing the masking pattern applied to the training data (see the sketch after this list), instead of fixing it once during preprocessing;
- using a larger byte-level BPE vocabulary of 50K subword units, rather than BERT's character-level BPE vocabulary.
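A minimal sketch of dynamic masking, with a hypothetical `dynamic_mask` helper (the 80/10/10 mask/random/keep split of the original masked-language-modelling objective is omitted for brevity): a fresh subset of tokens is selected each time a sequence is served, so the model sees a different masking pattern on every pass over the data.

```python
import random

MASK_TOKEN = "<mask>"  # RoBERTa's mask token

def dynamic_mask(tokens, mask_prob=0.15, rng=random):
    """Return a freshly masked copy of `tokens`.

    Re-invoked every time a sequence is fed to the model, so each
    epoch sees a different masking pattern -- unlike BERT's static
    masking, which is fixed once during data preprocessing.
    """
    masked = list(tokens)
    for i in range(len(masked)):
        if rng.random() < mask_prob:
            masked[i] = MASK_TOKEN
    return masked

tokens = "the quick brown fox jumps over the lazy dog".split()
for epoch in range(3):
    print(epoch, dynamic_mask(tokens))  # different positions masked each pass
```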
Usage of RoBERTa across tasks, by number of papers:

Task | Papers | Share |
---|---|---|
Language Modelling | 74 | 9.26% |
Sentiment Analysis | 38 | 4.76% |
Question Answering | 33 | 4.13% |
Text Classification | 31 | 3.88% |
Classification | 25 | 3.13% |
Natural Language Understanding | 24 | 3.00% |
Named Entity Recognition (NER) | 20 | 2.50% |
NER | 16 | 2.00% |
Test | 16 | 2.00% |
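Since classification-style tasks account for a large share of the list above, here is a minimal fine-tuning setup sketch, assuming the Hugging Face `transformers` library and the `roberta-base` checkpoint; the label count and input text are illustrative:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the pretrained checkpoint with a freshly initialized classification head.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2  # e.g. binary sentiment
)

# Tokenize one example and run a forward pass.
inputs = tokenizer("RoBERTa drops next sentence prediction.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, num_labels)
print(logits)
```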