RoBERTa is an extension of BERT with changes to the pretraining procedure. The modifications include:

- training the model longer, with bigger batches, over more data;
- removing the next-sentence prediction objective;
- training on longer sequences;
- dynamically changing the masking pattern applied to the training data.
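The last change, dynamic masking, means a new masking pattern is generated every time a sequence is fed to the model, rather than fixing one pattern per sequence during preprocessing as BERT did. A minimal sketch of the idea (not RoBERTa's actual fairseq implementation; the real procedure also leaves 10% of selected tokens unchanged and replaces 10% with random tokens, which this simplification omits):

```python
import random

def dynamic_mask(tokens, mask_token="<mask>", mask_prob=0.15, rng=None):
    """Return (masked_tokens, labels): a copy of `tokens` with roughly
    mask_prob of positions replaced by mask_token, plus the original
    token at each masked position (the MLM targets).

    Calling this freshly on every epoch gives each pass a different
    masking pattern (dynamic masking), whereas static masking fixes
    the pattern once during preprocessing and reuses it every epoch.
    Simplified sketch: no 80/10/10 replacement rule.
    """
    rng = rng or random.Random()
    masked, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # target for the MLM loss
            masked[i] = mask_token   # hide the input token
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
rng = random.Random(0)
# Two "epochs" over the same sentence see different masking patterns:
epoch1, _ = dynamic_mask(tokens, mask_prob=0.5, rng=rng)
epoch2, _ = dynamic_mask(tokens, mask_prob=0.5, rng=rng)
```

Regenerating the mask per pass avoids the duplication trick BERT used (storing ten differently-masked copies of the data) and exposes the model to more masking variants over long training runs.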
Task | Papers | Share |
---|---|---|
Language Modelling | 76 | 8.96% |
Sentence | 56 | 6.60% |
Sentiment Analysis | 42 | 4.95% |
Text Classification | 33 | 3.89% |
Question Answering | 33 | 3.89% |
Classification | 24 | 2.83% |
Named Entity Recognition (NER) | 19 | 2.24% |
NER | 18 | 2.12% |
Natural Language Understanding | 16 | 1.89% |