RoBERTa is an extension of BERT with changes to the pretraining procedure. The modifications include:

- training the model longer, with bigger batches, over more data;
- removing the next-sentence prediction objective;
- training on longer sequences;
- dynamically changing the masking pattern applied to the training data (a minimal sketch follows this list).
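As a rough illustration of the last point, the sketch below re-samples a BERT-style 80/10/10 mask on every call, so each training epoch sees a different masking pattern rather than one fixed at preprocessing time. The function name, signature, and constants are illustrative assumptions, not code from the RoBERTa release.

```python
import random

def dynamic_mask(token_ids, mask_token_id, vocab_size, mask_prob=0.15):
    """Re-sample a BERT-style mask on every call (dynamic masking),
    instead of fixing one mask during preprocessing.

    Illustrative sketch: a real implementation would also skip
    special tokens such as <s>, </s>, and padding.
    """
    masked = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 = position ignored by the loss
    for i, tok in enumerate(token_ids):
        if random.random() < mask_prob:
            labels[i] = tok  # the model must predict the original token here
            r = random.random()
            if r < 0.8:
                masked[i] = mask_token_id                 # 80%: <mask> token
            elif r < 0.9:
                masked[i] = random.randrange(vocab_size)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return masked, labels
```

Calling `dynamic_mask` twice on the same sequence yields different masks, which is the point of the modification: the model rarely sees the same corruption of a sentence twice.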
The tasks on which RoBERTa is most frequently used, by paper count and share:

Task | Papers | Share
---|---|---
Language Modelling | 72 | 8.25%
Sentence | 49 | 5.61%
Sentiment Analysis | 44 | 5.04%
Question Answering | 37 | 4.24%
Text Classification | 36 | 4.12%
Classification | 22 | 2.52%
NER | 19 | 2.18%
Parameter-Efficient Fine-Tuning | 17 | 1.95%
Decoder | 16 | 1.83%