MobileBERT is a compact BERT variant that compresses and accelerates the popular BERT model. It is a thin version of BERT_LARGE, equipped with bottleneck structures and a carefully designed balance between self-attention and feed-forward networks. To train MobileBERT, we first train a specially designed teacher model, an inverted-bottleneck BERT_LARGE (IB-BERT); we then transfer knowledge from this teacher to MobileBERT through layer-to-layer imitation. Like the original BERT, MobileBERT is task-agnostic: it can be applied generically to various downstream NLP tasks via simple fine-tuning.
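The bottleneck design can be made concrete with a small sketch. Below is a minimal PyTorch rendering of one MobileBERT-style block, not the reference implementation: linear layers shrink the wide inter-block features before self-attention, several small stacked feed-forward networks keep the attention/FFN compute balanced at the narrow width, and a final linear layer projects back up for the residual connection. The sizes (512 inter-block, 128 intra-block, 4 heads, 4 stacked FFNs) follow the configuration reported in the paper; the normalization and activation details here are simplified assumptions.

```python
import torch
import torch.nn as nn

class BottleneckTransformerBlock(nn.Module):
    """Sketch of a MobileBERT-style bottleneck block: the block's
    inter-block features stay wide, while a linear bottleneck lets
    self-attention and the stacked FFNs run at a narrower width."""

    def __init__(self, inter_size=512, intra_size=128, num_heads=4, num_ffn=4):
        super().__init__()
        self.down = nn.Linear(inter_size, intra_size)  # input bottleneck
        self.attn = nn.MultiheadAttention(intra_size, num_heads, batch_first=True)
        self.attn_norm = nn.LayerNorm(intra_size)
        # Several small FFNs are stacked to restore the balance between
        # self-attention and feed-forward computation at the narrow width.
        self.ffns = nn.ModuleList([
            nn.Sequential(
                nn.Linear(intra_size, intra_size * 4),
                nn.ReLU(),
                nn.Linear(intra_size * 4, intra_size),
            )
            for _ in range(num_ffn)
        ])
        self.ffn_norms = nn.ModuleList([nn.LayerNorm(intra_size) for _ in range(num_ffn)])
        self.up = nn.Linear(intra_size, inter_size)    # output bottleneck
        self.out_norm = nn.LayerNorm(inter_size)

    def forward(self, x):                       # x: (batch, seq, inter_size)
        h = self.down(x)
        a, _ = self.attn(h, h, h)
        h = self.attn_norm(h + a)
        for ffn, norm in zip(self.ffns, self.ffn_norms):
            h = norm(h + ffn(h))
        return self.out_norm(x + self.up(h))    # residual at the wide width
```

Because MobileBERT is task-agnostic, fine-tuning follows the usual BERT recipe. A minimal sketch using the released Hugging Face checkpoint `google/mobilebert-uncased` (the two-label classification head is an arbitrary illustration, not something prescribed by the paper):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "google/mobilebert-uncased", num_labels=2
)

inputs = tokenizer("MobileBERT runs on my phone.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2]); head is untrained, ready for fine-tuning
```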
Source: MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 2 | 15.38% |
| Large Language Model | 1 | 7.69% |
| Phishing Website Detection | 1 | 7.69% |
| Bayesian Optimization | 1 | 7.69% |
| Model Compression | 1 | 7.69% |
| Hate Speech Detection | 1 | 7.69% |
| Intent Detection | 1 | 7.69% |
| Natural Language Understanding | 1 | 7.69% |
| Sentence | 1 | 7.69% |