Entropy Minimized Ensemble of Adapters (EMEA) is a method that optimizes the ensemble weights of pretrained language adapters for each test sentence by minimizing the entropy of the model's predictions. The intuition behind the method is that a good adapter weight $\alpha$ for a test input $x$ should make the model more confident in its prediction for $x$; that is, it should lead to lower entropy in the model's output distribution for that input.
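A minimal sketch of this test-time loop is shown below, under a simplifying assumption: each adapter's final logits for the input are mixed directly by $\alpha$, whereas the paper ensembles adapter outputs inside the model's layers. The function name `emea_weights` and the argument `logits_per_adapter` are hypothetical, not from the paper's code.

```python
import torch

def emea_weights(logits_per_adapter: torch.Tensor,
                 steps: int = 10, lr: float = 0.1) -> torch.Tensor:
    """Optimize ensemble weights over adapters for ONE test input
    by minimizing the entropy of the ensembled prediction.

    logits_per_adapter: shape (num_adapters, num_classes), the model's
        logits for the test sentence under each language adapter.
    Returns the softmax-normalized ensemble weights alpha.
    """
    num_adapters = logits_per_adapter.size(0)
    # Unnormalized weights, initialized uniformly (zeros -> uniform softmax).
    w = torch.zeros(num_adapters, requires_grad=True)
    opt = torch.optim.SGD([w], lr=lr)
    for _ in range(steps):
        alpha = torch.softmax(w, dim=0)                       # ensemble weights
        mixed = (alpha[:, None] * logits_per_adapter).sum(0)  # weighted logits
        probs = torch.softmax(mixed, dim=-1)
        entropy = -(probs * torch.log(probs + 1e-12)).sum()   # prediction entropy
        opt.zero_grad()
        entropy.backward()   # gradient flows only into w; the model is frozen
        opt.step()
    return torch.softmax(w.detach(), dim=0)
```

Because $\alpha$ is optimized per input, this loop runs independently for every test sentence, trading extra inference-time compute for a better adapter combination on unseen language varieties.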
Source: Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
Task | Papers | Share
---|---|---
Machine Translation | 1 | 20.00%
NMT | 1 | 20.00%
Cross-Lingual Transfer | 1 | 20.00%
Named Entity Recognition (NER) | 1 | 20.00%
Part-Of-Speech Tagging | 1 | 20.00%