Adaptive Input Embeddings extend the adaptive softmax to input word representations. The factorization assigns more capacity to frequent words and less to infrequent ones, which reduces overfitting to rare words.
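As a rough illustration of the factorization, the sketch below partitions a frequency-sorted vocabulary into clusters, gives each successive cluster a smaller embedding dimension, and projects every lookup back to a shared model dimension. The cluster cutoffs, reduction `factor`, and initialization here are hypothetical choices, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Vocabulary is assumed sorted by frequency: ids [0, 10) are the most
# frequent words, [10, 100) the next band, and so on (hypothetical cutoffs).
cutoffs = [10, 100, 1000]
d_model = 64
factor = 4  # each later cluster gets 1/factor of the previous dimension

# Per cluster: (low_id, high_id, embedding table, up-projection to d_model).
clusters = []
prev = 0
for i, cut in enumerate(cutoffs):
    dim = d_model // (factor ** i)  # 64, 16, 4 -> less capacity for rare words
    emb = rng.normal(scale=0.02, size=(cut - prev, dim))
    proj = rng.normal(scale=0.02, size=(dim, d_model))
    clusters.append((prev, cut, emb, proj))
    prev = cut

def adaptive_embed(token_ids):
    """Look up each token in its frequency cluster, then project to d_model."""
    out = np.zeros((len(token_ids), d_model))
    for j, t in enumerate(token_ids):
        for lo, hi, emb, proj in clusters:
            if lo <= t < hi:
                out[j] = emb[t - lo] @ proj
                break
    return out

vecs = adaptive_embed([3, 42, 500])  # one token from each cluster
print(vecs.shape)  # (3, 64): all clusters map into the same d_model space
```

Note that the rare-word cluster stores 4-dimensional vectors rather than 64-dimensional ones, so most of the parameter budget goes to the frequent words that dominate the training data.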
Source: Adaptive Input Representations for Neural Language Modeling
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 38 | 38.78% |
| Machine Translation | 6 | 6.12% |
| Speech Recognition | 5 | 5.10% |
| Paraphrase Identification | 3 | 3.06% |
| Text Generation | 3 | 3.06% |
| Automatic Speech Recognition (ASR) | 3 | 3.06% |
| Reinforcement Learning (RL) | 2 | 2.04% |
| Music Generation | 2 | 2.04% |
| Abstractive Text Summarization | 2 | 2.04% |
| Component | Type |
|---|---|
| 🤖 No Components Found | You can add them if they exist; e.g. Mask R-CNN uses RoIAlign |