Location-based Attention is an attention mechanism in which the alignment scores are computed solely from the target hidden state $\mathbf{h}_{t}$, as follows:
$$ \mathbf{a}_{t} = \text{softmax}(\mathbf{W}_{a}\mathbf{h}_{t}) $$
Source: Effective Approaches to Attention-based Neural Machine Translation
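Because the scores come from $\mathbf{h}_{t}$ alone, the mechanism assigns a weight to each source position without comparing against encoder states. The sketch below illustrates this in NumPy; the function name, variable names, and shapes (`src_len`, `hidden_dim`) are illustrative assumptions, not from the source.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def location_based_attention(h_t, W_a):
    """Alignment weights a_t = softmax(W_a h_t).

    h_t : (hidden_dim,) decoder hidden state at step t
    W_a : (src_len, hidden_dim) learned matrix; each row scores one
          source position from the target state alone.
    """
    return softmax(W_a @ h_t)

# Toy usage with made-up sizes.
rng = np.random.default_rng(0)
hidden_dim, src_len = 8, 5
h_t = rng.standard_normal(hidden_dim)
W_a = rng.standard_normal((src_len, hidden_dim))
a_t = location_based_attention(h_t, W_a)
print(a_t, a_t.sum())  # weights over source positions, summing to 1
```

In practice $\mathbf{W}_{a}$ is learned jointly with the rest of the network, and the resulting weights $\mathbf{a}_{t}$ are used to form a context vector over the source annotations.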
Tasks in which papers use this mechanism, by paper count:

| Task | Papers | Share |
|---|---|---|
| Question Answering | 6 | 6.67% |
| Retrieval | 5 | 5.56% |
| Translation | 5 | 5.56% |
| Decoder | 4 | 4.44% |
| Machine Translation | 4 | 4.44% |
| Sentence | 3 | 3.33% |
| Language Modeling | 3 | 3.33% |
| Language Modelling | 3 | 3.33% |
| Denoising | 2 | 2.22% |