Location-based Attention

Introduced by Luong et al. in Effective Approaches to Attention-based Neural Machine Translation

Location-based Attention is an attention mechanism in which the alignment scores are computed solely from the target hidden state $\mathbf{h}_{t}$, as follows:

$$ \mathbf{a}_{t} = \text{softmax}(\mathbf{W}_{a}\mathbf{h}_{t}) $$
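The scoring rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not a full decoder: the hidden-state dimension, number of source positions, and variable names are assumptions for the example, and $\mathbf{W}_{a}$ would normally be a learned parameter.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def location_based_attention(h_t, W_a):
    """Alignment weights a_t = softmax(W_a h_t).

    Unlike content-based attention, the scores depend only on the
    target hidden state h_t, not on the source hidden states.
    """
    return softmax(W_a @ h_t)

# Illustrative shapes (assumed): hidden size 4, 6 source positions.
rng = np.random.default_rng(0)
h_t = rng.standard_normal(4)        # target hidden state
W_a = rng.standard_normal((6, 4))   # learned projection (here random for the sketch)

a_t = location_based_attention(h_t, W_a)  # one weight per source position
```

Because the weights come from a softmax, they are non-negative and sum to 1; `a_t` can then be used directly to form a weighted average of the source hidden states.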


Tasks

| Task | Papers | Share |
| --- | --- | --- |
| Question Answering | 6 | 9.09% |
| Retrieval | 5 | 7.58% |
| Translation | 5 | 7.58% |
| Machine Translation | 4 | 6.06% |
| Sentence | 3 | 4.55% |
| Language Modelling | 3 | 4.55% |
| Text Generation | 2 | 3.03% |
| Automatic Speech Recognition (ASR) | 2 | 3.03% |
| Speech Recognition | 2 | 3.03% |

Components

| Component | Type |
| --- | --- |
| Softmax | Output Functions |
