Attention Mechanisms

Content-based Attention

Introduced by Graves et al. in Neural Turing Machines

Content-based attention is an attention mechanism that scores the alignment between two vectors by their cosine similarity:

$$f_{att}\left(\textbf{h}_{i}, \textbf{s}_{j}\right) = \cos\left[\textbf{h}_{i};\textbf{s}_{j}\right] = \frac{\textbf{h}_{i} \cdot \textbf{s}_{j}}{\lVert\textbf{h}_{i}\rVert\,\lVert\textbf{s}_{j}\rVert} $$

It was utilised in Neural Turing Machines as part of the Addressing Mechanism.

We produce a normalized attention weighting by taking a softmax over these alignment scores.
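The two steps above (cosine-similarity scoring, then a softmax) can be sketched in NumPy; the function name, shapes, and the small epsilon for numerical safety are illustrative choices, not from the paper:

```python
import numpy as np

def content_based_attention(h, s):
    """Content-based attention weights over rows of h for query s.

    h: (n, d) array of vectors h_i (e.g. memory rows)
    s: (d,) query vector s_j
    Returns an (n,) attention weighting that sums to 1.
    """
    # Alignment scores: cosine similarity between each h_i and s
    # (epsilon guards against division by zero for zero vectors)
    scores = h @ s / (np.linalg.norm(h, axis=1) * np.linalg.norm(s) + 1e-8)
    # Normalize with a softmax (shifted by the max for stability)
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
s = np.array([1.0, 0.0])
w = content_based_attention(h, s)
```

Here the first row of `h` points in the same direction as `s` (cosine similarity 1), so it receives the largest weight after the softmax.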

Source: Neural Turing Machines
