Linformer

Introduced by Wang et al. in Linformer: Self-Attention with Linear Complexity

Linformer is a Transformer variant that uses a linear self-attention mechanism to address the quadratic bottleneck of standard self-attention, reducing its time and space complexity from O(n²) to O(n) in the sequence length n. The original scaled dot-product attention is decomposed into multiple smaller attentions through linear projections, such that the combination of these operations forms a low-rank factorization of the original attention.
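A minimal single-head sketch of this idea in PyTorch, assuming learned projection matrices (called E and F in the paper) of shape (k, n) that compress the keys and values along the length axis before the softmax; the class name, parameter names, and initialization here are illustrative, not the paper's reference implementation:

```python
import torch
import torch.nn as nn


class LinformerSelfAttention(nn.Module):
    """Single-head Linformer-style attention: keys and values are
    projected from sequence length n down to k, so the attention
    map is n x k rather than n x n."""

    def __init__(self, dim, seq_len, k=64):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        # E and F: learned (k, n) projections over the sequence axis
        # (assumed random-normal init here for illustration)
        self.proj_k = nn.Parameter(torch.randn(k, seq_len) / seq_len ** 0.5)
        self.proj_v = nn.Parameter(torch.randn(k, seq_len) / seq_len ** 0.5)

    def forward(self, x):  # x: (batch, n, dim)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        k = torch.einsum('kn,bnd->bkd', self.proj_k, k)  # (batch, k, dim)
        v = torch.einsum('kn,bnd->bkd', self.proj_v, v)  # (batch, k, dim)
        # attention map is (batch, n, k): linear in n for fixed k
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v  # (batch, n, dim)


x = torch.randn(2, 512, 64)
out = LinformerSelfAttention(dim=64, seq_len=512, k=64)(x)  # (2, 512, 64)
```

Because the softmax is taken over only k projected positions, the n x k attention map times the compressed values is the low-rank factorization described above.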

Source: Linformer: Self-Attention with Linear Complexity
