Attention

Attention is a technique for attending to different parts of an input sequence in order to capture long-range dependencies. Within the context of NLP, traditional sequence-to-sequence models compressed the entire input sequence into a single fixed-length context vector, which hindered their ability to handle long inputs such as lengthy sentences. In contrast, attention creates shortcuts between the decoder and every position of the source input, so the context is recomputed as a weighted combination of the source at each step. Below is a continuously updated list of attention-based building blocks used in deep learning.
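As a minimal illustration of the mechanism, the sketch below implements scaled dot-product attention (one of the building blocks listed in the table) in plain NumPy. The function name, shapes, and toy inputs are illustrative assumptions for this sketch, not the API of any particular library.

```python
# Minimal sketch of scaled dot-product attention, assuming queries Q,
# keys K, and values V are already projected to matching dimensions.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> output (n_q, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled to keep the softmax stable.
    scores = Q @ K.T / np.sqrt(d_k)                      # (n_q, n_k)
    # Row-wise softmax turns scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the values: the "shortcut" from
    # every source position to the current query position.
    return weights @ V

# Toy usage: 2 queries attending over 4 source positions with 8-dim vectors.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 8)
```

Many of the entries below (e.g. Multi-Head Attention, the sparse and sliding-window variants) can be read as modifications of this basic pattern: changing how the scores are computed, which key positions each query may attend to, or how many such maps run in parallel.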

METHOD    YEAR    PAPERS
Multi-Head Attention    2017    3913
Scaled Dot-Product Attention    2017    3851
Additive Attention    2014    110
Dot-Product Attention    2015    75
Spatial Attention Module    2018    52
SAGAN Self-Attention Module    2018    51
Strided Attention    2019    41
Fixed Factorized Attention    2019    41
Content-based Attention    2014    25
Location-based Attention    2015    24
Channel Attention Module    2018    22
Global and Sliding Window Attention    2020    14
Dilated Sliding Window Attention    2020    14
Sliding Window Attention    2020    14
Channel-wise Soft Attention    2017    14
Location Sensitive Attention    2015    14
LSH Attention    2020    8
LAMA    2019    8
DV3 Attention Block    2017    8
Set Transformer    2018    7
Global Context Block    2019    6
Spatial Attention-Guided Mask    2019    5
Multi-Head Linear Attention    2020    5
CA    2021    5
FGA    2019    5
Adaptive Masking    2019    4
Single-Headed Attention    2019    3
Graph Self-Attention    2019    3
Hopfield Layer    2020    3
Multiplicative Attention    2015    2
Attention-augmented Convolution    2019    2
Routing Attention    2020    2
Point-wise Spatial Attention    2018    2
Bottleneck Transformer Block    2021    2
Triplet Attention    2020    2
Factorized Dense Synthesized Attention    2020    1
Random Synthesized Attention    2020    1
Dense Synthesized Attention    2020    1
Factorized Random Synthesized Attention    2020    1
Compact Global Descriptor    2019    1
CBAM    2018    1
Sparse Sinkhorn Attention    2020    1
SortCut Sinkhorn Attention    2020    1
All-Attention Layer    2019    1
DeLighT Block    2020    1
Feedback Memory    2020    1
3D SA    2020    1
DMA    2020    1
ProCAN    2020    1
Differential attention for visual question answering    2000    0