Attention Modules

Attention modules are network components that apply one or more attention mechanisms. For example, multi-head attention runs several attention heads in parallel and combines their outputs. Below is a continuously updated list of attention modules; a minimal sketch of multi-head attention follows the table.

METHOD                             YEAR  PAPERS
Multi-Head Attention               2017    2287
SAGAN Self-Attention Module        2018      33
Spatial Attention Module           2018      19
Channel Attention Module           2018      10
DV3 Attention Block                2017       9
Spatial Attention-Guided Mask      2019       6
Single-Headed Attention            2019       2
Point-wise Spatial Attention       2018       2
Attention-augmented Convolution    2019       1
Global Context Block               2019       1
CBAM                               2018       1
Multi-Head Linear Attention        2020       1
Graph Self-Attention               2019       1
LAMA                               2019       1
Compact Global Descriptor          2019       1
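
For reference, here is a minimal sketch of the multi-head attention pattern in PyTorch. It is illustrative only, not the reference implementation of any module listed above; the class name `MultiHeadAttention` and the parameters `d_model` and `num_heads` are chosen for this example.

```python
# Minimal multi-head attention sketch (illustrative, not any listed module's reference code).
import math
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Separate projections for queries, keys, and values, plus an output projection.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, query, key, value):
        # query/key/value: (batch, seq_len, d_model)
        batch, seq_len, _ = query.shape

        def split_heads(x):
            # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, d_head)
            return x.view(batch, -1, self.num_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.q_proj(query))
        k = split_heads(self.k_proj(key))
        v = split_heads(self.v_proj(value))

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        weights = scores.softmax(dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, d_head)

        # Merge the heads back together and apply the output projection.
        context = context.transpose(1, 2).contiguous().view(batch, seq_len, -1)
        return self.out_proj(context)


if __name__ == "__main__":
    attn = MultiHeadAttention(d_model=64, num_heads=8)
    x = torch.randn(2, 10, 64)   # self-attention: query = key = value
    out = attn(x, x, x)
    print(out.shape)             # torch.Size([2, 10, 64])
```

Each head attends over the full sequence with its own learned projections; splitting `d_model` across heads keeps the total parameter count comparable to single-head attention.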