Attention Modules

General Attention • 39 methods

Attention modules are neural network components built around attention mechanisms. For example, multi-head attention runs several attention heads in parallel and combines their outputs. Below is a continuously updated list of attention modules.
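
To make the multi-head attention example concrete, here is a minimal sketch in PyTorch. The class name `MiniMultiHeadAttention`, the tensor shapes, and the fused QKV projection are illustrative choices, not tied to any specific method in the list below.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniMultiHeadAttention(nn.Module):
    """Minimal multi-head self-attention: split the model dimension into
    several heads, run scaled dot-product attention in each head in
    parallel, then concatenate the heads and project the result."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Joint projection for queries, keys, and values, plus an output projection.
        self.qkv_proj = nn.Linear(d_model, 3 * d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        batch, seq_len, d_model = x.shape
        q, k, v = self.qkv_proj(x).chunk(3, dim=-1)

        def split_heads(t):
            # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, d_head)
            return t.view(batch, seq_len, self.num_heads, self.d_head).transpose(1, 2)

        q, k, v = map(split_heads, (q, k, v))

        # Scaled dot-product attention, computed independently in each head.
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, d_head)

        # Merge the heads back together and apply the output projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, d_model)
        return self.out_proj(context)

# Example usage
x = torch.randn(2, 10, 64)                      # (batch, seq_len, d_model)
attn = MiniMultiHeadAttention(d_model=64, num_heads=8)
print(attn(x).shape)                            # torch.Size([2, 10, 64])
```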

Method | Year | Papers

(Table of the attention methods in this category, listing each method's introduction year, from 2017 to 2021, and the number of papers using it; the method names did not survive extraction.)