Attention Modules

General • Attention • 42 methods

Attention Modules are neural network components that incorporate attention mechanisms. For example, multi-head attention runs several attention heads in parallel and combines their outputs into a single representation. Below you can find a continuously updating list of attention modules.
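To make the multi-head attention example concrete, here is a minimal sketch, assuming PyTorch; the class and variable names are illustrative, not taken from any particular library.

```python
# A minimal sketch of multi-head attention: scaled dot-product attention
# computed independently on several heads, then concatenated and projected.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Linear projections for queries, keys, values, plus an output projection.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        batch, seq_len, d_model = x.shape

        def split(t: torch.Tensor) -> torch.Tensor:
            # Reshape to (batch, num_heads, seq_len, d_head).
            return t.view(batch, seq_len, self.num_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))
        # Scaled dot-product attention within each head.
        scores = q @ k.transpose(-2, -1) / (self.d_head ** 0.5)
        weights = F.softmax(scores, dim=-1)
        attended = weights @ v  # (batch, num_heads, seq_len, d_head)
        # Concatenate the heads and project back to d_model.
        out = attended.transpose(1, 2).reshape(batch, seq_len, d_model)
        return self.out_proj(out)


if __name__ == "__main__":
    mha = MultiHeadAttention(d_model=64, num_heads=8)
    x = torch.randn(2, 10, 64)   # (batch, seq_len, d_model)
    print(mha(x).shape)          # torch.Size([2, 10, 64])
```

The key design choice is that each head attends over the same sequence but with its own learned projection of queries, keys, and values, so different heads can capture different relationships before their outputs are merged.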

(Method list: each entry gives the method name, its introduction year, and the number of papers using it.)