Attention Modules

General • Attention • 42 methods

Attention modules are network components that incorporate an attention mechanism. For example, multi-head attention runs several attention heads in parallel over the same input and combines their outputs. Below you can find a continuously updated list of attention modules.
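To make the multi-head example concrete, here is a minimal sketch of a multi-head self-attention module in PyTorch. The class name, dimensions, and hyperparameters are illustrative assumptions, not a reference implementation of any specific method listed below.

```python
# Minimal multi-head self-attention sketch (assumes PyTorch is installed).
import torch
import torch.nn.functional as F
from torch import nn


class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One projection produces queries, keys, and values for all heads at once.
        self.qkv_proj = nn.Linear(embed_dim, 3 * embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq_len, embed_dim = x.shape
        qkv = self.qkv_proj(x)  # (batch, seq_len, 3 * embed_dim)
        qkv = qkv.view(batch, seq_len, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (batch, heads, seq_len, head_dim)

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, heads, seq_len, head_dim)

        # Concatenate the heads and mix them with a final linear projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(context)


if __name__ == "__main__":
    attn = MultiHeadSelfAttention(embed_dim=64, num_heads=4)
    tokens = torch.randn(2, 10, 64)  # (batch, sequence length, embedding dim)
    print(attn(tokens).shape)  # torch.Size([2, 10, 64])
```

Each head attends over the same sequence with its own learned projection, which is what distinguishes a multi-head module from a single attention head.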

[Table: Method, Year, Papers — 42 attention methods introduced between 2017 and 2022, each with its number of associated papers; method names are not preserved in this extract.]