Attention Modules

General • Attention • 42 methods

Attention modules are network components that incorporate attention mechanisms. For example, multi-head attention is a module that runs several attention heads in parallel and combines their outputs (a sketch follows below). Below you can find a continuously updated list of attention modules.
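The following is a minimal sketch of a multi-head attention module, assuming PyTorch; the class and parameter names are illustrative and do not correspond to any specific method in the list below.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    """Minimal multi-head scaled dot-product attention (illustrative sketch)."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear projection each for queries, keys, values, plus an output projection.
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, query, key, value):
        # Inputs: (batch, seq_len, embed_dim)
        batch, q_len, _ = query.shape
        k_len = key.shape[1]

        def split_heads(x, seq_len):
            # (batch, seq_len, embed_dim) -> (batch, num_heads, seq_len, head_dim)
            return x.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q = split_heads(self.q_proj(query), q_len)
        k = split_heads(self.k_proj(key), k_len)
        v = split_heads(self.v_proj(value), k_len)

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, q_len, head_dim)

        # Recombine heads and project back to embed_dim.
        context = context.transpose(1, 2).reshape(batch, q_len, -1)
        return self.out_proj(context)

# Example usage: self-attention over a batch of 2 sequences of length 5.
x = torch.randn(2, 5, 64)
attn = MultiHeadAttention(embed_dim=64, num_heads=8)
out = attn(x, x, x)
print(out.shape)  # torch.Size([2, 5, 64])
```

In practice one would typically reach for torch.nn.MultiheadAttention rather than hand-rolling this, but the sketch makes the head-splitting and recombination explicit.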

Method | Year | Papers
[Table of the 42 attention modules: method names were not recovered from the page; the entries span 2017 to 2022, with associated paper counts ranging from 1 up to 19,930 for the most-cited 2017 entry.]