Attention Modules

General • Attention • 42 methods

Attention modules are network components built around attention mechanisms. For example, multi-head attention runs several attention heads in parallel and combines their outputs, as sketched below. Below you can find a continuously updated list of attention modules.

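As an illustration, here is a minimal sketch of a multi-head attention module in PyTorch. This is not any particular paper's implementation; the class, argument, and variable names are illustrative. The idea is to project the input into several heads, apply scaled dot-product attention in each head, and recombine the results.

```python
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Minimal multi-head attention: project inputs into several heads,
    apply scaled dot-product attention per head, then recombine."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # One linear projection each for queries, keys, values, and the output.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch, q_len, _ = query.shape
        k_len = key.shape[1]

        def split_heads(x, seq_len):
            # (batch, seq, d_model) -> (batch, heads, seq, d_head)
            return x.view(batch, seq_len, self.num_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.q_proj(query), q_len)
        k = split_heads(self.k_proj(key), k_len)
        v = split_heads(self.v_proj(value), k_len)

        # Scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        context = weights @ v  # (batch, heads, q_len, d_head)

        # Recombine heads and project back to d_model.
        context = context.transpose(1, 2).reshape(batch, q_len, -1)
        return self.out_proj(context)


# Usage: self-attention over a batch of 2 sequences of length 5.
x = torch.randn(2, 5, 64)
attn = MultiHeadAttention(d_model=64, num_heads=8)
out = attn(x, x, x)
print(out.shape)  # torch.Size([2, 5, 64])
```

Splitting the model dimension across heads keeps the total computation comparable to single-head attention while letting each head attend to different positions or features.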
Method • Year • Papers
[Table of 42 attention modules, listing each method's name, the year it was introduced, and the number of papers using it; entries span 2017–2022.]