Attention Modules

General • Attention • 42 methods

Attention modules are neural network components built around attention mechanisms. For example, multi-head attention runs several attention heads in parallel over the same input and concatenates their outputs, letting the model attend to information from different representation subspaces; a minimal sketch of this computation is shown below. Further down you can find a continuously updating list of attention modules.
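
As a concrete illustration, here is a minimal NumPy sketch of scaled dot-product and multi-head attention following the standard formulation from "Attention Is All You Need" (2017). The function names, random weights, and the omission of masking, dropout, and batching are simplifications for illustration, not the API of any particular library.

```python
# Minimal sketch of multi-head attention (illustrative assumptions:
# single sequence, no masking, no dropout, random projection weights).
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (..., seq_len, d_head)
    d_head = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_head)  # (..., seq_q, seq_k)
    weights = softmax(scores, axis=-1)                 # attention distribution
    return weights @ v                                 # (..., seq_q, d_head)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    # x: (seq_len, d_model); w_*: (d_model, d_model) projection matrices.
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(t):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)
    heads = scaled_dot_product_attention(q, k, v)      # per-head attention
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o                                # final output projection

# Usage example with random weights (illustrative only).
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 64, 8, 10
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (10, 64)
```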

Method | Year | Papers