Attention Modules

General • Attention • 42 methods

Attention modules are network components that incorporate an attention mechanism. For example, multi-head attention is a module that runs several attention heads in parallel and concatenates their outputs. Below you can find a continuously updating list of attention modules.

[Table: attention methods listed with year of introduction and number of papers using each method.]
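As a concrete illustration of the multi-head attention module mentioned above, here is a minimal sketch in PyTorch. The class name, default dimensions, and the omission of masking, dropout, and cross-attention inputs are simplifications for this example, not a reference implementation of any particular method in the list.

```python
# Minimal sketch of scaled dot-product and multi-head attention (PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F


def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, heads, seq_len, head_dim)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5   # (batch, heads, seq, seq)
    weights = F.softmax(scores, dim=-1)             # attention distribution
    return weights @ v                              # weighted sum of values


class MultiHeadAttention(nn.Module):
    def __init__(self, d_model=512, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        # One projection per role (query, key, value) plus an output projection.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, x):
        b, n, d = x.shape

        # Project, then split the model dimension into separate heads.
        def split(t):
            return t.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split(self.w_q(x)), split(self.w_k(x)), split(self.w_v(x))
        out = scaled_dot_product_attention(q, k, v)  # each head attends independently
        # Concatenate heads back into the model dimension and mix them.
        out = out.transpose(1, 2).contiguous().view(b, n, d)
        return self.w_o(out)


# Example: a batch of 2 sequences, 10 tokens each, model width 512.
x = torch.randn(2, 10, 512)
attn = MultiHeadAttention(d_model=512, num_heads=8)
print(attn(x).shape)  # torch.Size([2, 10, 512])
```

Each head attends over the sequence independently in a lower-dimensional subspace (head_dim = d_model / num_heads), and the final linear layer mixes the concatenated head outputs back into the model dimension.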