Attention Modules

General • 21 methods

Attention modules are model components that incorporate attention mechanisms. For example, multi-head attention applies several attention heads in parallel and concatenates their outputs before a final linear projection. Below you can find a continuously updated list of attention modules.

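As a concrete illustration of the multi-head pattern mentioned above, the following NumPy sketch runs several scaled dot-product attention heads in parallel and concatenates their outputs. It is a minimal sketch under assumed conventions: names such as multi_head_attention, d_model, and num_heads are placeholders for this example, not references to any specific paper or library implementation.

```python
# Minimal sketch of multi-head attention (illustrative only; the function and
# dimension names here are assumptions, not a specific library's API).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (num_heads, seq_len, d_head)
    d_head = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq, seq)
    weights = softmax(scores, axis=-1)                    # attention weights per head
    return weights @ v                                    # (heads, seq, d_head)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    # x: (seq_len, d_model); w_q, w_k, w_v, w_o: (d_model, d_model)
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(t):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split_heads(x @ w_q), split_heads(x @ w_k), split_heads(x @ w_v)
    heads = scaled_dot_product_attention(q, k, v)                 # per-head outputs
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)   # concatenate heads
    return concat @ w_o                                           # final projection

# Usage: random weights stand in for learned parameters.
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 64, 8, 10
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (10, 64)
```
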
Year    Papers
2017    4228
2018    60
2018    51
2018    23
2019    9
2017    8
2019    6
2020    6
2019    5
2019    3
2019    3
2020    3
2019    2
2018    2
2021    2
2020    2
2018    1
2019    1
2019    1
2020    1
2020    1