Attention Mechanisms


Attention mechanisms are components used in neural networks to model long-range interactions, for example across a text sequence in NLP. The key idea is to build shortcuts between a context vector and the input, allowing the model to attend to different parts of the input when producing each output. Below you can find a continuously updated list of attention mechanisms.
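To make the "shortcut" idea concrete, here is a minimal NumPy sketch of one common formulation, scaled dot-product attention: each query scores all keys, the scores are normalized with a softmax, and the output is the resulting weighted sum of the values. The function and variable names are illustrative, not from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each query's distribution over keys
    return weights @ V, weights          # weighted sum of values ("shortcut" to the input)

# toy example: 3 query positions attending over 4 key/value positions
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)        # (3, 8): one context vector per query
print(w.sum(axis=-1))   # each row of attention weights sums to 1
```

The sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into a near one-hot regime.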

Subcategories

Method | Year | Papers

(The method names in this table did not survive extraction. The original table lists 82 attention methods, each with its year of introduction and the number of papers using it, spanning 2000–2022; the largest entry dates to 2017 with 16,258 papers.)