Attention Mechanisms

General Attention • 82 methods

Attention mechanisms are a component used in neural networks to model long-range interactions, for example across a text in NLP. The key idea is to build shortcuts between a context vector and the input, allowing the model to attend to different parts of the input when producing each output. Below is a continuously updated list of attention mechanisms.
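As a concrete illustration of the "context vector attends over the input" idea, here is a minimal NumPy sketch of scaled dot-product attention, the variant popularized by the Transformer. The function and variable names are illustrative, not tied to any particular library's API:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v).

    Each query scores every key, the scores are normalized to a
    probability distribution, and the output is the corresponding
    weighted sum of values: a context vector per query.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # (n_q, d_v) context vectors

# Toy example: 2 queries attending over 3 key/value pairs.
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

The 1/sqrt(d) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into a regime with vanishingly small gradients.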

Subcategories

Method Year Papers
(Table of 82 attention methods with publication year and paper count; the method-name column was lost during extraction, so the per-method rows are omitted here.)