Attention Mechanisms

General Attention • 47 methods

Attention mechanisms are components used in neural networks to model long-range interactions, for example across a text in NLP. The key idea is to build shortcuts between a context vector and the input, allowing the model to attend directly to different parts of the input. Below is a continuously updated list of attention mechanisms.
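The "shortcut" idea above can be sketched with scaled dot-product attention, the building block behind many of the methods catalogued here. This is a minimal illustrative sketch, not any specific method's implementation; names and shapes are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v).
    Each query attends to every key, so the output is a weighted
    average of the values: a direct shortcut to all input positions."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # one attention distribution per query
    return weights @ V, weights

# Tiny usage example with random inputs (sizes are arbitrary).
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))   # 2 queries of dimension 4
K = rng.normal(size=(3, 4))   # 3 keys of dimension 4
V = rng.normal(size=(3, 4))   # 3 values of dimension 4
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)              # (2, 4): one output vector per query
```

Because the weights form a probability distribution over input positions, each query can draw on any part of the input regardless of distance, which is what makes attention effective for long-range interaction.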
