Attention Mechanisms

General Attention • 81 methods

Attention mechanisms are components used in neural networks to model long-range interactions, for example across a text in NLP. The key idea is to build shortcuts between a context vector and the input, allowing the model to attend to different parts of the input. Below is a continuously updated list of attention mechanisms.
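The idea of attending over the input can be illustrated with one widely used mechanism from this category, scaled dot-product attention (Vaswani et al., 2017). The sketch below is a minimal NumPy implementation, not taken from any particular library: each query scores every key, the scores are normalized with a softmax, and the output is the resulting weighted sum of values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # query-key similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights         # weighted sum of values

# Toy self-attention example: 3 tokens, dimension 4, Q = K = V = X.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

Here the attention weights `w` are the "shortcuts": every output position mixes information from every input position in a single step, regardless of distance.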

Subcategories

Method | Year | Papers

[Table of subcategory methods with their year of introduction (2014–2022) and paper counts; the method-name column was not preserved in this extract.]