Attention Mechanisms

General • Attention • 86 methods

Attention Mechanisms are neural network components used to model long-range interactions, for example across a text in NLP. The key idea is to build direct shortcuts between a context vector and the input, allowing the model to attend to the most relevant parts of the input when producing each output. Below is a continuously updated list of attention mechanisms.
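To make the "shortcut" idea concrete, here is a minimal sketch of one common instantiation, scaled dot-product attention: each query scores all keys, the scores are normalized with a softmax, and the output is a weighted sum of the values. The function and variable names are illustrative, not tied to any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # query-key similarities
    weights = softmax(scores, axis=-1)  # each query's weights sum to 1
    return weights @ V                  # context vectors: weighted sums of values

# toy example: 2 queries attending over 3 key/value pairs
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Each row of `out` is a context vector for one query, built as a convex combination of the value rows, which is exactly the shortcut between context and input described above.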

Method | Year | Papers

(Table of 86 attention methods with publication year and paper counts; the method-name column did not survive extraction.)