Attention

General • 115 methods

Attention is a technique for attending to different parts of an input when computing an output, which lets a model capture long-range dependencies. In the context of NLP, traditional sequence-to-sequence models compressed the entire input sequence into a single fixed-length context vector, which limited their ability to handle long inputs such as lengthy sentences. Attention instead creates shortcuts between the context vector and the entire source input, so each output step can draw on whichever input positions are most relevant. Below you will find a continuously updating list of attention-based building blocks used in deep learning.
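To make the idea concrete, here is a minimal NumPy sketch of scaled dot-product attention, the variant popularized by the Transformer. It is an illustration under assumptions, not the implementation of any particular library: the names `Q`, `K`, `V`, the `softmax` helper, and the chosen shapes are all illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns the attended values (n_queries, d_v) and the weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of each query to each key
    weights = softmax(scores, axis=-1)    # one distribution over source positions per query
    return weights @ V, weights

# Illustrative usage: 3 query positions attending over 5 source positions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((5, 8))
V = rng.standard_normal((5, 16))
out, weights = scaled_dot_product_attention(Q, K, V)
print(out.shape, weights.shape)  # (3, 16) (3, 5)
```

The weight matrix is exactly the "shortcut" described above: rather than squeezing the source into one vector, each query position gets its own softmax distribution over every source position and mixes the values accordingly.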

Subcategories

[Methods table: each entry lists a method, the year it was introduced, and the number of associated papers; the method-name column did not survive extraction.]