20 May 2019 • Zheng Wang, Jianwu Li, Ge Song, Tieling Li
Self-attention (SA) mechanisms can effectively capture global dependencies in deep neural networks, and have been successfully applied to natural language processing and image processing.
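To make the idea concrete, below is a minimal NumPy sketch of generic scaled dot-product self-attention; this shows the standard formulation, not necessarily the exact variant used in this paper, and the function name `self_attention` and the random projection matrices are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence.

    x: (seq_len, d_model) input features
    w_q, w_k, w_v: (d_model, d_k) projection matrices
    """
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    # Pairwise similarity between every position and every other position
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Softmax over positions (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of values from ALL positions,
    # which is what lets self-attention model global dependencies.
    return weights @ v

# Usage with random data and hypothetical dimensions
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4
x = rng.standard_normal((seq_len, d_model))
out = self_attention(x,
                     rng.standard_normal((d_model, d_k)),
                     rng.standard_normal((d_model, d_k)),
                     rng.standard_normal((d_model, d_k)))
print(out.shape)  # (5, 4)
```

Because the attention weights couple every position to every other position in a single step, the receptive field is global, in contrast to convolutions or recurrences that propagate information locally.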