1 code implementation • 8 Dec 2021 • Karam Park, Jae Woong Soh, Nam Ik Cho
We also propose a residual self-attention (RSA) module to further boost performance, which cooperates with the residual structure to produce 3-dimensional attention maps without any additional parameters.
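The excerpt does not give the exact RSA formulation, but the general idea of a parameter-free 3-D attention map derived from a residual branch can be sketched as follows. Everything here (the sigmoid squashing, the gating of the residual before the skip connection) is an illustrative assumption, not the paper's verified design:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_self_attention(x, residual):
    """Hypothetical sketch of a residual self-attention (RSA) step.

    The residual branch itself is squashed into a per-element
    (C, H, W) attention map -- no learned parameters -- and used
    to gate the residual before the skip connection.
    """
    attn = sigmoid(residual)       # 3-D attention map, parameter-free
    return x + attn * residual     # residual structure reused for attention

# toy usage on a (channels, height, width) feature map
x = np.zeros((4, 8, 8))
res = np.ones((4, 8, 8))
out = residual_self_attention(x, res)
```

Because the attention map is computed directly from existing activations, the module adds computation but no weights, which matches the "without additional parameters" claim in the abstract.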