21 Aug 2022 • Junghun Kim, Yoojin An, Jihie Kim
To improve the attention area, we propose a Focus-Attention (FA) mechanism and a novel Calibration-Attention (CA) mechanism, used in combination with multi-head self-attention.
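For context, the multi-head self-attention that the proposed FA and CA modules plug into can be sketched as below. This is a minimal NumPy sketch of the standard mechanism (Vaswani et al., 2017) only; the FA and CA modules themselves are not specified in this excerpt and are not reproduced here. All names (`multi_head_self_attention`, the weight matrices) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Standard multi-head self-attention.

    x: (seq_len, d_model); w_q, w_k, w_v, w_o: (d_model, d_model).
    Returns the output sequence and the per-head attention maps,
    which are what FA/CA-style mechanisms would refine.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project and split into heads: (num_heads, seq_len, d_head)
    def project(w):
        return (x @ w).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(w_q), project(w_k), project(w_v)

    # Scaled dot-product attention, computed independently per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)   # (num_heads, seq_len, seq_len)
    out = attn @ v                    # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o, attn

rng = np.random.default_rng(0)
d_model, seq_len, heads = 8, 5, 2
x = rng.normal(size=(seq_len, d_model))
ws = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
y, attn = multi_head_self_attention(x, *ws, num_heads=heads)
print(y.shape, attn.shape)  # (5, 8) (2, 5, 5)
```

Each head produces its own attention map over the sequence; mechanisms like FA and CA operate on or alongside these maps to steer where the heads attend.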