no code implementations • 31 Oct 2022 • Zhenzhe Hechen, Wei Huang, Yixin Zhao
Consequently, this paper presents a light self-limited-attention (LSLA) module consisting of a light self-attention mechanism (LSA), which reduces computation cost and parameter count, and a self-limited-attention mechanism (SLA), which improves performance.
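The excerpt does not spell out how LSA saves computation and parameters. A minimal NumPy sketch is shown below under two illustrative assumptions that are not stated in the source: the input tokens serve directly as queries (dropping the Q projection), and keys and values share a single projection matrix. The function name `light_attention` and the weights `w_kv`, `w_out` are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def light_attention(x, w_kv, w_out):
    """Illustrative lightweight attention (assumption, not the paper's
    exact LSA): x itself acts as the query, and keys/values share one
    projection w_kv, so only two weight matrices are needed instead of
    the four (Q, K, V, output) of standard self-attention."""
    d = x.shape[-1]
    kv = x @ w_kv                        # shared key/value projection
    scores = (x @ kv.T) / np.sqrt(d)     # queries are the raw tokens
    return softmax(scores) @ kv @ w_out  # attention-weighted values

# Tiny usage example: 8 tokens with 16-dimensional embeddings.
rng = np.random.default_rng(0)
n, d = 8, 16
x = rng.standard_normal((n, d))
w_kv = rng.standard_normal((d, d)) * 0.1
w_out = rng.standard_normal((d, d)) * 0.1
y = light_attention(x, w_kv, w_out)
print(y.shape)
```

Compared with standard self-attention, this sketch halves the projection parameters, which is the kind of saving the abstract attributes to LSA; the actual mechanism in the paper may differ.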