no code implementations • 15 Dec 2021 • Zhiqi Lee, Sumin Qi, Chongchong Fan, Ziwei Xie
A Channel Attention (CA) Module is introduced, which extracts channels with relevant dependencies and strengthens them via a matrix dot product, while weakening irrelevant channels that lack such dependencies.
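The abstract does not give the exact formulation, but a common way to realize dot-product channel attention (as in DANet-style modules) is to compute a channel affinity matrix from the flattened feature map, normalize it, and reweight the channels. The sketch below is a hypothetical NumPy illustration of that idea, not the paper's implementation; the shape conventions and residual connection are assumptions.

```python
import numpy as np

def channel_attention(x: np.ndarray) -> np.ndarray:
    """Hypothetical sketch of dot-product channel attention.

    x: feature map of shape (C, H, W).
    Returns a reweighted feature map of the same shape.
    """
    C, H, W = x.shape
    flat = x.reshape(C, -1)                # (C, H*W): one row per channel

    # Channel affinity via matrix dot product: entry (i, j) measures
    # the dependency between channels i and j.
    energy = flat @ flat.T                 # (C, C)

    # Softmax over channels: dependent channels get large weights,
    # irrelevant channels are suppressed.
    energy = energy - energy.max(axis=-1, keepdims=True)
    attn = np.exp(energy)
    attn /= attn.sum(axis=-1, keepdims=True)

    # Reweight channels and add a residual connection (an assumption here),
    # so relevant channels are strengthened rather than replaced.
    out = (attn @ flat).reshape(C, H, W)
    return x + out
```

In practice such a module would sit inside a network and operate on learned feature maps; the softmax and residual are typical design choices rather than details stated in the abstract.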