Search Results for author: JingJie Wang

Found 1 papers, 0 papers with code

LMSA: Low-relation Mutil-head Self-Attention Mechanism in Visual Transformer

no code implementations · 29 Sep 2021 · JingJie Wang, Xiang Wei, Xiaoyu Liu

By appropriately compressing the dimensions of the self-attention relation variables, the Transformer network becomes more efficient and can even perform better.
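The idea in the abstract — shrinking the dimension used for the query/key "relation" computation while keeping the value path at full width — can be sketched as follows. This is a minimal single-head illustration with random placeholder projections, not the paper's implementation; the function name, seed, and dimensions are assumptions.

```python
import numpy as np

def low_relation_attention(x, d_r, seed=0):
    """Self-attention sketch where queries and keys are projected
    to a reduced "relation" dimension d_r < d_model, shrinking the
    cost of building the (n, n) attention matrix; values keep the
    full model dimension. Projection weights are random stand-ins."""
    n, d = x.shape
    rng = np.random.default_rng(seed)
    w_q = rng.standard_normal((d, d_r)) / np.sqrt(d)
    w_k = rng.standard_normal((d, d_r)) / np.sqrt(d)
    w_v = rng.standard_normal((d, d)) / np.sqrt(d)
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Relation matrix built from d_r-dimensional vectors instead of d-dimensional ones
    scores = q @ k.T / np.sqrt(d_r)
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ v

x = np.ones((4, 8))                    # 4 tokens, model dimension 8
out = low_relation_attention(x, d_r=2) # relation dimension compressed to 2
print(out.shape)                       # (4, 8)
```

With a full-width relation, the score computation costs O(n²·d); projecting queries and keys down to d_r reduces it to O(n²·d_r) while the output shape is unchanged.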

Tasks: Image Classification · Relation
