no code implementations • 2 Mar 2022 • Danial Maleki, H. R Tizhoosh
In this study, we propose using self-attention as an additional loss term to enrich the internal representation fed into the cross-attention module.
Ranked #28 on Cross-Modal Retrieval on COCO 2014
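The idea described above can be sketched in PyTorch. This is a hypothetical illustration, not the paper's implementation: the exact loss formulation, dimensions, and architecture are not given in this entry, so the auxiliary loss below (a cosine-alignment term on pooled self-attended features) and all module/variable names are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfThenCrossAttention(nn.Module):
    """Sketch: self-attention enriches each modality before cross-attention."""

    def __init__(self, dim=256, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, img_tokens, txt_tokens):
        # Enrich each modality's internal representation with self-attention.
        img_sa, _ = self.self_attn(img_tokens, img_tokens, img_tokens)
        txt_sa, _ = self.self_attn(txt_tokens, txt_tokens, txt_tokens)
        # Cross-attention then consumes the self-attended representations.
        fused, _ = self.cross_attn(img_sa, txt_sa, txt_sa)
        return img_sa, txt_sa, fused


def auxiliary_self_attention_loss(img_sa, txt_sa):
    # Hypothetical additional loss term: pull the pooled self-attended
    # representations of a matching image/text pair together.
    img_vec = F.normalize(img_sa.mean(dim=1), dim=-1)
    txt_vec = F.normalize(txt_sa.mean(dim=1), dim=-1)
    return (1.0 - (img_vec * txt_vec).sum(dim=-1)).mean()


model = SelfThenCrossAttention()
img = torch.randn(2, 10, 256)  # batch of 2 image token sequences
txt = torch.randn(2, 8, 256)   # batch of 2 text token sequences
img_sa, txt_sa, fused = model(img, txt)
aux_loss = auxiliary_self_attention_loss(img_sa, txt_sa)
```

In training, `aux_loss` would be added (with a weighting coefficient) to the main cross-modal retrieval objective.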