1 code implementation • 14 Mar 2024 • Melanie Roschewitz, Fabio De Sousa Ribeiro, Tian Xia, Galvin Khara, Ben Glocker
Contrastive pretraining is well known to improve downstream task performance and model generalisation, especially in limited-label settings.
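To make the idea concrete, here is a minimal sketch of the InfoNCE objective that underlies common contrastive pretraining methods (e.g. SimCLR-style training). The function name, toy data, and perturbation used as a stand-in for augmentation are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE contrastive loss over a batch of paired embeddings.

    z1, z2: (N, D) arrays of L2-normalised embeddings where row i of z1
    and row i of z2 are two views of the same image (a positive pair);
    all other rows in the batch act as negatives.
    """
    logits = (z1 @ z2.T) / temperature           # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives lie on the diagonal: view i of z1 should match view i of z2.
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 16))
anchors /= np.linalg.norm(anchors, axis=1, keepdims=True)
# Positives: lightly perturbed copies of the anchors (a stand-in for augmentation).
positives = anchors + 0.05 * rng.normal(size=anchors.shape)
positives /= np.linalg.norm(positives, axis=1, keepdims=True)

aligned = info_nce_loss(anchors, positives)
shuffled = info_nce_loss(anchors, rng.permutation(positives))
print(aligned < shuffled)  # aligned pairs should incur a lower loss
```

Minimising this loss pulls the two views of each image together in embedding space while pushing apart views of different images, which is the mechanism behind the transfer and generalisation gains described above.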