no code implementations • 10 Mar 2023 • Marin Scalbert, Maria Vakalopoulou, Florent Couzinié-Devy
In Self-Supervised Learning (SSL), models are typically pretrained, fine-tuned, and evaluated on the same domains.
no code implementations • 20 Jun 2022 • Marin Scalbert, Maria Vakalopoulou, Florent Couzinié-Devy
In this paper, to enhance robustness on unseen target protocols, we propose a new test-time data augmentation based on multi-domain image-to-image translation.
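The idea named above can be sketched roughly as: translate the test image into each known source domain, run the classifier on every translated version, and average the predictions. The translator and classifier below are hypothetical stand-ins (the paper's actual models are not specified here); this is only a minimal sketch of the aggregation step.

```python
import numpy as np

def translate(image, domain):
    # Hypothetical stand-in for a multi-domain image-to-image
    # translator; here it only applies a small domain-seeded
    # perturbation so the sketch stays runnable.
    rng = np.random.default_rng(domain)
    return image + 0.01 * rng.standard_normal(image.shape)

def predict(image):
    # Hypothetical classifier returning class probabilities
    # via a softmax over two toy statistics of the image.
    logits = np.array([image.mean(), image.std()])
    e = np.exp(logits - logits.max())
    return e / e.sum()

def tta_predict(image, domains):
    # Test-time augmentation: predict on each translated version
    # of the test image and average the probability vectors.
    probs = [predict(translate(image, d)) for d in domains]
    return np.mean(probs, axis=0)

image = np.ones((8, 8))
avg = tta_predict(image, domains=[0, 1, 2])
```

Averaging probabilities (rather than picking one translation) is one common way to make the prediction less sensitive to any single domain's appearance.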
no code implementations • 30 Jun 2021 • Marin Scalbert, Maria Vakalopoulou, Florent Couzinié-Devy
Multi-Source Unsupervised Domain Adaptation (multi-source UDA) aims to learn a model from several labeled source domains while performing well on a different target domain where only unlabeled data are available at training time.
Tasks: Contrastive Learning • Multi-Source Unsupervised Domain Adaptation • +1
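The multi-source UDA setting described above can be sketched as an objective with supervised loss on the labeled source domains plus an unsupervised term on the unlabeled target. The domain names, toy data, and the entropy-minimization target term below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, classes = 16, 4, 2

# Several labeled source domains (names are hypothetical).
sources = {
    "site_a": (rng.standard_normal((n, d)), rng.integers(0, classes, n)),
    "site_b": (rng.standard_normal((n, d)), rng.integers(0, classes, n)),
}
# Target domain: only unlabeled features are available at training time.
target_x = rng.standard_normal((n, d))

W = np.zeros((d, classes))  # toy linear classifier

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Supervised cross-entropy summed over all labeled source domains.
ce = 0.0
for x, y in sources.values():
    p = softmax(x @ W)
    ce += -np.log(p[np.arange(n), y]).mean()

# Unsupervised term on the target; prediction-entropy minimization is
# one common choice (the paper's actual objective may differ).
pt = softmax(target_x @ W)
entropy = -(pt * np.log(pt)).sum(axis=1).mean()

loss = ce + entropy
```

The key constraint the setting imposes is visible in the code: target labels never appear, so the target can only contribute through an unsupervised term.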