Search Results for author: Antoine Saporta

Found 5 papers, 3 papers with code

Multi-Head Distillation for Continual Unsupervised Domain Adaptation in Semantic Segmentation

1 code implementation • 25 Apr 2022 • Antoine Saporta, Arthur Douillard, Tuan-Hung Vu, Patrick Pérez, Matthieu Cord

Unsupervised Domain Adaptation (UDA) is a transfer learning task that aims to train a model on an unlabeled target domain by leveraging a labeled source domain.

Continual Learning, Semantic Segmentation, +2
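As a point of reference for the UDA setup described in this entry (a supervised loss on the labeled source domain plus an unsupervised objective on the unlabeled target domain), here is a minimal PyTorch-style sketch of a single adaptation step. It does not implement the paper's multi-head distillation or the continual setting; the entropy term and all names (`model`, `source_batch`, `lam`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def uda_step(model, optimizer, source_batch, target_batch, lam=0.01):
    """One generic UDA update: supervised cross-entropy on labeled source images
    plus an unsupervised entropy penalty on unlabeled target images (illustrative)."""
    src_images, src_labels = source_batch          # labeled source domain
    tgt_images = target_batch                      # unlabeled target domain

    optimizer.zero_grad()

    # Supervised segmentation loss on the source domain.
    src_logits = model(src_images)                 # (B, C, H, W)
    loss_src = F.cross_entropy(src_logits, src_labels)

    # Unsupervised adaptation term on the target domain:
    # a simple per-pixel prediction-entropy penalty.
    tgt_probs = F.softmax(model(tgt_images), dim=1)
    entropy = -(tgt_probs * torch.log(tgt_probs + 1e-8)).sum(dim=1)
    loss_tgt = entropy.mean()

    loss = loss_src + lam * loss_tgt
    loss.backward()
    optimizer.step()
    return loss.item()
```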

Multi-Target Adversarial Frameworks for Domain Adaptation in Semantic Segmentation

1 code implementation • ICCV 2021 • Antoine Saporta, Tuan-Hung Vu, Matthieu Cord, Patrick Pérez

In this work, we address the task of unsupervised domain adaptation (UDA) for semantic segmentation in the presence of multiple target domains: the objective is to train a single model that can handle all of these domains at test time.

Segmentation, Semantic Segmentation, +2
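A simplified sketch of a multi-target adversarial setup consistent with the task described above: a single segmentation model is trained on the labeled source domain while one domain discriminator per target domain pushes that target's predictions toward the source. This is a generic illustration under assumed names (`segmenter`, `discriminators`, `lam_adv`), not the paper's exact framework.

```python
import torch
import torch.nn.functional as F

def multi_target_adversarial_step(segmenter, discriminators, opt_seg, opt_disc,
                                  source_batch, target_batches, lam_adv=0.001):
    """One simplified training step with a single segmenter and one
    discriminator per target domain (illustrative names and losses)."""
    src_images, src_labels = source_batch

    # 1) Supervised loss on the labeled source domain.
    opt_seg.zero_grad()
    src_logits = segmenter(src_images)
    loss = F.cross_entropy(src_logits, src_labels)

    # 2) Adversarial alignment: the segmenter tries to make each target's
    #    prediction maps look "source-like" to that target's discriminator.
    for disc, tgt_images in zip(discriminators, target_batches):
        tgt_probs = F.softmax(segmenter(tgt_images), dim=1)
        d_out = disc(tgt_probs)
        # Label 1 = "looks like source" from the discriminator's point of view.
        loss = loss + lam_adv * F.binary_cross_entropy_with_logits(
            d_out, torch.ones_like(d_out))
    loss.backward()
    opt_seg.step()

    # 3) Discriminator updates: tell source predictions (1) from each target's (0).
    opt_disc.zero_grad()
    loss_d = 0.0
    src_probs = F.softmax(src_logits.detach(), dim=1)
    for disc, tgt_images in zip(discriminators, target_batches):
        tgt_probs = F.softmax(segmenter(tgt_images), dim=1).detach()
        d_src, d_tgt = disc(src_probs), disc(tgt_probs)
        loss_d = loss_d + F.binary_cross_entropy_with_logits(d_src, torch.ones_like(d_src))
        loss_d = loss_d + F.binary_cross_entropy_with_logits(d_tgt, torch.zeros_like(d_tgt))
    loss_d.backward()
    opt_disc.step()
    return loss.item(), float(loss_d)
```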

Confidence Estimation via Auxiliary Models

no code implementations • 11 Dec 2020 • Charles Corbière, Nicolas Thome, Antoine Saporta, Tuan-Hung Vu, Matthieu Cord, Patrick Pérez

In this paper, we introduce a novel target criterion for model confidence, namely the true class probability (TCP).

Domain Adaptation, Image Classification, +1
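The true class probability (TCP) mentioned above is the softmax probability that the model assigns to the ground-truth class, as opposed to the maximum class probability (MCP) commonly used as a confidence score. The short sketch below (function and variable names are illustrative) shows why TCP separates correct from incorrect predictions better than MCP; since the true class is unknown at test time, the paper turns to an auxiliary model for confidence estimation.

```python
import torch
import torch.nn.functional as F

def confidence_scores(logits, true_labels):
    """Compare the usual confidence score (MCP, the maximum softmax probability)
    with the target criterion (TCP, the softmax probability of the *true* class,
    only computable when labels are known)."""
    probs = F.softmax(logits, dim=1)                            # (B, C)
    mcp = probs.max(dim=1).values                               # maximum class probability
    tcp = probs.gather(1, true_labels.unsqueeze(1)).squeeze(1)  # true class probability
    return mcp, tcp

# Toy usage: a confidently wrong prediction has high MCP but low TCP,
# which is what makes TCP a better training target for a confidence estimator.
logits = torch.tensor([[4.0, 0.5, 0.1],    # predicts class 0, truth is class 0
                       [3.0, 0.2, 0.1]])   # predicts class 0, truth is class 2
labels = torch.tensor([0, 2])
mcp, tcp = confidence_scores(logits, labels)
print(mcp)   # both scores are high
print(tcp)   # high for the correct sample, low for the misclassified one
```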

ESL: Entropy-guided Self-supervised Learning for Domain Adaptation in Semantic Segmentation

1 code implementation • 15 Jun 2020 • Antoine Saporta, Tuan-Hung Vu, Matthieu Cord, Patrick Pérez

While fully supervised deep learning yields good models for urban-scene semantic segmentation, these models struggle to generalize to new environments with, for instance, different lighting or weather conditions.

Self-Supervised Learning, Semantic Segmentation, +1
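The entropy-guided self-training idea suggested by the title can be illustrated with a short sketch: the per-pixel prediction entropy serves as a confidence measure, and only low-entropy pixels are kept as pseudo-labels for the unlabeled target images. The threshold value and names are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def entropy_pseudo_labels(logits, threshold=0.5):
    """Build pseudo-labels for self-training on unlabeled target images,
    keeping only pixels whose prediction entropy is below a threshold
    (low entropy = high confidence). The threshold here is illustrative."""
    probs = F.softmax(logits, dim=1)                           # (B, C, H, W)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)    # (B, H, W)
    pseudo = probs.argmax(dim=1)                               # hard pseudo-labels
    ignore_index = 255                                         # skipped by the CE loss
    pseudo[entropy > threshold] = ignore_index
    return pseudo

# The resulting maps can then supervise the target domain with
# F.cross_entropy(logits, pseudo, ignore_index=255).
```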

REVE: Regularizing Deep Learning with Variational Entropy Bound

no code implementations • 15 Oct 2019 • Antoine Saporta, Yifu Chen, Michael Blot, Matthieu Cord

Information-theoretic studies of the generalization performance of machine learning algorithms suggest that compressed representations can guarantee good generalization, inspiring many compression-based regularization methods.
