Search Results for author: Victor Bouvier

Found 10 papers, 2 papers with code

Test-Time Adaptation with Principal Component Analysis

no code implementations · 13 Sep 2022 · Thomas Cordier, Victor Bouvier, Gilles Hénaff, Céline Hudelot

Machine Learning models are prone to fail when test data differ from training data, a situation often encountered in real applications and known as distribution shift.

Test-time Adaptation
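The snippet above motivates detecting when test data drift away from the training distribution. As a hedged illustration of that idea (my own sketch on synthetic data, not the method of the paper), PCA reconstruction error fitted on training data can flag off-distribution test points:

```python
import numpy as np

# Illustrative sketch, NOT the paper's method: data far from the training
# distribution tends to have a large reconstruction error under a PCA
# model fitted on training data. All data here are synthetic.

rng = np.random.default_rng(0)

# "Training" data: 3-dimensional structure embedded in a 10-D space.
basis = rng.normal(size=(3, 10))
train = rng.normal(size=(500, 3)) @ basis

# Fit PCA by hand: center, then keep the top-3 right singular vectors.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:3]

def reconstruction_error(x):
    z = (x - mean) @ components.T   # project onto the principal subspace
    recon = z @ components + mean   # map back to input space
    return np.mean((x - recon) ** 2, axis=1)

in_dist = rng.normal(size=(100, 3)) @ basis   # same subspace as training
shifted = rng.normal(size=(100, 10)) * 3.0    # off-subspace: shifted data

# Shifted data reconstruct far worse than in-distribution data.
assert reconstruction_error(in_dist).mean() < reconstruction_error(shifted).mean()
```

Thresholding this error on incoming test batches gives a simple shift alarm; the paper's actual adaptation procedure is more involved than this detector.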

Towards Clear Expectations for Uncertainty Estimation

no code implementations · 27 Jul 2022 · Victor Bouvier, Simona Maggio, Alexandre Abraham, Léo Dreyfus-Schmidt

While Uncertainty Quantification (UQ) is crucial to achieving trustworthy Machine Learning (ML), most UQ methods suffer from disparate and inconsistent evaluation protocols.

Uncertainty Quantification

Performance Prediction Under Dataset Shift

1 code implementation · 21 Jun 2022 · Simona Maggio, Victor Bouvier, Léo Dreyfus-Schmidt

ML models deployed in production often have to face unknown domain changes, fundamentally different from their training settings.

Bridging Few-Shot Learning and Adaptation: New Challenges of Support-Query Shift

1 code implementation · 25 May 2021 · Etienne Bennequin, Victor Bouvier, Myriam Tami, Antoine Toubhans, Céline Hudelot

To classify query instances from novel classes encountered at test time, these methods require only a support set composed of a few labelled samples.

Few-Shot Learning · Novel Concepts +1
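The snippet describes the standard few-shot setup: a small labelled support set is the only supervision available for classifying queries from novel classes. A minimal sketch of how such a support set is typically used is a nearest-centroid (prototype) classifier over embeddings — a common few-shot baseline, not the paper's method, on synthetic data:

```python
import numpy as np

# Hedged illustration of the few-shot support/query mechanics, not the
# paper's approach: build one prototype per novel class from the support
# set, then assign each query to the class of the closest prototype.

rng = np.random.default_rng(1)

def nearest_centroid_predict(support_x, support_y, query_x):
    classes = np.unique(support_y)
    # One prototype per class: the mean of its support embeddings.
    protos = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    # Euclidean distance from every query to every prototype.
    dists = np.linalg.norm(query_x[:, None, :] - protos[None, :, :], axis=-1)
    return classes[dists.argmin(axis=1)]

# Two well-separated synthetic classes, 5 support shots each, 4-D features.
support_x = np.vstack([rng.normal(0, 0.1, (5, 4)), rng.normal(3, 0.1, (5, 4))])
support_y = np.array([0] * 5 + [1] * 5)
query_x = np.vstack([rng.normal(0, 0.1, (3, 4)), rng.normal(3, 0.1, (3, 4))])

pred = nearest_centroid_predict(support_x, support_y, query_x)
assert (pred == np.array([0, 0, 0, 1, 1, 1])).all()
```

The support-query shift studied in the paper arises when `support_x` and `query_x` are drawn from different distributions, which breaks the assumption behind this simple baseline.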

Stochastic Adversarial Gradient Embedding for Active Domain Adaptation

no code implementations · 3 Dec 2020 · Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot

First, we select for annotation the target samples most likely to improve the transferability of the representations, by measuring how the gradient of the transferability loss varies before and after annotation.

Active Learning · Unsupervised Domain Adaptation
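The snippet describes selecting target samples by how strongly annotation would move a loss gradient. As a hedged sketch in that spirit (a generic gradient-embedding selection rule, not the paper's actual criterion), one can score each unlabelled sample by the norm of the gradient its pseudo-label would induce on a linear classifier, and query the top-k:

```python
import numpy as np

# Illustrative active-selection sketch, NOT the paper's SAGE criterion:
# samples whose (pseudo-labelled) gradient would change the model most
# are deemed most informative. Synthetic data and a linear model only.

rng = np.random.default_rng(2)

def select_for_annotation(w, x_pool, k):
    logits = x_pool @ w
    probs = 1.0 / (1.0 + np.exp(-logits))
    pseudo = (probs > 0.5).astype(float)     # hallucinated labels
    # Per-sample gradient of the logistic loss w.r.t. the weights;
    # its magnitude scales with min(p, 1 - p), i.e. model uncertainty.
    grads = (probs - pseudo)[:, None] * x_pool
    scores = np.linalg.norm(grads, axis=1)
    return np.argsort(scores)[-k:]           # top-k highest-impact samples

w = rng.normal(size=5)
x_pool = rng.normal(size=(50, 5))            # unlabelled target pool
picked = select_for_annotation(w, x_pool, k=10)
assert len(picked) == 10
```

The paper's selection additionally involves an adversarial component and the before/after-annotation gradient variation, which this toy rule does not model.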

Target Consistency for Domain Adaptation: when Robustness meets Transferability

no code implementations · 25 Jun 2020 · Yassine Ouali, Victor Bouvier, Myriam Tami, Céline Hudelot

Learning Invariant Representations has been successfully applied for reconciling a source and a target domain for Unsupervised Domain Adaptation.

Image Classification · Unsupervised Domain Adaptation
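The invariance these domain-adaptation papers build on is usually stated as matching representation distributions across domains; in standard notation (mine, not quoted from the paper):

```latex
% Domain-invariant representation \phi: the distribution of the
% representation is the same under the source (S) and target (T) domains.
p_S\big(\phi(x)\big) = p_T\big(\phi(x)\big)
```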

Robust Domain Adaptation: Representations, Weights and Inductive Bias

no code implementations · 24 Jun 2020 · Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot

The emergence of Domain Invariant Representations (IR) has drastically improved the transferability of representations from a labelled source domain to a new, unlabelled target domain.

Inductive Bias · Unsupervised Domain Adaptation

Domain-Invariant Representations: A Look on Compression and Weights

no code implementations · 25 Sep 2019 · Victor Bouvier, Céline Hudelot, Clément Chastagnol, Philippe Very, Myriam Tami

Second, we show that learning weighted representations plays a key role in relaxing the constraint of invariance while containing the risk of compression.

Domain Adaptation
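One standard way to relax strict invariance with weights, in the spirit of the snippet above (illustrative notation, not quoted from the paper), is to reweight the source representation distribution so that it matches the target:

```latex
% Weighted invariance: a weight function w relaxes the strict constraint
% p_S(z) = p_T(z) on the representation z = \phi(x).
w(z)\, p_S(z) = p_T(z), \qquad z = \phi(x)
```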

Learning Invariant Representations for Sentiment Analysis: The Missing Material is Datasets

no code implementations · 29 Jul 2019 · Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol

Learning representations that remain invariant to a nuisance factor is of great interest in Domain Adaptation, Transfer Learning, and Fair Machine Learning.

Domain Adaptation · Sentiment Analysis +3

Hidden Covariate Shift: A Minimal Assumption For Domain Adaptation

no code implementations · 29 Jul 2019 · Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol

Such an approach consists of learning a representation of the data such that the label distribution conditioned on this representation is domain invariant.

Unsupervised Domain Adaptation
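The assumption the snippet describes can be written compactly (illustrative notation, not quoted from the paper): rather than making the representation itself invariant, only the label distribution given the representation must match across domains:

```latex
% Hidden covariate shift assumption: the conditional label distribution
% given the representation \phi(x) agrees between source (S) and target (T).
p_S\big(y \mid \phi(x)\big) = p_T\big(y \mid \phi(x)\big)
```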
