Search Results for author: Céline Hudelot

Found 39 papers, 14 papers with code

An Overview of Deep Semi-Supervised Learning

1 code implementation 9 Jun 2020 Yassine Ouali, Céline Hudelot, Myriam Tami

Deep neural networks have demonstrated their ability to provide remarkable performance on a wide range of supervised learning tasks (e.g., image classification) when trained on extensive collections of labeled data (e.g., ImageNet).

Image Classification

Semi-Supervised Semantic Segmentation with Cross-Consistency Training

5 code implementations CVPR 2020 Yassine Ouali, Céline Hudelot, Myriam Tami

To leverage the unlabeled examples, we enforce a consistency between the main decoder predictions and those of the auxiliary decoders, taking as inputs different perturbed versions of the encoder's output, and consequently, improving the encoder's representations.

Semi-Supervised Semantic Segmentation
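
A minimal sketch of the cross-consistency objective described above, assuming hypothetical `encoder`, `main_decoder`, `aux_decoders`, and `perturb` callables in PyTorch; this is an illustration of the idea, not the authors' released implementation:

```python
import torch
import torch.nn.functional as F

def cct_loss(x_labeled, y_labeled, x_unlabeled,
             encoder, main_decoder, aux_decoders, perturb, weight=1.0):
    """Supervised segmentation loss plus a consistency term between the
    main decoder and auxiliary decoders fed perturbed encoder outputs."""
    # Standard supervised loss on the labeled batch.
    sup_loss = F.cross_entropy(main_decoder(encoder(x_labeled)), y_labeled)

    # Unlabeled batch: the main decoder's prediction serves as the target.
    feats = encoder(x_unlabeled)
    with torch.no_grad():
        target = torch.softmax(main_decoder(feats), dim=1)

    # Each auxiliary decoder sees a differently perturbed version of the
    # encoder output and is pulled toward the main prediction.
    cons_loss = sum(
        F.mse_loss(torch.softmax(dec(perturb(feats)), dim=1), target)
        for dec in aux_decoders
    ) / len(aux_decoders)

    return sup_loss + weight * cons_loss
```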

Spatial Contrastive Learning for Few-Shot Classification

1 code implementation 26 Dec 2020 Yassine Ouali, Céline Hudelot, Myriam Tami

In this paper, we explore contrastive learning for few-shot classification, in which we propose to use it as an additional auxiliary training objective acting as a data-dependent regularizer to promote more general and transferable features.

Classification Contrastive Learning +2
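
A minimal sketch of the combined objective described above: standard cross-entropy plus a contrastive term on embeddings of two augmented views, acting as a data-dependent regularizer. The `contrastive_loss` callable and the weight `lam` are placeholders; the paper's spatial variant is not reproduced here:

```python
import torch.nn.functional as F

def training_loss(logits, labels, z_view1, z_view2, contrastive_loss, lam=0.5):
    """Cross-entropy on the classification head plus an auxiliary
    contrastive term computed on two augmented views of the batch."""
    ce = F.cross_entropy(logits, labels)
    aux = contrastive_loss(z_view1, z_view2)   # data-dependent regularizer
    return ce + lam * aux
```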

Open-Set Likelihood Maximization for Few-Shot Learning

1 code implementation CVPR 2023 Malik Boudiaf, Etienne Bennequin, Myriam Tami, Antoine Toubhans, Pablo Piantanida, Céline Hudelot, Ismail Ben Ayed

We tackle the Few-Shot Open-Set Recognition (FSOSR) problem, i.e., classifying instances among a set of classes for which we only have a few labeled samples, while simultaneously detecting instances that do not belong to any known class.

Few-Shot Image Classification Few-Shot Learning +2
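
For illustration of the FSOSR setting only, the snippet below is a common nearest-prototype baseline with distance-based rejection, not the open-set likelihood maximization proposed in the paper; `reject_threshold` is a hypothetical hyperparameter:

```python
import torch

def fsosr_predict(support_feats, support_labels, query_feats,
                  n_way, reject_threshold=1.5):
    """Nearest-prototype classification with a simple distance-based
    rejection of queries that belong to no known class (label -1)."""
    # One prototype per class: mean of its few support embeddings.
    prototypes = torch.stack([
        support_feats[support_labels == c].mean(dim=0) for c in range(n_way)
    ])
    dists = torch.cdist(query_feats, prototypes)      # (n_query, n_way)
    min_dist, pred = dists.min(dim=1)
    pred[min_dist > reject_threshold] = -1            # open-set rejection
    return pred
```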

Autoregressive Unsupervised Image Segmentation

1 code implementation ECCV 2020 Yassine Ouali, Céline Hudelot, Myriam Tami

In this work, we propose a new unsupervised image segmentation approach based on mutual information maximization between different constructed views of the inputs.

Clustering Image Segmentation +5
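
Mutual information between two views is usually maximized through a tractable surrogate; the sketch below uses a generic InfoNCE-style estimator purely as an illustration, not the autoregressive construction of views used in the paper:

```python
import torch
import torch.nn.functional as F

def infonce_loss(z_view1, z_view2, temperature=0.1):
    """Contrastive loss whose minimization maximizes the InfoNCE lower
    bound on the mutual information between two views: matching batch
    positions are positive pairs, all other pairings act as negatives."""
    z1 = F.normalize(z_view1, dim=1)
    z2 = F.normalize(z_view2, dim=1)
    logits = z1 @ z2.t() / temperature        # (N, N) cross-view similarities
    targets = torch.arange(z1.size(0))        # positives lie on the diagonal
    return F.cross_entropy(logits, targets)
```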

Controlling generative models with continuous factors of variations

1 code implementation ICLR 2020 Antoine Plumerault, Hervé Le Borgne, Céline Hudelot

Recent deep generative models are able to provide photo-realistic images as well as visual or textual content embeddings that are useful for addressing various computer vision and natural language processing tasks.

Translation

Bridging Few-Shot Learning and Adaptation: New Challenges of Support-Query Shift

1 code implementation 25 May 2021 Etienne Bennequin, Victor Bouvier, Myriam Tami, Antoine Toubhans, Céline Hudelot

To classify query instances from novel classes encountered at test time, few-shot methods only require a support set composed of a few labelled samples.

Few-Shot Learning Novel Concepts +1

Towards Job-Transition-Tag Graph for a Better Job Title Representation Learning

1 code implementation Findings (NAACL) 2022 Jun Zhu, Céline Hudelot

Work on learning job title representations is mainly based on the Job-Transition Graph, built from the working histories of talents.

Representation Learning TAG
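
For context, a Job-Transition Graph of the kind mentioned above can be built as a weighted directed graph whose edges link consecutive positions in a work history; the sketch below uses made-up job titles and illustrates only this underlying structure, not the Job-Transition-Tag graph proposed in the paper:

```python
from collections import defaultdict

def build_job_transition_graph(work_histories):
    """Each history is a chronological list of job titles; every pair of
    consecutive titles adds a weighted directed edge title_i -> title_{i+1}."""
    edges = defaultdict(int)
    for history in work_histories:
        for prev_title, next_title in zip(history, history[1:]):
            edges[(prev_title, next_title)] += 1
    return edges

# Hypothetical example input.
graph = build_job_transition_graph([
    ["data analyst", "data scientist", "ml engineer"],
    ["data analyst", "bi engineer"],
])
```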

Revisiting Instruction Fine-tuned Model Evaluation to Guide Industrial Applications

1 code implementation 21 Oct 2023 Manuel Faysse, Gautier Viaud, Céline Hudelot, Pierre Colombo

Instruction Fine-Tuning (IFT) is a powerful paradigm that strengthens the zero-shot capabilities of Large Language Models (LLMs), but in doing so induces new evaluation metric requirements.

Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs

1 code implementation 19 Feb 2024 Nicolas Boizard, Kevin El Haddad, Céline Hudelot, Pierre Colombo

Deploying large language models (LLMs) of several billion parameters can be impractical in most industrial use cases due to constraints such as cost, latency limitations, and hardware accessibility.

Knowledge Distillation

CroissantLLM: A Truly Bilingual French-English Language Model

1 code implementation 1 Feb 2024 Manuel Faysse, Patrick Fernandes, Nuno M. Guerreiro, António Loison, Duarte M. Alves, Caio Corro, Nicolas Boizard, João Alves, Ricardo Rei, Pedro H. Martins, Antoni Bigata Casademunt, François Yvon, André F. T. Martins, Gautier Viaud, Céline Hudelot, Pierre Colombo

We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware.

Language Modelling Large Language Model

Belief Revision, Minimal Change and Relaxation: A General Framework based on Satisfaction Systems, and Applications to Description Logics

no code implementations 8 Feb 2015 Marc Aiguier, Jamal Atif, Isabelle Bloch, Céline Hudelot

Belief revision of knowledge bases represented by a set of sentences in a given logic has been extensively studied, but only for specific logics, mainly propositional, and more recently Horn and description logics.

Relaxation-based revision operators in description logics

no code implementations 26 Feb 2015 Marc Aiguier, Jamal Atif, Isabelle Bloch, Céline Hudelot

In this paper we address both the generalization of the well-known AGM postulates, and the definition of concrete and well-founded revision operators in different DL families.

Negation

Learning Finer-class Networks for Universal Representations

no code implementations 4 Oct 2018 Julien Girard, Youssef Tamaazousti, Hervé Le Borgne, Céline Hudelot

This raises the question of how "universal" the original representation is, that is to say, how directly it adapts to many different target tasks.

Learning Invariant Representations for Sentiment Analysis: The Missing Material is Datasets

no code implementations 29 Jul 2019 Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol

Learning representations that remain invariant to a nuisance factor is of great interest in Domain Adaptation, Transfer Learning, and Fair Machine Learning.

Domain Adaptation Sentiment Analysis +3

Hidden Covariate Shift: A Minimal Assumption For Domain Adaptation

no code implementations 29 Jul 2019 Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol

Such an approach consists in learning a representation of the data such that the label distribution conditioned on this representation is domain invariant.

Unsupervised Domain Adaptation
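
Written out, the assumption described above states that once the representation is fixed, the label distribution no longer depends on the domain; with representation φ and a domain variable d, one common way to express it is:

```latex
% Label distribution conditioned on the representation is domain invariant:
p\bigl(y \mid \varphi(x),\, d = \text{source}\bigr)
  \;=\;
p\bigl(y \mid \varphi(x),\, d = \text{target}\bigr)
\qquad \text{for all } x.
```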

A New Approach for Explainable Multiple Organ Annotation with Few Data

no code implementations 30 Dec 2019 Régis Pierrard, Jean-Philippe Poli, Céline Hudelot

In this paper, we focus on organ annotation in medical images and we introduce a reasoning framework that is based on learning fuzzy relations on a small dataset for generating explanations.

Robust Domain Adaptation: Representations, Weights and Inductive Bias

no code implementations 24 Jun 2020 Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot

The emergence of Domain Invariant Representations (IR) has drastically improved the transferability of representations from a labelled source domain to a new and unlabelled target domain.

Inductive Bias Unsupervised Domain Adaptation

Target Consistency for Domain Adaptation: when Robustness meets Transferability

no code implementations 25 Jun 2020 Yassine Ouali, Victor Bouvier, Myriam Tami, Céline Hudelot

Learning Invariant Representations has been successfully applied to reconcile a source and a target domain for Unsupervised Domain Adaptation.

Image Classification Unsupervised Domain Adaptation

Stochastic Adversarial Gradient Embedding for Active Domain Adaptation

no code implementations 3 Dec 2020 Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot

First, we select for annotation target samples that are likely to improve the representations' transferability by measuring the variation, before and after annotation, of the transferability loss gradient.

Active Learning Unsupervised Domain Adaptation
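
The paper scores target samples by the variation of the transferability loss gradient before and after annotation; the sketch below is a heavily simplified stand-in that only illustrates gradient-based acquisition, scoring each sample by the classifier-head gradient norm its pseudo-label would induce (`model` and its final linear layer `head` are hypothetical):

```python
import torch
import torch.nn.functional as F

def acquisition_score(model, head, x):
    """Crude stand-in for gradient-based selection: score an unlabeled
    target sample by the norm of the gradient its pseudo-label would
    induce on the classifier head if the sample were annotated."""
    logits = model(x.unsqueeze(0))
    pseudo = logits.argmax(dim=1)             # pseudo-label as annotation proxy
    grads = torch.autograd.grad(F.cross_entropy(logits, pseudo),
                                list(head.parameters()))
    return sum(g.norm() for g in grads).item()

# Samples with the largest scores are the ones sent for annotation.
```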

AVAE: Adversarial Variational Auto Encoder

no code implementations 21 Dec 2020 Antoine Plumerault, Hervé Le Borgne, Céline Hudelot

Among the wide variety of image generative models, two models stand out: Variational Auto Encoders (VAE) and Generative Adversarial Networks (GAN).

Leveraging Conditional Generative Models in a General Explanation Framework of Classifier Decisions

no code implementations 21 Jun 2021 Martin Charachon, Paul-Henry Cournède, Céline Hudelot, Roberto Ardon

We show that visual explanation can be produced as the difference between two generated images obtained via two specific conditional generative models.
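
A rough sketch of the mechanism described above, assuming two hypothetical conditional generators: one that reproduces the classifier's decision on the input and one that produces a similar image leading to a different decision; the explanation is their pixel-wise difference:

```python
import torch

def explanation_map(x, generator_same, generator_adv):
    """Visual explanation as the difference between two generated images:
    one preserving the classifier's decision on x and one, close to it,
    that changes the decision. Both generators are hypothetical stand-ins."""
    with torch.no_grad():
        x_same = generator_same(x)     # decision-preserving reconstruction
        x_adv = generator_adv(x)       # similar image, different decision
    # Per-pixel magnitude of the change highlights decision-relevant regions.
    return (x_same - x_adv).abs().sum(dim=1, keepdim=True)
```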

StreaMulT: Streaming Multimodal Transformer for Heterogeneous and Arbitrary Long Sequential Data

no code implementations 15 Oct 2021 Victor Pellegrain, Myriam Tami, Michel Batteux, Céline Hudelot

The increasing complexity of Industry 4.0 systems brings new challenges regarding predictive maintenance tasks such as fault detection and diagnosis.

Fault Detection Multimodal Sentiment Analysis

Domain-Invariant Representations: A Look on Compression and Weights

no code implementations 25 Sep 2019 Victor Bouvier, Céline Hudelot, Clément Chastagnol, Philippe Very, Myriam Tami

Second, we show that learning weighted representations plays a key role in relaxing the constraint of invariance and then preserving the risk of compression.

Domain Adaptation

Optimizing Active Learning for Low Annotation Budgets

no code implementations 18 Jan 2022 Umang Aggarwal, Adrian Popescu, Céline Hudelot

It consists in learning a model on a small amount of annotated data (the annotation budget) and in choosing the best set of points to annotate in order to improve the previous model and its generalization.

Active Learning Transfer Learning
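
A generic pool-based active learning loop under a small annotation budget, shown only to illustrate the setting; the least-confidence acquisition used here is a standard baseline, not the strategy proposed in the paper, and `train_fn` is a hypothetical training routine returning a model exposing `predict_proba`:

```python
import numpy as np

def active_learning_loop(train_fn, X_pool, budget, batch_size, rng=None):
    """Generic pool-based active learning under a small annotation budget.
    `train_fn(indices)` fits a model on those labelled pool indices."""
    rng = rng or np.random.default_rng(0)
    labelled = list(rng.choice(len(X_pool), size=batch_size, replace=False))
    model = train_fn(labelled)
    while len(labelled) < budget:
        probs = model.predict_proba(X_pool)              # (N, n_classes)
        uncertainty = 1.0 - probs.max(axis=1)            # least-confidence score
        uncertainty[labelled] = -np.inf                  # never re-query
        new_idx = np.argsort(-uncertainty)[:batch_size]
        labelled.extend(int(i) for i in new_idx)
        model = train_fn(labelled)                       # retrain on grown pool
    return model
```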

A Comparative Study of Calibration Methods for Imbalanced Class Incremental Learning

no code implementations 1 Feb 2022 Umang Aggarwal, Adrian Popescu, Eden Belouadah, Céline Hudelot

Since memory is bounded, old classes are learned with fewer images than new classes and an imbalance due to incremental learning is added to the initial dataset imbalance.

Class Incremental Learning Incremental Learning +1

Test-Time Adaptation with Principal Component Analysis

no code implementations 13 Sep 2022 Thomas Cordier, Victor Bouvier, Gilles Hénaff, Céline Hudelot

Machine Learning models are prone to fail when test data differ from training data, a situation often encountered in real applications and known as distribution shift.

Test-time Adaptation

An Analysis of Initial Training Strategies for Exemplar-Free Class-Incremental Learning

no code implementations 22 Aug 2023 Grégoire Petit, Michael Soumm, Eva Feillet, Adrian Popescu, Bertrand Delezoide, David Picard, Céline Hudelot

Our main finding is that the initial training strategy is the dominant factor influencing the average incremental accuracy, but that the choice of CIL algorithm is more important in preventing forgetting.

Class Incremental Learning Incremental Learning

Improving Neural-based Classification with Logical Background Knowledge

no code implementations 20 Feb 2024 Arthur Ledaguenel, Céline Hudelot, Mostepha Khouadjia

We develop a new multi-scale methodology to evaluate how the benefits of a neurosymbolic technique evolve with the scale of the network.

Classification Multi-Label Classification

Recommendation of data-free class-incremental learning algorithms by simulating future data

no code implementations 26 Mar 2024 Eva Feillet, Adrian Popescu, Céline Hudelot

Our method outperforms competitive baselines, and performance is close to that of an oracle choosing the best algorithm in each setting.

Class Incremental Learning Incremental Learning

Complexity of Probabilistic Reasoning for Neurosymbolic Classification Techniques

no code implementations 12 Apr 2024 Arthur Ledaguenel, Céline Hudelot, Mostepha Khouadjia

Informed multi-label classification is a sub-field of neurosymbolic AI which studies how to leverage prior knowledge to improve neural classification systems.

Classification Multi-Label Classification
