Search Results for author: Florian Bordes

Found 15 papers, 7 papers with code

Feedback-guided Data Synthesis for Imbalanced Classification

no code implementations 29 Sep 2023 Reyhane Askari Hemmat, Mohammad Pezeshki, Florian Bordes, Michal Drozdzal, Adriana Romero-Soriano

In this work, we introduce a framework for augmenting static datasets with useful synthetic samples, which leverages one-shot feedback from the classifier to drive the sampling of the generative model.

Classification · Imbalanced Classification
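The excerpt above describes a loop in which classifier feedback steers what the generative model synthesizes. A minimal sketch of that idea follows; the names (`generate`, `classifier_confidence`) and the "keep the lowest-confidence candidates" selection rule are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch of feedback-guided data synthesis for an imbalanced class.
# All components here are toy stand-ins, not the paper's implementation.
import math
import random

random.seed(0)

def generate(label, n):
    # Stand-in generative model: noisy 1-D points around a per-class mean.
    mean = 0.0 if label == 0 else 3.0
    return [random.gauss(mean, 1.0) for _ in range(n)]

def classifier_confidence(x):
    # Stand-in classifier: confidence that x belongs to class 1.
    return 1.0 / (1.0 + math.exp(-(x - 1.5)))

def synthesize_for_minority(label=1, n_candidates=100, k=10):
    """Use one-shot classifier feedback to keep the k candidates the
    classifier is least confident about: hard samples are the useful ones."""
    candidates = generate(label, n_candidates)
    candidates.sort(key=classifier_confidence)  # least confident first
    return candidates[:k]

hard_samples = synthesize_for_minority()
print(len(hard_samples))  # → 10
```

The design choice being illustrated: instead of adding arbitrary synthetic samples to the static dataset, the classifier's own uncertainty decides which generated samples are worth keeping.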

PUG: Photorealistic and Semantically Controllable Synthetic Data for Representation Learning

1 code implementation NeurIPS 2023 Florian Bordes, Shashank Shekhar, Mark Ibrahim, Diane Bouchacourt, Pascal Vincent, Ari S. Morcos

Synthetic image datasets offer unmatched advantages for designing and evaluating deep neural networks: they make it possible to (i) render as many data samples as needed, (ii) precisely control each scene and yield granular ground truth labels (and captions), and (iii) precisely control distribution shifts between training and testing to isolate variables of interest for sound experimentation.

Representation Learning

Do SSL Models Have Déjà Vu? A Case of Unintended Memorization in Self-supervised Learning

1 code implementation NeurIPS 2023 Casey Meehan, Florian Bordes, Pascal Vincent, Kamalika Chaudhuri, Chuan Guo

Self-supervised learning (SSL) algorithms can produce useful image representations by learning to associate different parts of natural images with one another.

Memorization · Self-Supervised Learning

Objectives Matter: Understanding the Impact of Self-Supervised Objectives on Vision Transformer Representations

no code implementations 25 Apr 2023 Shashank Shekhar, Florian Bordes, Pascal Vincent, Ari Morcos

Here, we aim to explain these differences by analyzing the impact of these objectives on the structure and transferability of the learned representations.

Self-Supervised Learning · Specificity

Towards Democratizing Joint-Embedding Self-Supervised Learning

1 code implementation 3 Mar 2023 Florian Bordes, Randall Balestriero, Pascal Vincent

Joint Embedding Self-Supervised Learning (JE-SSL) has seen rapid developments in recent years, due to its promise to effectively leverage large unlabeled data.

Data Augmentation · Misconceptions +1

The Hidden Uniform Cluster Prior in Self-Supervised Learning

no code implementations 13 Oct 2022 Mahmoud Assran, Randall Balestriero, Quentin Duval, Florian Bordes, Ishan Misra, Piotr Bojanowski, Pascal Vincent, Michael Rabbat, Nicolas Ballas

A successful paradigm in representation learning is to perform self-supervised pretraining using tasks based on mini-batch statistics (e.g., SimCLR, VICReg, SwAV, MSN).

Clustering · Representation Learning +1

Guillotine Regularization: Why removing layers is needed to improve generalization in Self-Supervised Learning

no code implementations 27 Jun 2022 Florian Bordes, Randall Balestriero, Quentin Garrido, Adrien Bardes, Pascal Vincent

This is a little vexing, as one would hope that the network layer at which invariance is explicitly enforced by the SSL criterion during training (the last projector layer) should be the one to use for best generalization performance downstream.

Self-Supervised Learning · Transfer Learning

Evaluation of generative networks through their data augmentation capacity

no code implementations ICLR 2018 Timothée Lesort, Florian Bordes, Jean-Francois Goudou, David Filliat

This mixture of real and generated data is then used to train a classifier, which is afterwards tested on a given labeled test dataset.

Data Augmentation

Learning to Generate Samples from Noise through Infusion Training

1 code implementation 20 Mar 2017 Florian Bordes, Sina Honari, Pascal Vincent

In this work, we investigate a novel training procedure to learn a generative model as the transition operator of a Markov chain, such that, when applied repeatedly on an unstructured random noise sample, it will denoise it into a sample that matches the target distribution from the training set.

Denoising
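The excerpt above describes sampling by applying a learned transition operator repeatedly to unstructured noise. The toy sketch below shows only the shape of that sampling loop; `transition` here is an assumed stand-in (a contraction toward a fixed target plus small noise), not the operator learned by infusion training.

```python
# Hedged sketch: once a Markov-chain transition operator has been learned,
# sampling is just its repeated application to a noise sample.
# TARGET and `transition` are toy assumptions, not the paper's model.
import random

random.seed(0)

TARGET = 2.0  # stands in for a mode of the training distribution

def transition(x):
    # Toy denoising step: move halfway toward the target, add small noise.
    return x + 0.5 * (TARGET - x) + random.gauss(0.0, 0.01)

def sample(steps=30):
    x = random.gauss(0.0, 5.0)  # start from unstructured random noise
    for _ in range(steps):
        x = transition(x)       # repeated application denoises x
    return x

print(round(sample(), 2))
```

After enough steps the chain forgets its noisy starting point and the output concentrates near the target, which is the qualitative behavior the abstract describes.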
