Search Results for author: Benjamin Aubin

Found 10 papers, 5 papers with code

Flash Diffusion: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation

1 code implementation • 4 Jun 2024 • Clement Chadebec, Onur Tasar, Eyal Benaroche, Benjamin Aubin

In this paper, we propose an efficient, fast, and versatile distillation method to accelerate the generation of pre-trained diffusion models: Flash Diffusion.

Face Swapping • Image Inpainting +1

Mean-field methods and algorithmic perspectives for high-dimensional machine learning

no code implementations • 10 Mar 2021 • Benjamin Aubin

The main difficulty that arises in the analysis of most machine learning algorithms is to handle, analytically and numerically, a large number of interacting random variables.

BIG-bench Machine Learning • Generalization Bounds +1

Linear unit-tests for invariance discovery

2 code implementations • 22 Feb 2021 • Benjamin Aubin, Agnieszka Słowik, Martin Arjovsky, Leon Bottou, David Lopez-Paz

There is an increasing interest in algorithms to learn invariant correlations across training environments.

Out-of-Distribution Generalization

Generalization error in high-dimensional perceptrons: Approaching Bayes error with convex optimization

no code implementations • NeurIPS 2020 • Benjamin Aubin, Florent Krzakala, Yue M. Lu, Lenka Zdeborová

We consider a commonly studied supervised classification task on a synthetic dataset whose labels are generated by feeding a one-layer neural network with random i.i.d. inputs.


Tree-AMP: Compositional Inference with Tree Approximate Message Passing

1 code implementation • 3 Apr 2020 • Antoine Baker, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová

We introduce Tree-AMP, standing for Tree Approximate Message Passing, a Python package for compositional inference in high-dimensional tree-structured models.
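To give a flavor of the approximate message passing (AMP) machinery the package builds on, here is a minimal, self-contained sketch of a plain AMP loop for a sparse linear model y = A x + noise, using a soft-thresholding denoiser with the Onsager correction term. All names here are illustrative; this is not the Tree-AMP package's API, and the threshold is a fixed assumed constant rather than the adaptively tuned value a real solver would use.

```python
import numpy as np

def soft_threshold(r, t):
    """Elementwise soft-thresholding denoiser."""
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

def amp(A, y, n_iter=30, theta=0.1):
    """Plain AMP for y = A x + noise with a sparsity-promoting denoiser.

    Illustrative sketch only (hypothetical names, fixed threshold `theta`).
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        # Effective (denoiser input) observation: estimate plus back-projected residual
        r = x + A.T @ z
        x_new = soft_threshold(r, theta)
        # Onsager reaction term: average derivative of the denoiser at r
        eta_prime = np.mean(np.abs(r) > theta)
        z = y - A @ x_new + (n / m) * eta_prime * z
        x = x_new
    return x
```

The Onsager term `(n / m) * eta_prime * z` is what distinguishes AMP from plain iterative soft-thresholding: it decorrelates the residual from the current estimate, which is what makes the simple scalar denoiser asymptotically well-calibrated.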

Rademacher complexity and spin glasses: A link between the replica and statistical theories of learning

no code implementations • 5 Dec 2019 • Alia Abbara, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová

Statistical learning theory provides bounds on the generalization gap, using in particular the Vapnik-Chervonenkis dimension and the Rademacher complexity.

Learning Theory

Exact asymptotics for phase retrieval and compressed sensing with random generative priors

no code implementations • 4 Dec 2019 • Benjamin Aubin, Bruno Loureiro, Antoine Baker, Florent Krzakala, Lenka Zdeborová

We consider the problems of compressed sensing and of (real-valued) phase retrieval with a random measurement matrix.


The spiked matrix model with generative priors

2 code implementations • NeurIPS 2019 • Benjamin Aubin, Bruno Loureiro, Antoine Maillard, Florent Krzakala, Lenka Zdeborová

Here, we replace the sparsity assumption by generative modelling, and investigate the consequences on statistical and algorithmic properties.

Dimensionality Reduction

The committee machine: Computational to statistical gaps in learning a two-layers neural network

1 code implementation • NeurIPS 2018 • Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, Nicolas Macris, Lenka Zdeborová

Heuristic tools from statistical physics have been used in the past to locate the phase transitions and compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks.
