no code implementations • 10 Mar 2021 • Benjamin Aubin
The main difficulty arising in the analysis of most machine learning algorithms lies in handling, analytically and numerically, a large number of interacting random variables.
2 code implementations • 22 Feb 2021 • Benjamin Aubin, Agnieszka Słowik, Martin Arjovsky, Leon Bottou, David Lopez-Paz
There is increasing interest in algorithms that learn invariant correlations across training environments.
no code implementations • NeurIPS 2020 • Benjamin Aubin, Florent Krzakala, Yue M. Lu, Lenka Zdeborová
We consider a commonly studied supervised classification problem on a synthetic dataset whose labels are generated by feeding i.i.d. random inputs to a one-layer neural network.
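As a minimal sketch of this data-generating process, assuming a sign activation for the one-layer teacher (sizes and names here are illustrative, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 200                     # samples, input dimension (illustrative)

# i.i.d. Gaussian inputs and a random one-layer teacher network
X = rng.normal(size=(n, d))
w = rng.normal(size=d)
y = np.sign(X @ w / np.sqrt(d))      # labels y = sign(w.x / sqrt(d))
```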
1 code implementation • 3 Apr 2020 • Antoine Baker, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová
We introduce Tree-AMP, standing for Tree Approximate Message Passing, a Python package for compositional inference in high-dimensional tree-structured models.
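The Tree-AMP API itself is not reproduced here; as a generic illustration of the approximate message passing family the package builds on, below is a minimal NumPy sketch of the classic AMP iteration for sparse linear regression (problem sizes and the fixed threshold are arbitrary assumptions, not the package's method).

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft thresholding, the denoiser matching a sparse prior."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
n, d, k = 250, 500, 25                       # measurements, dimension, sparsity
A = rng.normal(size=(n, d)) / np.sqrt(n)     # random measurement matrix
x0 = np.zeros(d)
x0[rng.choice(d, size=k, replace=False)] = rng.normal(size=k)
y = A @ x0

x, z, theta = np.zeros(d), y.copy(), 0.1     # theta: fixed threshold (assumption)
for _ in range(30):
    x_new = soft_threshold(x + A.T @ z, theta)
    # residual with the Onsager correction: (||x_new||_0 / n) times the old residual
    z = y - A @ x_new + (np.count_nonzero(x_new) / n) * z
    x = x_new
print("relative error:", np.linalg.norm(x - x0) / np.linalg.norm(x0))
```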
no code implementations • 5 Dec 2019 • Alia Abbara, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová
Statistical learning theory provides bounds on the generalization gap, based in particular on the Vapnik-Chervonenkis dimension and the Rademacher complexity.
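For reference, the standard definition of the empirical Rademacher complexity of a function class $\mathcal{F}$ evaluated on a sample $x_1, \dots, x_n$:

```latex
\hat{\mathcal{R}}_n(\mathcal{F}) \;=\;
\mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}}
\frac{1}{n}\sum_{i=1}^{n} \sigma_i\, f(x_i)\right],
\qquad \sigma_i \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}\{\pm 1\}.
```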
no code implementations • 4 Dec 2019 • Benjamin Aubin, Bruno Loureiro, Antoine Baker, Florent Krzakala, Lenka Zdeborová
We consider the problems of compressed sensing and of (real-valued) phase retrieval with a random measurement matrix.
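A minimal sketch of the two observation models, assuming a single-layer ReLU generative prior for the signal (layer sizes and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
m, d, k = 300, 100, 20                      # measurements, signal dim, latent dim

# signal drawn from a random single-layer generative prior: x = relu(W z)
W = rng.normal(size=(d, k)) / np.sqrt(k)
z = rng.normal(size=k)
x = np.maximum(W @ z, 0.0)

A = rng.normal(size=(m, d)) / np.sqrt(d)    # random measurement matrix
y_cs = A @ x                                # compressed sensing: linear observations
y_pr = np.abs(A @ x)                        # real-valued phase retrieval: magnitudes only
```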
2 code implementations • NeurIPS 2019 • Benjamin Aubin, Bruno Loureiro, Antoine Maillard, Florent Krzakala, Lenka Zdeborová
Here, we replace the sparsity assumption with generative modelling, and investigate the consequences for statistical and algorithmic properties.
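As a hedged sketch of a spiked matrix model with a generative prior, assuming a single-layer sign-activation generator and standard spiked-Wigner scaling (sizes and signal-to-noise ratio are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, k, lam = 500, 50, 2.0                    # matrix size, latent dim, SNR (assumed)

# spike produced by a generative model rather than drawn from a sparse prior
W = rng.normal(size=(N, k)) / np.sqrt(k)
x = np.sign(W @ rng.normal(size=k))         # sign activation: one illustrative choice

# spiked Wigner observation: rank-one signal buried in symmetric Gaussian noise
G = rng.normal(size=(N, N))
xi = (G + G.T) / np.sqrt(2 * N)             # symmetric noise, entries ~ N(0, 1/N)
Y = np.sqrt(lam / N) * np.outer(x, x) + xi
```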
1 code implementation • NeurIPS 2018 • Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, Nicolas Macris, Lenka Zdeborová
Heuristic tools from statistical physics have been used in the past to locate phase transitions and to compute the optimal learning and generalization errors in the teacher-student scenario for multi-layer neural networks.
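A minimal sketch of a teacher-student setup with a committee-machine teacher, assuming sign activations and a majority vote over hidden units (sizes and the odd number of units are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, K = 1000, 200, 3                       # samples, input dim, hidden units (K odd)

# teacher: a committee machine, one random weight vector per hidden unit
W = rng.normal(size=(K, d))
X = rng.normal(size=(n, d))                  # i.i.d. Gaussian inputs for the student

# label = majority vote of the K hidden units' sign outputs
y = np.sign(np.sign(X @ W.T / np.sqrt(d)).sum(axis=1))
```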