1 code implementation • 16 Feb 2024 • Louis Grenioux, Maxence Noble, Marylou Gabrié, Alain Oliviero Durmus
Building upon score-based learning, stochastic localization techniques have recently attracted renewed interest.
no code implementations • 29 Jan 2024 • Ana Molina-Taborda, Pilar Cossio, Olga Lopez-Acevedo, Marylou Gabrié
Extracting consistent statistics between relevant free-energy minima of a molecular system is essential for physics, chemistry and biology.
no code implementations • 1 Jun 2023 • Louis Grenioux, Éric Moulines, Marylou Gabrié
Energy-based models (EBMs) are versatile density estimation models that directly parameterize an unnormalized log density.
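A minimal sketch of what "directly parameterizing an unnormalized log density" means; the quadratic energy and the parameter `theta` below are hypothetical illustrations, not the paper's model:

```python
import numpy as np

# An EBM defines log p(x) = -E_theta(x) - log Z(theta), and only the
# unnormalized part -E_theta(x) is parameterized explicitly.
def energy(x, theta):
    # Hypothetical quadratic energy with a single precision parameter.
    return 0.5 * theta * np.sum(x**2)

def unnorm_log_density(x, theta):
    return -energy(x, theta)

# The normalizing constant Z(theta) is intractable in general; for this
# quadratic energy in 1D it happens to be sqrt(2*pi/theta).
val = unnorm_log_density(np.array([1.0]), theta=2.0)  # -1.0
```

The point is that evaluating the unnormalized density is cheap, while sampling and normalization are the hard parts that motivate the methods studied in the paper.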
1 code implementation • 9 Feb 2023 • Louis Grenioux, Alain Durmus, Éric Moulines, Marylou Gabrié
Transport maps can ease the sampling of distributions with non-trivial geometries by transforming them into distributions that are easier to handle.
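As an illustration of the idea (not the paper's construction), a hand-crafted map can push an easy-to-sample Gaussian base onto a target with non-trivial geometry; the "banana" map below is a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A transport map T turns samples z from an easy base distribution
# (standard Gaussian) into samples x = T(z) from a harder target.
def transport(z):
    x = np.empty_like(z)
    x[:, 0] = z[:, 0]
    x[:, 1] = z[:, 1] + z[:, 0]**2  # curve the second coordinate
    return x

z = rng.standard_normal((1000, 2))  # easy-to-sample base
x = transport(z)                    # samples from a curved "banana" target
```

Sampling the curved distribution directly would require handling its geometry; sampling `z` and applying `T` does not.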
1 code implementation • 4 Nov 2021 • Sergey Samsonov, Evgeny Lagutin, Marylou Gabrié, Alain Durmus, Alexey Naumov, Eric Moulines
Recent works leveraging learning to enhance sampling have shown promising results, in particular by designing effective non-local moves and global proposals.
no code implementations • ICML Workshop INNF 2021 • Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden
Normalizing flows can generate complex target distributions and thus show promise in many applications in Bayesian statistics as an alternative or complement to MCMC for sampling posteriors.
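A minimal sketch of how a generative model can complement MCMC as an independence proposal inside Metropolis-Hastings; here a fixed Gaussian stands in for a trained normalizing flow, and the target and proposal parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

mu, sigma = 1.0, 1.5  # hypothetical "flow" = fixed affine map of a Gaussian

def log_target(x):
    # Unnormalized toy target: N(1, 1).
    return -0.5 * (x - 1.0)**2

def log_proposal(x):
    return -0.5 * ((x - mu) / sigma)**2 - np.log(sigma)

x, samples = 0.0, []
for _ in range(5000):
    y = mu + sigma * rng.standard_normal()  # global, non-local proposal
    log_alpha = (log_target(y) - log_target(x)
                 + log_proposal(x) - log_proposal(y))
    if np.log(rng.random()) < log_alpha:
        x = y
    samples.append(x)
```

Because proposals are drawn independently of the current state, a well-trained flow yields global moves that local MCMC kernels cannot make.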
no code implementations • 11 Jul 2021 • Carles Domingo-Enrich, Alberto Bietti, Marylou Gabrié, Joan Bruna, Eric Vanden-Eijnden
In the feature-learning regime, this dual formulation justifies using a two time-scale gradient ascent-descent (GDA) training algorithm in which one updates concurrently the particles in the sample space and the neurons in the parameter space of the energy.
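A toy sketch of the two time-scale idea on a hypothetical quadratic energy: particles take fast noisy gradient steps in sample space while the parameter `theta` takes slower gradient steps in parameter space; none of the constants come from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Fit E_theta(x) = 0.5 * theta * x^2 to data from N(0, 0.5) by updating
# particles (fast time scale) and theta (slow time scale) concurrently.
data = rng.normal(0.0, np.sqrt(0.5), 2000)
particles = rng.standard_normal(2000)
theta, lr_x, lr_theta = 1.0, 0.1, 0.05

for _ in range(2000):
    # Fast: particles follow noisy gradient descent on the energy
    # (a Langevin-style step), approximating samples from the model.
    particles += (-lr_x * theta * particles
                  + np.sqrt(2 * lr_x) * rng.standard_normal(particles.size))
    # Slow: likelihood gradient for theta, contrasting data and particles.
    grad = 0.5 * (np.mean(data**2) - np.mean(particles**2))
    theta -= lr_theta * grad
```

At equilibrium the particle variance matches the data variance, so `theta` settles near the precision of the data (about 2 here, up to discretization bias).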
1 code implementation • NeurIPS 2021 • Stéphane d'Ascoli, Marylou Gabrié, Levent Sagun, Giulio Biroli
One of the central puzzles in modern machine learning is the ability of heavily overparametrized models to generalize well.
no code implementations • 1 Jan 2021 • Hannah Lawrence, David Barmherzig, Henry Li, Michael Eickenberg, Marylou Gabrié
To the best of our knowledge, this is the first work to consider a dataset-free machine learning approach for holographic phase retrieval.
1 code implementation • 14 Dec 2020 • Hannah Lawrence, David A. Barmherzig, Henry Li, Michael Eickenberg, Marylou Gabrié
Phase retrieval is the inverse problem of recovering a signal from magnitude-only Fourier measurements, and underlies numerous imaging modalities, such as Coherent Diffraction Imaging (CDI).
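The forward model can be sketched in a few lines, together with the classical error-reduction iteration (a standard baseline, not the paper's learned approach); the signal length, support size, and iteration count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Magnitude-only Fourier measurements of a signal with known support.
n, support = 64, 16
x_true = np.zeros(n)
x_true[:support] = rng.random(support)
mag = np.abs(np.fft.fft(x_true))  # the phases are lost

# Error reduction: alternate between imposing the measured magnitudes
# in Fourier space and the support constraint in signal space.
x = rng.random(n)  # random initialization
for _ in range(500):
    X = np.fft.fft(x)
    X = mag * np.exp(1j * np.angle(X))  # keep current phases, fix magnitudes
    x = np.real(np.fft.ifft(X))
    x[support:] = 0.0                   # zero outside the known support
```

Classical iterations like this can stagnate or converge to ambiguous solutions, which is part of the motivation for learned approaches to holographic phase retrieval.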
no code implementations • 3 Nov 2019 • Marylou Gabrié
We also provide references for past and current directions of research on neural networks relying on mean-field methods.
2 code implementations • NeurIPS 2018 • Marylou Gabrié, Andre Manoel, Clément Luneau, Jean Barbier, Nicolas Macris, Florent Krzakala, Lenka Zdeborová
We examine a class of deep learning models with a tractable method to compute information-theoretic quantities.
no code implementations • 10 Feb 2017 • Eric W. Tramel, Marylou Gabrié, Andre Manoel, Francesco Caltagirone, Florent Krzakala
Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as building blocks for deep neural architectures.
no code implementations • 13 Jun 2016 • Eric W. Tramel, Andre Manoel, Francesco Caltagirone, Marylou Gabrié, Florent Krzakala
In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals which do not possess a writable correlation model.
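A generic compressed-sensing sketch: recover a $K$-sparse signal from $M < N$ random linear measurements via iterative hard thresholding. This illustrates the measurement setting only, not the paper's structured-signal model, and all constants are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)

# Forward model: y = A x with a K-sparse x and M < N measurements.
N, M, K = 100, 40, 5
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x_true

# Iterative hard thresholding: gradient step, then keep the K largest
# entries in magnitude. Step size 0.3 is a heuristic choice.
x_hat = np.zeros(N)
for _ in range(200):
    x_hat += 0.3 * (A.T @ (y - A @ x_hat))
    idx = np.argsort(np.abs(x_hat))[:-K]
    x_hat[idx] = 0.0
```

The paper's setting replaces the explicit sparsity prior with a learned model when no writable correlation model is available.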
no code implementations • 9 Jun 2015 • Marylou Gabrié, Eric W. Tramel, Florent Krzakala
Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.
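A minimal block-Gibbs sketch for a binary RBM with hypothetical random weights, illustrating the undirected visible-hidden coupling that makes RBMs useful both as density models and as initializations:

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Binary visible units v and hidden units h, coupled by weights W
# with biases b (visible) and c (hidden); all values are arbitrary.
nv, nh = 6, 4
W = 0.1 * rng.standard_normal((nv, nh))
b, c = np.zeros(nv), np.zeros(nh)

def gibbs_step(v):
    # Conditionals factorize across layers, so each layer is
    # sampled in one block given the other.
    h = (rng.random(nh) < sigmoid(v @ W + c)).astype(float)
    v = (rng.random(nv) < sigmoid(W @ h + b)).astype(float)
    return v

v = rng.integers(0, 2, nv).astype(float)
for _ in range(100):
    v = gibbs_step(v)
```

This layer-wise conditional independence is what makes RBM inference tractable despite the model being undirected.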