Search Results for author: Marylou Gabrié

Found 15 papers, 6 papers with code

Stochastic Localization via Iterative Posterior Sampling

1 code implementation • 16 Feb 2024 • Louis Grenioux, Maxence Noble, Marylou Gabrié, Alain Oliviero Durmus

Building on score-based learning, stochastic localization techniques have recently attracted renewed interest.

Denoising
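
For intuition, the localization process can be simulated directly whenever the posterior mean is tractable. The sketch below is a toy illustration only, not the paper's algorithm (in general this posterior mean is unavailable and must itself be estimated, which is where posterior sampling enters): it runs the observation process Y_t = tX + B_t for a two-component Gaussian mixture, where E[X | Y_t] has a closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a two-component Gaussian mixture (weights, means, variances).
w  = np.array([0.5, 0.5])
mu = np.array([-3.0, 3.0])
s2 = np.array([0.5, 0.5])

def posterior_mean(y, t):
    """E[X | Y_t = y] for the observation process Y_t = t*X + B_t."""
    if t == 0.0:
        return np.sum(w * mu)
    prec = 1.0 / s2 + t                    # Gaussian conjugacy per component
    m = (mu / s2 + y) / prec
    var_y = t + t**2 * s2                  # marginal variance of Y_t per component
    logr = np.log(w) - 0.5 * (y - t * mu) ** 2 / var_y - 0.5 * np.log(var_y)
    r = np.exp(logr - logr.max())
    return np.sum(r / r.sum() * m)

def sl_sample(T=50.0, n_steps=2000):
    """Euler scheme for dY_t = E[X | Y_t] dt + dB_t; Y_T / T localizes on a draw."""
    dt, y, t = T / n_steps, 0.0, 0.0
    for _ in range(n_steps):
        y += posterior_mean(y, t) * dt + np.sqrt(dt) * rng.normal()
        t += dt
    return y / T

samples = np.array([sl_sample() for _ in range(200)])
print(samples.mean(), (samples > 0).mean())   # roughly 0.0 and 0.5 here
```

As t grows, Y_t / t concentrates on a single draw from the target, which is what "localization" refers to.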

Active learning of Boltzmann samplers and potential energies with quantum mechanical accuracy

no code implementations • 29 Jan 2024 • Ana Molina-Taborda, Pilar Cossio, Olga Lopez-Acevedo, Marylou Gabrié

Extracting consistent statistics between relevant free-energy minima of a molecular system is essential for physics, chemistry and biology.

Active Learning

Balanced Training of Energy-Based Models with Adaptive Flow Sampling

no code implementations • 1 Jun 2023 • Louis Grenioux, Éric Moulines, Marylou Gabrié

Energy-based models (EBMs) are versatile density estimation models that directly parameterize an unnormalized log density.

Density Estimation, Variational Inference
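
Concretely, an EBM defines only an energy; the normalizing constant is left implicit. A minimal PyTorch sketch of such a parameterization (a generic architecture, not the paper's, and without its adaptive flow-sampling training loop):

```python
import torch
import torch.nn as nn

class EBM(nn.Module):
    """Energy network E_theta; the model density is p(x) ∝ exp(-E_theta(x))."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def unnorm_logpdf(self, x):
        # Unnormalized log density: -E_theta(x); no partition function needed.
        return -self.net(x).squeeze(-1)

ebm = EBM(dim=2)
print(ebm.unnorm_logpdf(torch.randn(8, 2)).shape)  # torch.Size([8])
```

Training such a model requires samples from the current density, which is where the adaptive flow sampler of the paper comes in.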

On Sampling with Approximate Transport Maps

1 code implementation • 9 Feb 2023 • Louis Grenioux, Alain Durmus, Éric Moulines, Marylou Gabrié

Transport maps can ease the sampling of distributions with non-trivial geometries by transforming them into distributions that are easier to handle.
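
A minimal sketch with a hand-built invertible map standing in for a learned one: a curved "banana" target becomes a standard Gaussian under the inverse map, so sampling and density evaluation both reduce to the Gaussian case via the change-of-variables formula (this particular map is volume-preserving, so the Jacobian term vanishes):

```python
import numpy as np

def T(z, b=0.5):
    """Invertible map pushing a standard 2D Gaussian onto a curved 'banana'."""
    return np.stack([z[..., 0], z[..., 1] + b * (z[..., 0] ** 2 - 1.0)], axis=-1)

def T_inv(x, b=0.5):
    return np.stack([x[..., 0], x[..., 1] - b * (x[..., 0] ** 2 - 1.0)], axis=-1)

def log_pushforward(x, b=0.5):
    """log density of T#N(0, I) at x; |det J_T| = 1 for this triangular map."""
    z = T_inv(x, b)
    return -0.5 * np.sum(z ** 2, axis=-1) - np.log(2.0 * np.pi)

# Sampling the curved distribution reduces to sampling a Gaussian.
z = np.random.default_rng(0).normal(size=(1000, 2))
x = T(z)
```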

Local-Global MCMC kernels: the best of both worlds

1 code implementation • 4 Nov 2021 • Sergey Samsonov, Evgeny Lagutin, Marylou Gabrié, Alain Durmus, Alexey Naumov, Eric Moulines

Recent works leveraging learning to enhance sampling have shown promising results, in particular by designing effective non-local moves and global proposals.
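
The idea can be sketched by composing two valid MCMC kernels: a random-walk Metropolis kernel for local exploration and an independence sampler for global jumps between modes. Below, a fixed broad Gaussian stands in for the learned global proposals studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Toy bimodal target: mixture of N(-3, 1) and N(3, 1), unnormalized.
    return np.logaddexp(-0.5 * (x + 3) ** 2, -0.5 * (x - 3) ** 2)

def local_step(x, step=0.5):
    """Random-walk Metropolis: efficient within a mode, rarely crosses between."""
    prop = x + step * rng.normal()
    return prop if np.log(rng.uniform()) < log_target(prop) - log_target(x) else x

def global_step(x, scale=6.0):
    """Independence Metropolis with a broad proposal: non-local mode jumps."""
    prop = scale * rng.normal()
    log_q = lambda y: -0.5 * (y / scale) ** 2
    a = log_target(prop) - log_target(x) + log_q(x) - log_q(prop)
    return prop if np.log(rng.uniform()) < a else x

x, chain = 0.0, []
for i in range(5000):
    x = global_step(x) if i % 5 == 0 else local_step(x)  # interleave the kernels
    chain.append(x)
print(np.mean(np.array(chain) > 0))  # ~0.5: both modes are visited
```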

Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods

no code implementations • ICML Workshop INNF 2021 • Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden

Normalizing flows can generate complex target distributions and thus show promise in many applications in Bayesian statistics as an alternative or complement to MCMC for sampling posteriors.
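
Because a normalizing flow provides both exact samples and exact densities, it can serve as an independence Metropolis–Hastings proposal that is refit to the chain as it runs. In the sketch below a Gaussian refit by moment matching stands in for the flow (a deliberate simplification; the actual approach trains a flow on the chain samples):

```python
import numpy as np

rng = np.random.default_rng(2)
log_post = lambda x: np.logaddexp(-0.5 * (x + 2) ** 2, -0.5 * (x - 2) ** 2)  # toy

# Stand-in for the flow: a Gaussian proposal, periodically refit to the chain.
mu, sigma = 0.0, 5.0

def imh_step(x):
    """Independence MH: the proposal density appears in the acceptance ratio."""
    prop = mu + sigma * rng.normal()
    log_q = lambda y: -0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma)
    a = log_post(prop) - log_post(x) + log_q(x) - log_q(prop)
    return prop if np.log(rng.uniform()) < a else x

x, chain = 0.0, []
for i in range(5000):
    x = imh_step(x)
    chain.append(x)
    if (i + 1) % 500 == 0:  # refit the proposal (flow training, in the real method)
        mu, sigma = np.mean(chain), np.std(chain) + 0.5
```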

Dual Training of Energy-Based Models with Overparametrized Shallow Neural Networks

no code implementations • 11 Jul 2021 • Carles Domingo-Enrich, Alberto Bietti, Marylou Gabrié, Joan Bruna, Eric Vanden-Eijnden

In the feature-learning regime, this dual formulation justifies using a two time-scale gradient ascent-descent (GDA) training algorithm in which one updates concurrently the particles in the sample space and the neurons in the parameter space of the energy.
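
In code, "two time-scale" means the sample-space particles move on a fast scale while the energy parameters move on a slow one, concurrently rather than in nested loops. The sketch below is a generic contrastive scheme in that spirit, not the paper's dual algorithm: a shallow energy, Langevin particle updates, and a slower parameter update (all sizes and step sizes are arbitrary placeholders):

```python
import torch

torch.manual_seed(0)
data = torch.randn(256, 2) + torch.tensor([2.0, 0.0])   # toy dataset

# Shallow (one-hidden-layer) energy: E(x) = a . relu(W x + b) + confinement.
W = torch.randn(32, 2, requires_grad=True)
b = torch.randn(32, requires_grad=True)
a = torch.randn(32, requires_grad=True)

def energy(x):
    # Quadratic confinement keeps this toy energy bounded below.
    return (torch.relu(x @ W.T + b) * a).sum(-1) + 0.5 * (x ** 2).sum(-1)

particles = torch.randn(256, 2)            # samples evolving in x-space
opt = torch.optim.SGD([W, b, a], lr=1e-2)  # slow time scale: parameters
eta = 0.05                                 # fast time scale: particles

for step in range(500):
    # Particle update: Langevin descent on the current energy.
    x = particles.detach().requires_grad_(True)
    g, = torch.autograd.grad(energy(x).sum(), x)
    particles = (x - eta * g + (2 * eta) ** 0.5 * torch.randn_like(x)).detach()

    # Concurrent parameter update: contrastive loss, data vs. model samples.
    loss = energy(data).mean() - energy(particles).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```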

On the interplay between data structure and loss function in classification problems

1 code implementation • NeurIPS 2021 • Stéphane d'Ascoli, Marylou Gabrié, Levent Sagun, Giulio Biroli

One of the central puzzles in modern machine learning is the ability of heavily overparametrized models to generalize well.

Practical Phase Retrieval: Low-Photon Holography with Untrained Priors

no code implementations • 1 Jan 2021 • Hannah Lawrence, David Barmherzig, Henry Li, Michael Eickenberg, Marylou Gabrié

To the best of our knowledge, this is the first work to consider a dataset-free machine learning approach for holographic phase retrieval.

Retrieval

Phase Retrieval with Holography and Untrained Priors: Tackling the Challenges of Low-Photon Nanoscale Imaging

1 code implementation • 14 Dec 2020 • Hannah Lawrence, David A. Barmherzig, Henry Li, Michael Eickenberg, Marylou Gabrié

Phase retrieval is the inverse problem of recovering a signal from magnitude-only Fourier measurements, and underlies numerous imaging modalities, such as Coherent Diffraction Imaging (CDI).

Retrieval
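
The measurement model is compact: only the squared magnitude of the Fourier transform is recorded, and holography places a known reference beside the specimen. A simplified sketch of this forward model (geometry, padding, and photon scaling are toy choices, with Poisson noise mimicking the low-photon regime):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
x = rng.random((n, n))                          # unknown specimen
r = np.zeros((n, n)); r[n // 2, n // 2] = 1.0   # known reference (a pinhole)

# Holographic arrangement: specimen and reference separated in the object plane.
plane = np.hstack([x, np.zeros_like(x), r])

# Magnitude-only Fourier measurement with photon (Poisson) noise.
intensity = np.abs(np.fft.fft2(plane)) ** 2
photons_per_pixel = 1.0                         # low-photon regime
y = rng.poisson(photons_per_pixel * intensity / intensity.mean())
```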

Mean-field inference methods for neural networks

no code implementations • 3 Nov 2019 • Marylou Gabrié

We also provide references for past and current directions of research on neural networks relying on mean-field methods.

A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines

no code implementations • 10 Feb 2017 • Eric W. Tramel, Marylou Gabrié, Andre Manoel, Francesco Caltagirone, Florent Krzakala

Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as building blocks for deep neural architectures.

Denoising
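
For reference, the structure that makes RBMs convenient: visible and hidden units interact only across the bipartition, so both conditionals factorize and block Gibbs sampling alternates between the two layers. A minimal binary RBM sketch (sizes and initialization are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

nv, nh = 6, 4
W = 0.1 * rng.normal(size=(nv, nh))
bv, bh = np.zeros(nv), np.zeros(nh)

def energy(v, h):
    """RBM energy E(v, h) = -v.bv - h.bh - v.W.h; p(v, h) ∝ exp(-E(v, h))."""
    return -(v @ bv + h @ bh + v @ W @ h)

def gibbs_step(v):
    """Block Gibbs sweep exploiting the bipartite structure."""
    h = (rng.uniform(size=nh) < sigmoid(bh + v @ W)).astype(float)
    v = (rng.uniform(size=nv) < sigmoid(bv + W @ h)).astype(float)
    return v, h

v = (rng.uniform(size=nv) < 0.5).astype(float)
for _ in range(100):
    v, h = gibbs_step(v)
print(energy(v, h))
```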

Inferring Sparsity: Compressed Sensing using Generalized Restricted Boltzmann Machines

no code implementations • 13 Jun 2016 • Eric W. Tramel, Andre Manoel, Francesco Caltagirone, Marylou Gabrié, Florent Krzakala

In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals which do not possess a writable correlation model.
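
The setup in code: M < N linear measurements of an N-dimensional, K-sparse signal. The sketch below reconstructs with ISTA, a generic sparsity baseline; the paper instead imposes a learned generalized-RBM prior, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 200, 80, 10                      # signal size, measurements, sparsity

x = np.zeros(N); x[rng.choice(N, K, replace=False)] = rng.normal(size=K)
A = rng.normal(size=(M, N)) / np.sqrt(M)   # random sensing matrix
y = A @ x                                  # M < N linear measurements

# ISTA: iterative soft-thresholding, a baseline sparse-recovery method.
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
xhat, lam = np.zeros(N), 0.01
for _ in range(500):
    xhat = soft(xhat + A.T @ (y - A @ xhat) / L, lam / L)
print(np.linalg.norm(xhat - x) / np.linalg.norm(x))   # relative recovery error
```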

Training Restricted Boltzmann Machines via the Thouless-Anderson-Palmer Free Energy

no code implementations • 9 Jun 2015 • Marylou Gabrié, Eric W. Tramel, Florent Krzakala

Restricted Boltzmann machines are undirected neural networks which have been shown to be effective in many applications, including serving as initializations for training deep multi-layer neural networks.
