no code implementations • 28 Feb 2024 • Giulio Biroli, Tony Bonnaire, Valentin De Bortoli, Marc Mézard
Using statistical physics methods, we study generative diffusion models in the regime where the dimension of the space and the number of data points are large, and the score function has been trained optimally.
no code implementations • 13 Feb 2024 • Valentin De Bortoli, Michael Hutchinson, Peter Wirnsberger, Arnaud Doucet
Denoising Score Matching estimates the score of a noised version of a target distribution by minimizing a regression loss and is widely used to train the popular class of Denoising Diffusion Models.
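As a toy illustration of the regression loss mentioned above (a minimal sketch, not code from the paper): with standard Gaussian data, a Gaussian noising kernel, and a one-parameter linear score model s_a(y) = a·y, the empirical denoising score matching objective is a quadratic in a and can be minimized in closed form. Its minimizer recovers the score of the noised distribution. All variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 2, 200_000, 1.0

# Data from a standard Gaussian; noised samples y = x + sigma * eps.
x = rng.standard_normal((n, d))
eps = rng.standard_normal((n, d))
y = x + sigma * eps

# Linear score model s_a(y) = a * y.  The empirical DSM loss
#   mean || s_a(y) + eps / sigma ||^2
# is quadratic in a, so its minimizer is available in closed form.
a_hat = -np.sum(y * eps / sigma) / np.sum(y * y)

# The noised distribution is N(0, (1 + sigma^2) I), whose score is
# -y / (1 + sigma^2), so a_hat should approach -1 / (1 + sigma^2).
print(a_hat)  # close to -0.5 for sigma = 1
```

The same regression structure underlies training of neural score models, with the linear model replaced by a network and the closed-form minimizer by stochastic gradient descent.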
1 code implementation • 9 Feb 2024 • Angus Phillips, Hai-Dang Dau, Michael John Hutchinson, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet
Denoising diffusion models have become ubiquitous for generative modeling.
no code implementations • 8 Feb 2024 • Pierre Marion, Anna Korba, Peter Bartlett, Mathieu Blondel, Valentin De Bortoli, Arnaud Doucet, Felipe Llinares-López, Courtney Paquette, Quentin Berthet
We present a new algorithm to optimize distributions defined implicitly by parameterized stochastic diffusions.
no code implementations • 12 Nov 2023 • Valentin De Bortoli, Guan-Horng Liu, Tianrong Chen, Evangelos A. Theodorou, Weili Nie
In this paper, we highlight that while flow and bridge matching processes preserve the information of the marginal distributions, they do not necessarily preserve the coupling information unless additional, stronger optimality conditions are met.
1 code implementation • 19 Oct 2023 • Gabriele Corso, Yilun Xu, Valentin De Bortoli, Regina Barzilay, Tommi Jaakkola
In light of the widespread success of generative models, a significant amount of research has gone into speeding up their sampling time.
1 code implementation • 5 Oct 2023 • Marien Renaud, Jiaming Liu, Valentin De Bortoli, Andrés Almansa, Ulugbek S. Kamilov
Posterior sampling has been shown to be a powerful Bayesian approach for solving imaging inverse problems.
no code implementations • 7 Aug 2023 • Joe Benton, Valentin De Bortoli, Arnaud Doucet, George Deligiannidis
We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors) assuming only finite second moments of the data distribution.
1 code implementation • 15 Jun 2023 • Matteo Pariset, Ya-Ping Hsieh, Charlotte Bunne, Andreas Krause, Valentin De Bortoli
Schrödinger bridges (SBs) provide an elegant framework for modeling the temporal evolution of populations in physical, chemical, or biological systems.
1 code implementation • NeurIPS 2023 • Maxence Noble, Valentin De Bortoli, Arnaud Doucet, Alain Durmus
In this paper, we consider an entropic version of mOT with a tree-structured quadratic cost, i.e., a function that can be written as a sum of pairwise cost functions between the nodes of a tree.
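The pairwise building block of such costs is the standard two-marginal entropic OT problem, which is classically solved by Sinkhorn iterations. A minimal sketch (not from the paper, toy discrete marginals and a quadratic cost) of that pairwise case:

```python
import numpy as np

# Two discrete marginals on a grid and a quadratic pairwise cost.
xs = np.linspace(0.0, 1.0, 5)
ys = np.linspace(0.5, 1.5, 5)
a = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # source marginal
b = np.full(5, 0.2)                      # target marginal
C = (xs[:, None] - ys[None, :]) ** 2
eps = 0.5                                # entropic regularization

# Sinkhorn iterations: alternate diagonal scalings of the Gibbs
# kernel K so that both marginal constraints are satisfied.
K = np.exp(-C / eps)
u = np.ones(5)
v = np.ones(5)
for _ in range(500):
    u = a / (K @ v)
    v = b / (K.T @ u)

P = u[:, None] * K * v[None, :]  # entropic transport plan
```

The tree-structured multimarginal problem studied in the paper chains such pairwise couplings along the edges of a tree rather than solving a single two-marginal problem.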
1 code implementation • 11 Apr 2023 • Nic Fishman, Leo Klarner, Valentin De Bortoli, Emile Mathieu, Michael Hutchinson
Denoising diffusion models are a novel class of generative algorithms that achieve state-of-the-art performance across a range of domains, including image generation and text-to-image tasks.
no code implementations • NeurIPS 2023 • Yuyang Shi, Valentin De Bortoli, Andrew Campbell, Arnaud Doucet
However, while it is desirable in many applications to approximate the deterministic dynamic Optimal Transport (OT) map which admits attractive properties, DDMs and FMMs are not guaranteed to provide transports close to the OT map.
1 code implementation • 5 Feb 2023 • Jason Yim, Brian L. Trippe, Valentin De Bortoli, Emile Mathieu, Arnaud Doucet, Regina Barzilay, Tommi Jaakkola
The design of novel protein structures remains a challenge in protein engineering for applications across biomedicine and chemistry.
1 code implementation • 7 Nov 2022 • Joe Benton, Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet
We propose a unifying framework generalising this approach to a wide class of spaces and leading to an original extension of score matching.
1 code implementation • NeurIPS 2023 • Maxence Noble, Valentin De Bortoli, Alain Durmus
In this paper, we propose Barrier Hamiltonian Monte Carlo (BHMC), a version of the HMC algorithm which aims at sampling from a Gibbs distribution $\pi$ on a manifold $\mathrm{M}$, endowed with a Hessian metric $\mathfrak{g}$ derived from a self-concordant barrier.
no code implementations • 28 Sep 2022 • Angus Phillips, Thomas Seror, Michael Hutchinson, Valentin De Bortoli, Arnaud Doucet, Emile Mathieu
Score-based generative modelling (SGM) has proven to be a very effective method for modelling densities on finite-dimensional spaces.
no code implementations • 10 Aug 2022 • Valentin De Bortoli
This does not cover settings where the target distribution is supported on a lower-dimensional manifold or is given by some empirical distribution.
no code implementations • 9 Aug 2022 • Florentin Guth, Simon Coste, Valentin De Bortoli, Stéphane Mallat
This is because of ill-conditioning properties of the score that we analyze mathematically.
no code implementations • 7 Jul 2022 • James Thornton, Michael Hutchinson, Emile Mathieu, Valentin De Bortoli, Yee Whye Teh, Arnaud Doucet
Our proposed method generalizes Diffusion Schrödinger Bridge, introduced in De Bortoli et al. (2021), to the non-Euclidean setting and extends Riemannian score-based models beyond the first time reversal.
1 code implementation • 29 Jun 2022 • Antoine Salmona, Valentin De Bortoli, Julie Delon, Agnès Desolneux
More precisely, we show that the total variation distance and the Kullback-Leibler divergence between the generated and the data distribution are bounded from below by a constant depending on the mode separation and the Lipschitz constant.
1 code implementation • 30 May 2022 • Andrew Campbell, Joe Benton, Valentin De Bortoli, Tom Rainforth, George Deligiannidis, Arnaud Doucet
We provide the first complete continuous time framework for denoising diffusion models of discrete data.
1 code implementation • 27 Feb 2022 • Yuyang Shi, Valentin De Bortoli, George Deligiannidis, Arnaud Doucet
We extend the Schrödinger bridge framework to conditional simulation.
2 code implementations • 6 Feb 2022 • Valentin De Bortoli, Emile Mathieu, Michael Hutchinson, James Thornton, Yee Whye Teh, Arnaud Doucet
Score-based generative models (SGMs) are a powerful class of generative models that exhibit remarkable empirical performance.
no code implementations • 16 Jan 2022 • Rémi Laumont, Valentin De Bortoli, Andrés Almansa, Julie Delon, Alain Durmus, Marcelo Pereyra
Bayesian methods to solve imaging inverse problems usually combine an explicit data likelihood function with a prior distribution that explicitly models expected properties of the solution.
1 code implementation • 14 Nov 2021 • Jeremy Heng, Valentin De Bortoli, Arnaud Doucet, James Thornton
This is known to be a challenging problem that has received much attention in the last two decades.
no code implementations • 25 Oct 2021 • Valentin De Bortoli, Agnès Desolneux
Classical results require the invertibility of the Hessian of $U$ in order to establish such asymptotics.
no code implementations • 18 Aug 2021 • George Deligiannidis, Valentin De Bortoli, Arnaud Doucet
We establish the uniform-in-time stability, w.r.t.
2 code implementations • NeurIPS 2021 • Valentin De Bortoli, James Thornton, Jeremy Heng, Arnaud Doucet
In contrast, solving the Schrödinger Bridge problem (SB), i.e., an entropy-regularized optimal transport problem on path spaces, yields diffusions which generate samples from the data distribution in finite time.
no code implementations • 8 Mar 2021 • Rémi Laumont, Valentin De Bortoli, Andrés Almansa, Julie Delon, Alain Durmus, Marcelo Pereyra
The proposed algorithms are demonstrated on several canonical problems such as image deblurring, inpainting, and denoising, where they are used for point estimation as well as for uncertainty visualisation and quantification.
no code implementations • NeurIPS 2020 • Valentin De Bortoli, Alain Durmus, Xavier Fontaine, Umut Şimşekli
In comparison to previous works on the subject, we consider settings in which the sequence of stepsizes in SGD can potentially depend on the number of neurons and the iterations.
no code implementations • 8 Apr 2020 • Xavier Fontaine, Valentin De Bortoli, Alain Durmus
This paper proposes a thorough theoretical analysis of Stochastic Gradient Descent (SGD) with non-increasing step sizes.
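A minimal numerical sketch (illustrative, not from the paper) of the setting analyzed: SGD on a quadratic objective with noisy gradients and polynomially decreasing step sizes gamma_k = c / k**alpha.

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([1.0, -2.0])

# Minimize f(x) = 0.5 * ||x - target||^2 from noisy gradient
# estimates, using non-increasing step sizes gamma_k = c / k**alpha.
x = np.zeros(2)
c, alpha, noise_std = 0.5, 0.7, 0.5
for k in range(1, 20_001):
    grad = (x - target) + noise_std * rng.standard_normal(2)
    x -= (c / k**alpha) * grad

print(x)  # close to target
```

With alpha in (0, 1) the step sizes are square-summable enough to damp the gradient noise while still summing to infinity, so the iterates approach the minimizer; the paper quantifies such convergence rates as a function of alpha.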
no code implementations • 3 Dec 2019 • Valentin De Bortoli, Agnès Desolneux, Alain Durmus, Bruno Galerne, Arthur Leclaire
Recent years have seen the rise of convolutional neural network techniques in exemplar-based image synthesis.
1 code implementation • 26 Nov 2019 • Ana F. Vidal, Valentin De Bortoli, Marcelo Pereyra, Alain Durmus
In this work, we propose a general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t.
1 code implementation • 28 Oct 2019 • Kimia Nadjahi, Valentin De Bortoli, Alain Durmus, Roland Badeau, Umut Şimşekli
Approximate Bayesian Computation (ABC) is a popular method for approximate inference in generative models with intractable but easy-to-sample likelihood.