Search Results for author: Charles C. Margossian

Found 6 papers, 5 papers with code

An Ordering of Divergences for Variational Inference with Factorized Gaussian Approximations

1 code implementation • 20 Mar 2024 • Charles C. Margossian, Loucas Pillaud-Vivien, Lawrence K. Saul

Our analysis covers the KL divergence, the Rényi divergences, and a score-based divergence that compares $\nabla\log p$ and $\nabla\log q$.

Variational Inference
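The score-based divergence mentioned above compares $\nabla\log p$ and $\nabla\log q$. As an illustrative sketch (not the paper's method): for a Gaussian target and a naively factorized Gaussian approximation, both score functions are linear, so the expected squared score gap can be estimated directly by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p = N(0, Sigma) with correlation; factorized q = N(0, D), D diagonal.
# Matching the marginal variances is a naive (not optimal) factorized choice.
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
D = np.diag(np.diag(Sigma))

Sigma_inv = np.linalg.inv(Sigma)
D_inv = np.linalg.inv(D)

# For zero-mean Gaussians: grad log p(x) = -Sigma^{-1} x, grad log q(x) = -D^{-1} x.
x = rng.multivariate_normal(np.zeros(2), Sigma, size=100_000)
score_gap = x @ (Sigma_inv - D_inv).T  # rows: grad log p - grad log q at each sample

# Monte Carlo estimate of E_p || grad log p - grad log q ||^2
print(np.mean(np.sum(score_gap**2, axis=1)))
```

Because both gradients are linear in $x$, the expectation also has the closed form $\mathrm{tr}\big((\Sigma^{-1}-D^{-1})\,\Sigma\,(\Sigma^{-1}-D^{-1})^\top\big)$, which the Monte Carlo estimate should approach.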

Listening to the Noise: Blind Denoising with Gibbs Diffusion

1 code implementation • 29 Feb 2024 • David Heurtel-Depeiges, Charles C. Margossian, Ruben Ohana, Bruno Régaldo-Saint Blancard

Assuming arbitrary parametric Gaussian noise, we develop a Gibbs algorithm that alternates between sampling steps from a conditional diffusion model, trained to map the signal prior to the family of noise distributions, and a Monte Carlo sampler that infers the noise parameters.

Bayesian Inference • Denoising
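The alternating structure described above can be sketched on a toy problem. In this sketch the conditional diffusion model is replaced by a conjugate Gaussian prior $x \sim N(0, 1)$ so that both Gibbs conditionals are available in closed form; this is a stand-in for illustration only, not the paper's sampler.

```python
import numpy as np

rng = np.random.default_rng(1)

# Blind denoising toy model: y = x + eps, eps ~ N(0, sigma2), sigma2 unknown.
n = 200
true_sigma2 = 0.5
x_true = rng.normal(0.0, 1.0, n)
y = x_true + rng.normal(0.0, np.sqrt(true_sigma2), n)

a0, b0 = 2.0, 1.0   # inverse-gamma hyperparameters for the noise variance
sigma2 = 1.0        # initial guess
for _ in range(500):
    # Step 1: sample the clean signal given data and current noise level.
    # (In Gibbs Diffusion, this role is played by a conditional diffusion model.)
    post_var = 1.0 / (1.0 + 1.0 / sigma2)
    post_mean = post_var * y / sigma2
    x = rng.normal(post_mean, np.sqrt(post_var))
    # Step 2: sample the noise parameters given the residuals.
    resid = y - x
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))

print(sigma2)  # a draw from the chain, which targets the joint posterior
```

With the model well specified, draws of `sigma2` concentrate near the true noise variance as `n` grows.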

Batch and match: black-box variational inference with a score-based divergence

no code implementations • 22 Feb 2024 • Diana Cai, Chirag Modi, Loucas Pillaud-Vivien, Charles C. Margossian, Robert M. Gower, David M. Blei, Lawrence K. Saul

We analyze the convergence of BaM when the target distribution is Gaussian, and we prove that in the limit of infinite batch size the variational parameter updates converge exponentially quickly to the target mean and covariance.

Variational Inference

Amortized Variational Inference: When and Why?

2 code implementations • 20 Jul 2023 • Charles C. Margossian, David M. Blei

We then show, on a broader class of models, how to expand the domain of AVI's inference function to improve its solution, and we provide examples, e.g. hidden Markov models, where the amortization gap cannot be closed.

Bayesian Inference • Gaussian Processes • +1

The Shrinkage-Delinkage Trade-off: An Analysis of Factorized Gaussian Approximations for Variational Inference

1 code implementation • 17 Feb 2023 • Charles C. Margossian, Lawrence K. Saul

We study various manifestations of this trade-off, notably one where, as the dimension of the problem grows, the per-component entropy gap between $p$ and $q$ becomes vanishingly small even though $q$ underestimates every componentwise variance by a constant multiplicative factor.

Variational Inference

A Review of automatic differentiation and its efficient implementation

3 code implementations • 12 Nov 2018 • Charles C. Margossian

Derivatives play a critical role in computational statistics, examples being Bayesian inference using Hamiltonian Monte Carlo sampling and the training of neural networks.

Mathematical Software • Computation
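The core idea behind automatic differentiation can be shown in a few lines. The sketch below implements forward-mode AD with dual numbers (one standard flavor covered in such reviews, not the only one): each value carries a primal and a tangent, and every operation propagates both exactly via the chain rule.

```python
import math
from dataclasses import dataclass

@dataclass
class Dual:
    """A value and its derivative, propagated together."""
    val: float
    dot: float = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

def sin(x: Dual) -> Dual:
    # Chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x: Dual) -> Dual:
    return sin(x) * x + x  # f(x) = x sin(x) + x, so f'(x) = sin(x) + x cos(x) + 1

x = Dual(1.5, 1.0)  # seed the tangent with 1.0 to get df/dx
y = f(x)
print(y.val, y.dot)  # value and exact derivative at x = 1.5
```

Unlike finite differences, the derivative here is exact to machine precision, and the same mechanism scales to the Hamiltonian Monte Carlo and neural-network gradients mentioned in the abstract (usually via reverse mode).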
