Variational Inference

682 papers with code • 2 benchmarks • 6 datasets

Fitting approximate posteriors with variational inference recasts the inference problem as an optimization problem, where the goal is (typically) to maximize the evidence lower bound (ELBO) on the log marginal likelihood of the data.
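
As a minimal sketch of this idea (assuming PyTorch and a toy conjugate model, both illustrative choices), the ELBO can be estimated with Monte Carlo samples and maximized by stochastic gradient ascent:

```python
# Minimal sketch: maximize the ELBO for a toy model p(z, x) = N(z; 0, 1) * N(x; z, 1)
# with a Gaussian approximate posterior q(z) = N(z; mu, sigma^2). Assumes PyTorch.
import torch

x = torch.tensor(2.0)                            # a single observation
mu = torch.zeros(1, requires_grad=True)          # variational mean
log_sigma = torch.zeros(1, requires_grad=True)   # variational log std
opt = torch.optim.Adam([mu, log_sigma], lr=0.05)

for step in range(2000):
    opt.zero_grad()
    sigma = log_sigma.exp()
    eps = torch.randn(64)                        # Monte Carlo noise samples
    z = mu + sigma * eps                         # reparameterized samples from q(z)
    log_p = torch.distributions.Normal(0.0, 1.0).log_prob(z) \
          + torch.distributions.Normal(z, 1.0).log_prob(x)   # log p(z) + log p(x | z)
    log_q = torch.distributions.Normal(mu, sigma).log_prob(z)
    elbo = (log_p - log_q).mean()                # Monte Carlo ELBO estimate
    (-elbo).backward()                           # maximize the ELBO = minimize its negative
    opt.step()

# For this toy model the exact posterior is N(1.0, 0.5), so the fitted mu and sigma
# should approach 1.0 and sqrt(0.5).
```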

Most implemented papers

Auto-Encoding Variational Bayes

microsoft/recommenders 20 Dec 2013

First, we show that a reparameterization of the variational lower bound yields a lower bound estimator that can be straightforwardly optimized using standard stochastic gradient methods.
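
A hedged sketch of this reparameterized (SGVB) estimator, assuming PyTorch and purely illustrative layer sizes; the analytic Gaussian KL term is the one given in the paper's appendix:

```python
# Sketch of the SGVB / reparameterization estimator from Auto-Encoding Variational Bayes.
# Assumes PyTorch; network sizes are illustrative and inputs are assumed binarized in [0, 1].
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)        # q(z|x) mean
        self.logvar = nn.Linear(h_dim, z_dim)    # q(z|x) log-variance
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        eps = torch.randn_like(mu)
        z = mu + (0.5 * logvar).exp() * eps      # reparameterization: z = mu + sigma * eps
        recon = F.binary_cross_entropy_with_logits(self.dec(z), x, reduction="sum")
        # Analytic KL between N(mu, sigma^2) and the N(0, I) prior.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return (recon + kl) / x.size(0)          # negative ELBO per example
```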

Adversarial Autoencoders

eriklindernoren/PyTorch-GAN 18 Nov 2015

In this paper, we propose the "adversarial autoencoder" (AAE), which is a probabilistic autoencoder that uses the recently proposed generative adversarial networks (GAN) to perform variational inference by matching the aggregated posterior of the hidden code vector of the autoencoder with an arbitrary prior distribution.
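
A rough sketch of the three training phases this implies (reconstruction, discriminator, and encoder-as-generator), assuming PyTorch and illustrative MLPs; the prior here is a standard Gaussian, but any samplable prior works the same way:

```python
# Sketch of AAE training, assuming PyTorch; enc, dec and disc are illustrative MLPs.
import torch
import torch.nn as nn
import torch.nn.functional as F

z_dim = 8
enc = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, z_dim))
dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, 784))
disc = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_ae = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(enc.parameters(), lr=1e-3)

def train_step(x):                               # x: (batch, 784), values in [0, 1]
    # 1) Reconstruction phase.
    opt_ae.zero_grad()
    recon = F.binary_cross_entropy_with_logits(dec(enc(x)), x)
    recon.backward()
    opt_ae.step()
    # 2) Regularization phase: discriminator separates prior samples from codes.
    opt_d.zero_grad()
    z_fake = enc(x).detach()
    z_real = torch.randn_like(z_fake)            # samples from the chosen prior, here N(0, I)
    d_loss = F.binary_cross_entropy_with_logits(disc(z_real), torch.ones(x.size(0), 1)) \
           + F.binary_cross_entropy_with_logits(disc(z_fake), torch.zeros(x.size(0), 1))
    d_loss.backward()
    opt_d.step()
    # 3) Encoder (generator) tries to make the aggregated posterior match the prior.
    opt_g.zero_grad()
    g_loss = F.binary_cross_entropy_with_logits(disc(enc(x)), torch.ones(x.size(0), 1))
    g_loss.backward()
    opt_g.step()
```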

Variational Inference with Normalizing Flows

clementchadebec/benchmark_VAE 21 May 2015

The choice of approximate posterior distribution is one of the core problems in variational inference.
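
The paper addresses this by transforming a simple initial posterior through a chain of invertible maps; a minimal sketch of one planar flow layer, assuming PyTorch, is:

```python
# Sketch of a single planar flow layer f(z) = z + u * tanh(w^T z + b), assuming PyTorch.
# Stacking K such layers turns a simple q0(z) into a richer approximate posterior qK(z).
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):                          # z: (batch, dim)
        lin = z @ self.w + self.b                  # w^T z + b, shape (batch,)
        f = z + self.u * torch.tanh(lin).unsqueeze(-1)
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        # log |det df/dz| = log |1 + u^T psi(z)|; subtracting it from log q0 gives log qK.
        # (The paper additionally constrains u so the map is guaranteed invertible.)
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
        return f, log_det
```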

A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

HKUST-KnowComp/R-Net NeurIPS 2016

Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout.
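
In practice this interpretation licenses Monte Carlo dropout: keep dropout active at prediction time and average several stochastic forward passes (for the recurrent case the paper uses the same mask at every time step). A minimal feed-forward sketch, assuming PyTorch:

```python
# Sketch of Monte Carlo dropout at test time, assuming PyTorch: leaving dropout on and
# averaging stochastic forward passes approximates the Bayesian predictive distribution.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(64, 1))

def mc_dropout_predict(x, n_samples=50):
    model.train()                       # keep dropout masks stochastic at prediction time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)  # predictive mean and a simple uncertainty estimate
```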

PointFlow: 3D Point Cloud Generation with Continuous Normalizing Flows

stevenygd/PointFlow ICCV 2019

Specifically, we learn a two-level hierarchy of distributions where the first level is the distribution of shapes and the second level is the distribution of points given a shape.

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

DartML/Stein-Variational-Gradient-Descent NeurIPS 2016

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization.
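
A sketch of one SVGD update with an RBF kernel and the median-heuristic bandwidth, assuming NumPy and a user-supplied gradient of the log target density:

```python
# Sketch of a single SVGD update, assuming NumPy; grad_log_p(particles) returns the
# gradient of the log target density at each particle, shape (n, d).
import numpy as np

def svgd_step(particles, grad_log_p, step_size=0.1):
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]   # x_i - x_j, shape (n, n, d)
    sq_dists = (diffs ** 2).sum(-1)
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8           # median-heuristic bandwidth
    k = np.exp(-sq_dists / h)                                 # RBF kernel matrix
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    attract = k @ grad_log_p(particles)                       # drives particles to high density
    repulse = (2.0 / h) * (k[:, :, None] * diffs).sum(1)      # keeps particles spread apart
    return particles + step_size * (attract + repulse) / n
```

Iterating this update transports the particle set toward the target posterior, with the kernel term preventing the particles from collapsing onto a single mode.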

Topic Modeling in Embedding Spaces

adjidieng/ETM TACL 2020

To this end, we develop the Embedded Topic Model (ETM), a generative model of documents that marries traditional topic models with word embeddings.
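
A compact sketch of the ETM decoder, assuming PyTorch: each topic's word distribution is a softmax over inner products between a topic embedding and the word embeddings, and a document's likelihood mixes these distributions by its topic proportions (the variational encoder producing those proportions is omitted):

```python
# Sketch of the ETM decoder, assuming PyTorch; sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, embed_dim, n_topics = 5000, 300, 50
rho = nn.Parameter(torch.randn(vocab_size, embed_dim) * 0.01)    # word embeddings
alpha = nn.Parameter(torch.randn(n_topics, embed_dim) * 0.01)    # topic embeddings

def log_likelihood(theta, bows):
    # theta: (batch, n_topics) topic proportions from the variational encoder;
    # bows:  (batch, vocab_size) bag-of-words counts.
    beta = F.softmax(alpha @ rho.t(), dim=-1)    # (n_topics, vocab_size) topic-word dists
    word_probs = theta @ beta                    # per-document mixture over topics
    return (bows * torch.log(word_probs + 1e-10)).sum(-1)
```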

Gaussian Processes for Big Data

cornellius-gp/gpytorch 26 Sep 2013

We introduce stochastic variational inference for Gaussian process models.
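
Since the listed implementation is GPyTorch, a condensed sketch of its stochastic variational GP setup might look like the following; the inducing points, kernel, and data size are placeholders:

```python
# Condensed sketch of a stochastic variational GP with GPyTorch; values are placeholders.
import torch
import gpytorch

class SVGPModel(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        variational_distribution = gpytorch.variational.CholeskyVariationalDistribution(
            inducing_points.size(0))
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution, learn_inducing_locations=True)
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

model = SVGPModel(inducing_points=torch.randn(64, 1))   # 64 inducing points, 1-D inputs
likelihood = gpytorch.likelihoods.GaussianLikelihood()
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=10000)
# Training then iterates over mini-batches, maximizing mll(model(x_batch), y_batch),
# which is what makes the method scale to large datasets.
```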

Improving Variational Inference with Inverse Autoregressive Flow

openai/iaf 15 Jun 2016

The framework of normalizing flows provides a general strategy for flexible variational inference of posteriors over latent variables.
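
A rough sketch of a single inverse autoregressive flow step, assuming PyTorch; the strictly lower-triangular masked layers below stand in for the MADE-style autoregressive networks used in the paper:

```python
# Sketch of one IAF step, assuming PyTorch. The masked linear layers make output i depend
# only on inputs with index < i, so the Jacobian is triangular and its log-determinant cheap.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IAFStep(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.m = nn.Linear(dim, dim)
        self.s = nn.Linear(dim, dim)
        mask = torch.tril(torch.ones(dim, dim), diagonal=-1)   # strictly autoregressive mask
        self.register_buffer("mask", mask)

    def forward(self, z, log_q):
        m = F.linear(z, self.m.weight * self.mask, self.m.bias)
        s = F.linear(z, self.s.weight * self.mask, self.s.bias)
        gate = torch.sigmoid(s + 1.5)            # bias toward the identity map at init
        z_new = gate * z + (1 - gate) * m        # autoregressive affine update
        log_q = log_q - torch.log(gate + 1e-8).sum(-1)   # density change from the flow step
        return z_new, log_q
```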

Doubly Stochastic Variational Inference for Deep Gaussian Processes

ICL-SML/Doubly-Stochastic-DGP NeurIPS 2017

Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice.