# Variational Inference

822 papers with code • 1 benchmark • 5 datasets

Fitting approximate posteriors with variational inference transforms inference into an optimization problem: the goal is (typically) to maximize the evidence lower bound (ELBO), a lower bound on the log marginal likelihood of the data.
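The idea can be sketched end-to-end on a toy conjugate model where the exact posterior is known. The setup here is mine, for illustration: prior p(z) = N(0, 1), likelihood p(x|z) = N(z, 1), and a Gaussian approximation q(z) = N(mu, sigma²), for which the ELBO and its gradients have closed forms:

```python
import math

def elbo(mu, sigma, x):
    """Closed-form ELBO for p(z)=N(0,1), p(x|z)=N(z,1), q(z)=N(mu, sigma^2):
    E_q[log p(x|z)] + E_q[log p(z)] + entropy of q."""
    expected_loglik = -0.5 * math.log(2 * math.pi) - 0.5 * ((x - mu) ** 2 + sigma ** 2)
    expected_logprior = -0.5 * math.log(2 * math.pi) - 0.5 * (mu ** 2 + sigma ** 2)
    entropy = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
    return expected_loglik + expected_logprior + entropy

def fit(x, lr=0.05, steps=2000):
    """Gradient ascent on the ELBO, using its hand-derived gradients."""
    mu, sigma = 0.0, 1.0
    for _ in range(steps):
        grad_mu = (x - mu) - mu                  # d ELBO / d mu
        grad_sigma = 1.0 / sigma - 2.0 * sigma   # d ELBO / d sigma
        mu += lr * grad_mu
        sigma += lr * grad_sigma
    return mu, sigma

mu, sigma = fit(x=2.0)
```

For x = 2 the exact posterior is N(1, 1/2), and gradient ascent on the ELBO recovers it; in non-conjugate models the expectations are instead estimated by Monte Carlo.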

## Libraries

Use these libraries to find Variational Inference models and implementations.

## Most implemented papers

### Auto-Encoding Variational Bayes

First, we show that a reparameterization of the variational lower bound yields a lower bound estimator that can be straightforwardly optimized using standard stochastic gradient methods.
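A sketch of the estimator this describes: writing z = mu + sigma·eps with eps ~ N(0, 1) moves the variational parameters inside a deterministic function, so the gradient of E_q[f(z)] can be estimated by differentiating through the samples. This is a toy 1-D illustration with plain Monte Carlo; the names are mine:

```python
import random
import statistics

def reparam_grad(mu, sigma, f_grad, n=10000, seed=0):
    """Monte Carlo gradient of E_{z ~ N(mu, sigma^2)}[f(z)] w.r.t. mu via the
    reparameterization z = mu + sigma * eps, eps ~ N(0, 1):
    d/dmu E[f(z)] = E[f'(mu + sigma * eps)]."""
    rng = random.Random(seed)
    samples = [f_grad(mu + sigma * rng.gauss(0, 1)) for _ in range(n)]
    return statistics.fmean(samples)

# For f(z) = z^2 we have E[f(z)] = mu^2 + sigma^2, so the exact gradient is 2*mu.
g = reparam_grad(mu=1.5, sigma=0.5, f_grad=lambda z: 2.0 * z)
```

In a VAE the same trick is applied per latent dimension, with autodiff supplying f'.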

### Adversarial Autoencoders

In this paper, we propose the "adversarial autoencoder" (AAE), a probabilistic autoencoder that uses generative adversarial networks (GANs) to perform variational inference by matching the aggregated posterior of the autoencoder's hidden code vector to an arbitrary prior distribution.

### Variational Inference with Normalizing Flows

The choice of approximate posterior distribution is one of the core problems in variational inference.
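One concrete way this paper enriches the approximate posterior is the planar flow. A minimal 1-D sketch (parameter values are arbitrary) applies the transform together with its log-determinant correction, checked against a finite-difference derivative:

```python
import math

def planar_flow(z, u, w, b):
    """1-D planar flow f(z) = z + u * tanh(w*z + b).
    Returns (f(z), log|df/dz|); the log-det term is what the change of
    variables subtracts from log q0(z) to get the transformed log-density."""
    h = math.tanh(w * z + b)
    f = z + u * h
    logdet = math.log(abs(1.0 + u * w * (1.0 - h * h)))
    return f, logdet

# Check the analytic log-det against a central finite difference.
z, u, w, b = 0.3, 0.5, 2.0, -0.1
f, logdet = planar_flow(z, u, w, b)
eps = 1e-6
fd = (planar_flow(z + eps, u, w, b)[0] - planar_flow(z - eps, u, w, b)[0]) / (2 * eps)
```

Stacking many such transforms on a simple base distribution yields a much richer family of posteriors at the cost of accumulating these log-det terms.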

### A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout.
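The dropout-as-approximate-Bayesian-inference view suggests Monte Carlo dropout: keep dropout active at test time and read predictive uncertainty off the spread of stochastic forward passes. A toy sketch with a hand-rolled linear model (not the paper's recurrent setting; all names and values are mine):

```python
import random
import statistics

def mc_dropout_predict(x, weights, p=0.5, passes=1000, seed=0):
    """Monte Carlo dropout: average many stochastic forward passes with
    dropout left on; the standard deviation across passes is a crude
    estimate of model uncertainty. Toy model: y = sum_i w_i * x_i, with
    each weight dropped with probability p (inverted-dropout scaling)."""
    rng = random.Random(seed)
    preds = []
    for _ in range(passes):
        y = sum(w * xi * (0.0 if rng.random() < p else 1.0 / (1.0 - p))
                for w, xi in zip(weights, x))
        preds.append(y)
    return statistics.fmean(preds), statistics.stdev(preds)

mean, std = mc_dropout_predict([1.0, 2.0], [0.5, -0.25])
```

The mean matches the deterministic prediction in expectation, while the spread would collapse to zero if dropout were disabled at test time.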

### Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization.
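The SVGD update is simple enough to state in a few lines: each particle follows a kernel-averaged gradient of the log target plus a repulsive kernel-gradient term that keeps the particles from collapsing onto the mode. A 1-D sketch with an RBF kernel and a standard-normal target (step size, bandwidth, and particle count are arbitrary choices of mine):

```python
import math
import statistics

def svgd_step(particles, grad_logp, eps=0.05, h=1.0):
    """One Stein variational gradient descent update with an RBF kernel
    k(x, y) = exp(-(x - y)^2 / (2 h^2)). The first term drives particles
    toward high density; the second (kernel-gradient) term repels them."""
    n = len(particles)
    new = []
    for xi in particles:
        phi = 0.0
        for xj in particles:
            k = math.exp(-((xj - xi) ** 2) / (2 * h * h))
            phi += k * grad_logp(xj) + (xi - xj) / (h * h) * k
        new.append(xi + eps * phi / n)
    return new

# Target N(0, 1), so grad log p(x) = -x; start from particles biased right.
particles = [2.0 + 0.1 * i for i in range(20)]
for _ in range(1000):
    particles = svgd_step(particles, lambda x: -x)
```

With a single particle the repulsive term vanishes and the update reduces to plain gradient ascent on log p, which is the "natural counterpart of gradient descent" the abstract refers to.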

### PointFlow: 3D Point Cloud Generation with Continuous Normalizing Flows

Specifically, we learn a two-level hierarchy of distributions where the first level is the distribution of shapes and the second level is the distribution of points given a shape.

### Topic Modeling in Embedding Spaces

To this end, we develop the Embedded Topic Model (ETM), a generative model of documents that marries traditional topic models with word embeddings.

### Gaussian Processes for Big Data

We introduce stochastic variational inference for Gaussian process models.

### Learning Latent Dynamics for Planning from Pixels

Planning has been very successful for control tasks with known environment dynamics.

### Improving Variational Inference with Inverse Autoregressive Flow

The framework of normalizing flows provides a general strategy for flexible variational inference of posteriors over latent variables.
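The core IAF transform and its cheap log-determinant can be sketched directly: because each output dimension depends only on earlier *input* dimensions, the Jacobian is triangular and its log-determinant is just a sum of log scales. The autoregressive functions below are hypothetical stand-ins for the masked networks used in practice:

```python
import math

def iaf_step(z, mu_fn, sigma_fn):
    """One inverse autoregressive flow transform:
        z'_i = mu_i(z_{<i}) + sigma_i(z_{<i}) * z_i,
    where mu_i and sigma_i see only earlier dimensions of the *input* z.
    The Jacobian is therefore triangular and log|det| = sum_i log sigma_i."""
    out, logdet = [], 0.0
    for i, zi in enumerate(z):
        prefix = z[:i]  # autoregressive conditioning on earlier inputs only
        mu, sigma = mu_fn(i, prefix), sigma_fn(i, prefix)
        out.append(mu + sigma * zi)
        logdet += math.log(sigma)
    return out, logdet

# Hypothetical toy autoregressive functions for a 3-D latent.
mu_fn = lambda i, prefix: 0.1 * sum(prefix)
sigma_fn = lambda i, prefix: math.exp(0.2 * sum(prefix))  # always positive
z, logdet = iaf_step([0.5, -1.0, 2.0], mu_fn, sigma_fn)
```

Since all dimensions read from the already-known input z, the transform is computed in parallel at sampling time, which is what makes IAF attractive inside a VAE decoder path.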