Variationally Inferred Sampling Through a Refined Bound

A framework for efficient Bayesian inference in probabilistic programs is introduced by embedding a sampler inside a variational posterior approximation. Its strengths are ease of implementation and the automatic tuning of sampler parameters to speed up mixing time. Several strategies for approximating the evidence lower bound (ELBO) are introduced, including a rewriting of the ELBO objective. The framework is evaluated experimentally on an unconditional VAE for density estimation tasks; on an influence diagram solved in a high-dimensional space with a conditional variational autoencoder (cVAE) acting as a deep Bayes classifier; and on state-space models for time-series data.
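
As a rough illustration of the mechanism described in the abstract, the sketch below (a toy PyTorch example, not the authors' implementation) draws a sample from an initial Gaussian variational approximation, refines it with a few differentiable SGLD steps toward a toy target density, and maximises a surrogate bound in which the initial density stands in for the intractable density of the refined sample. The names and hyperparameters (log_joint, refine, T, log_step, the step count and learning rates) are illustrative assumptions.

```python
# Minimal sketch (assumed names and hyperparameters, not the authors' code):
# draw z0 from an initial Gaussian approximation q0, refine it with a few
# differentiable SGLD steps towards the target log-density, and optimise a
# surrogate ELBO in which log q0(z0) stands in for the intractable density
# of the refined sample.
import math
import torch

torch.manual_seed(0)
D, T = 2, 5                      # latent dimension and number of sampler steps (assumed)

def log_joint(z):
    # Toy unnormalised target log p(x, z); a real model would condition on data x.
    return -0.5 * ((z - 1.0) ** 2).sum(-1)

# Initial variational approximation q0(z) = N(mu, diag(sigma^2)).
mu = torch.zeros(D, requires_grad=True)
log_sigma = torch.zeros(D, requires_grad=True)
# Sampler parameter (SGLD step size), tuned jointly with the variational parameters.
log_step = torch.tensor(math.log(1e-2), requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma, log_step], lr=1e-2)

def refine(z):
    """A few SGLD steps, kept differentiable so gradients flow back to the parameters."""
    step = log_step.exp()
    for _ in range(T):
        grad = torch.autograd.grad(log_joint(z).sum(), z, create_graph=True)[0]
        z = z + 0.5 * step * grad + step.sqrt() * torch.randn_like(z)
    return z

for it in range(500):
    opt.zero_grad()
    eps = torch.randn(64, D)
    z0 = mu + log_sigma.exp() * eps            # reparameterised draw from q0
    z = refine(z0)                             # sampler embedded in the posterior
    log_q0 = (-0.5 * eps.pow(2) - log_sigma - 0.5 * math.log(2 * math.pi)).sum(-1)
    elbo = (log_joint(z) - log_q0).mean()      # surrogate (refined-bound style) objective
    (-elbo).backward()
    opt.step()
```

Because the refinement steps stay inside the computation graph, one gradient update adjusts the sampler step size alongside the variational parameters, which is the sense in which sampler parameters can be tuned automatically to speed up mixing.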
