Partial Rejection Control for Robust Variational Inference in Sequential Latent Variable Models

1 Jan 2021  ·  Rahul Sharma, Soumya Banerjee, Dootika Vats, Piyush Rai

Effective variational inference crucially depends on a flexible variational family of distributions. Recent work has explored sequential Monte Carlo (SMC) methods to construct variational distributions that can, in principle, approximate the target posterior arbitrarily well, an approach that is especially appealing for models with inherent sequential structure. However, SMC, which represents the posterior by a weighted set of particles, often suffers from particle weight degeneracy, leading to high variance in the resulting estimators. To address this issue, we present a novel approach that leverages \emph{partial} rejection control (PRC) to develop a robust variational inference framework. Although PRC yields a low-variance estimator of the marginal likelihood, unbiased estimators are not available in the literature for arbitrary variational posteriors. We solve this issue by employing a \emph{dice-enterprise}, a generalization of the \emph{Bernoulli factory}, to construct unbiased estimators for SMC-PRC. The resulting variational lower bound can be optimized efficiently with respect to the variational parameters. We establish theoretical properties of the lower bound and report experiments on sequential models, such as the Gaussian state-space model and the variational RNN, on which our approach outperforms existing methods.
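To make the weight-degeneracy problem the abstract refers to concrete, the following is a minimal sketch (not the paper's method) of a standard bootstrap particle filter for a 1-D Gaussian state-space model. The function name, model parameters, and the effective-sample-size (ESS) diagnostic are illustrative assumptions; a collapsing ESS is the symptom that motivates rejection-control refinements such as PRC.

```python
import numpy as np

def smc_log_marginal(y, num_particles=100, phi=0.9, q=0.5, r=1.0, seed=0):
    """Bootstrap particle filter for the (hypothetical) model
        x_t = phi * x_{t-1} + N(0, q),   y_t = x_t + N(0, r).
    Returns an unbiased estimate of log p(y_{1:T}) and per-step effective
    sample sizes; ESS values collapsing toward 1 indicate weight degeneracy.
    """
    rng = np.random.default_rng(seed)
    # Initialize particles from the stationary distribution of the AR(1) state.
    x = rng.normal(0.0, np.sqrt(q / (1.0 - phi**2)), size=num_particles)
    log_z = 0.0
    ess_history = []
    for t in range(len(y)):
        # Propagate particles through the transition kernel (the proposal here).
        x = phi * x + rng.normal(0.0, np.sqrt(q), size=num_particles)
        # Weight each particle by the observation likelihood N(y_t | x_t, r).
        log_w = -0.5 * ((y[t] - x) ** 2 / r + np.log(2.0 * np.pi * r))
        # Accumulate the log marginal-likelihood estimate (log-sum-exp trick).
        m = log_w.max()
        log_z += m + np.log(np.mean(np.exp(log_w - m)))
        # Normalize weights and record the effective sample size.
        w = np.exp(log_w - m)
        w /= w.sum()
        ess_history.append(1.0 / np.sum(w**2))
        # Multinomial resampling: duplicates high-weight particles and drops
        # low-weight ones -- the step that rejection control aims to improve.
        x = rng.choice(x, size=num_particles, p=w)
    return log_z, ess_history
```

In a variational-SMC setting, this log marginal-likelihood estimate plays the role of the lower bound being optimized, so reducing its variance (e.g. via PRC) directly tightens the optimization signal.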

