Partial Rejection Control for Robust Variational Inference in Sequential Latent Variable Models
Effective variational inference depends crucially on a flexible variational family of distributions. Recent work has explored sequential Monte Carlo (SMC) methods to construct variational distributions that can, in principle, approximate the target posterior arbitrarily well, which is especially appealing for models with inherent sequential structure. However, SMC, which represents the posterior using a weighted set of particles, often suffers from particle weight degeneracy, leading to a large variance of the resulting estimators. To address this issue, we present a novel approach that leverages the idea of \emph{partial} rejection control (PRC) to develop a robust variational inference framework. Although PRC yields a low-variance estimator of the marginal likelihood, no unbiased estimators are available in the literature for arbitrary variational posteriors. We solve this issue by employing a \emph{dice-enterprise}, a generalization of the \emph{Bernoulli factory}, to construct unbiased estimators for SMC-PRC. The resulting variational lower bound can be optimized efficiently with respect to the variational parameters. We establish theoretical properties of the lower bound and report experiments on a range of sequential models, including Gaussian state-space models and the variational RNN, on which our approach outperforms existing methods.
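To make the abstract's setting concrete, the following is a minimal, self-contained sketch of bootstrap SMC with a classical partial rejection-control step on a toy linear-Gaussian state-space model. All model parameters, the threshold `c`, and the helper names are illustrative assumptions, not taken from the paper; in particular, the simple `max(w, c)` weight adjustment used here is exactly the step whose bias for general proposals motivates the paper's dice-enterprise construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian state-space model (illustrative, not from the paper):
#   x_t = A * x_{t-1} + N(0, Q),   y_t = x_t + N(0, R^2)
A, Q, R = 0.9, 1.0, 0.5

def log_lik(y, x):
    # Gaussian observation log-density log N(y; x, R^2)
    return -0.5 * np.log(2 * np.pi * R**2) - 0.5 * (y - x) ** 2 / R**2

def smc_prc_step(x_prev, y, c):
    """One bootstrap-SMC step with partial rejection control:
    a proposed particle with weight w is accepted with probability
    min(1, w / c); rejected particles are redrawn from the proposal.
    The naive max(w, c) adjustment below is NOT unbiased in general,
    which is the gap the paper addresses."""
    n = len(x_prev)
    x = np.empty(n)
    w = np.empty(n)
    for i in range(n):
        while True:
            xi = A * x_prev[i] + rng.normal(0.0, np.sqrt(Q))  # bootstrap proposal
            wi = np.exp(log_lik(y, xi))
            if rng.uniform() < min(1.0, wi / c):  # PRC acceptance test
                x[i] = xi
                w[i] = max(wi, c)  # adjusted weight for accepted particles
                break
    return x, w

# Simulate a short observation sequence from the model
T, n_particles = 20, 200
xs_true = np.zeros(T)
for t in range(1, T):
    xs_true[t] = A * xs_true[t - 1] + rng.normal(0.0, np.sqrt(Q))
ys = xs_true + rng.normal(0.0, R, size=T)

# Run the filter, accumulating a running log marginal-likelihood estimate
x = rng.normal(0.0, 1.0, size=n_particles)
log_Z = 0.0
for t in range(T):
    x, w = smc_prc_step(x, ys[t], c=1e-3)
    log_Z += np.log(w.mean())
    x = rng.choice(x, size=n_particles, p=w / w.sum())  # multinomial resampling

print(f"log marginal-likelihood estimate: {log_Z:.2f}")
```

Because every weight entering the estimate is floored at `c`, the per-step weight variance shrinks, which is the variance-reduction effect of PRC; turning the resulting bound into an unbiased, optimizable objective for arbitrary proposals is the paper's contribution.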