Statistical Inference in Mean-Field Variational Bayes

4 Nov 2019 · Wei Han, Yun Yang

We conduct a non-asymptotic analysis of mean-field variational inference for approximating posterior distributions in complex Bayesian models that may involve latent variables. We show that the mean-field approximation to the posterior is itself well approximated, in the Kullback-Leibler (KL) divergence, by a normal distribution centered at the maximum likelihood estimator (MLE). In particular, our results imply that the center of the mean-field approximation matches the MLE up to higher-order terms, so there is essentially no loss of efficiency in using it as a point estimator for the parameter in any regular parametric model with latent variables. We also propose a new class of variational weighted likelihood bootstrap (VWLB) methods for quantifying the uncertainty in mean-field variational inference. VWLB can be viewed as a new sampling scheme that produces independent samples approximating the posterior. Compared with traditional sampling algorithms such as Markov chain Monte Carlo, VWLB can be implemented in parallel and is free of tuning parameters.
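The abstract does not spell out VWLB in detail, but the weighted-likelihood-bootstrap mechanism it builds on can be sketched on a toy model. The Python sketch below is an illustration of that mechanism under simplifying assumptions, not the authors' algorithm: it uses a Gaussian-mean model where the weighted MLE has a closed form, and names such as `vwlb_samples` are hypothetical. Each draw re-weights the log-likelihood with i.i.d. Exp(1) weights and records the weighted MLE; because the draws are mutually independent, they can be generated in parallel with no tuning parameters.

```python
import numpy as np

def weighted_mle_gaussian_mean(x, w):
    """Weighted MLE of the mean of N(mu, 1):
    argmax_mu sum_i w_i * log N(x_i | mu, 1), i.e. the weighted average."""
    return np.sum(w * x) / np.sum(w)

def vwlb_samples(x, n_draws=1000, seed=None):
    """Toy weighted-likelihood-bootstrap sampler (illustrative only).

    Each draw re-weights the log-likelihood with i.i.d. Exp(1) weights
    and returns the weighted MLE. Draws are mutually independent, so
    the loop is embarrassingly parallel and has no tuning parameters."""
    rng = np.random.default_rng(seed)
    n = len(x)
    samples = np.empty(n_draws)
    for b in range(n_draws):
        w = rng.exponential(scale=1.0, size=n)  # random likelihood weights
        samples[b] = weighted_mle_gaussian_mean(x, w)
    return samples

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.0, size=200)  # simulated data
    draws = vwlb_samples(x, n_draws=2000, seed=1)
    # Under a flat prior the exact posterior of mu is N(x_bar, 1/n),
    # so the spread of the draws should be close to 1/sqrt(n).
    print("approx posterior mean:", draws.mean())
    print("approx posterior sd:  ", draws.std(), "vs 1/sqrt(n) =", 1 / np.sqrt(len(x)))
```

In this conjugate toy case the bootstrap draws can be checked directly against the exact posterior; in the models the paper targets, the closed-form weighted MLE would be replaced by a mean-field variational fit of the weighted likelihood.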



