no code implementations • 23 Jun 2021 • Prateek Jaiswal, Harsha Honnappa, Vinayak A. Rao
Bayesian posteriors afford a principled mechanism to incorporate data and prior knowledge into stochastic optimization problems.
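As a minimal illustration of this idea (a hypothetical conjugate-normal setup, not the paper's model): form a posterior over an unknown cost parameter, then choose the decision minimizing the posterior-expected cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: cost c(z, theta) = (z - theta)^2 with unknown theta.
# Conjugate normal model: prior theta ~ N(mu0, tau0^2), data ~ N(theta, sigma^2).
mu0, tau0, sigma = 0.0, 2.0, 1.0
data = rng.normal(1.5, sigma, size=20)

# Posterior N(mu_n, 1/prec) combines prior and data precisions.
prec = 1 / tau0**2 + len(data) / sigma**2
mu_n = (mu0 / tau0**2 + data.sum() / sigma**2) / prec

# Decision rule: minimize a Monte Carlo estimate of the posterior-expected
# cost over a grid of candidate decisions z.
thetas = rng.normal(mu_n, prec**-0.5, size=5000)
zs = np.linspace(-1, 3, 401)
exp_cost = ((zs[:, None] - thetas[None, :]) ** 2).mean(axis=1)
z_star = zs[exp_cost.argmin()]
```

For squared loss the optimal decision is the posterior mean, so `z_star` should land near `mu_n`; richer cost functions make the posterior-expectation step genuinely necessary.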
no code implementations • 13 Jan 2021 • Imon Banerjee, Vinayak A. Rao, Harsha Honnappa
We present a PAC-Bayesian analysis of variational Bayes (VB) approximations to tempered Bayesian posterior distributions, bounding the model risk of the VB approximations.
Statistics Theory
no code implementations • AABI Symposium 2019 • Prateek Jaiswal, Harsha Honnappa, Vinayak A. Rao
We study system design problems stated as parameterized stochastic programs with a chance-constraint set.
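A chance constraint of this flavor can be sketched with a sample-based (scenario) approximation; the demand distribution and capacity interpretation below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical chance constraint: find the smallest capacity z with
# P(demand <= z) >= 1 - alpha, for an assumed lognormal demand.
alpha = 0.05
demand = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)

# Sample-average approximation: the empirical (1 - alpha)-quantile is the
# smallest z satisfying the constraint on the sampled scenarios.
z = np.quantile(demand, 1 - alpha)

# Empirical coverage check of the chance constraint.
coverage = (demand <= z).mean()
```

In a parameterized stochastic program, `z` would enter an objective as well, with the quantile step acting as the feasibility restriction.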
no code implementations • 4 Nov 2019 • Prateek Jaiswal, Harsha Honnappa, Vinayak A. Rao
We also establish the asymptotic consistency of decision rules obtained from a `naive' variational Bayesian procedure.
no code implementations • 5 Feb 2019 • Prateek Jaiswal, Vinayak A. Rao, Harsha Honnappa
We study the asymptotic consistency properties of $\alpha$-R\'enyi approximate posteriors, a class of variational Bayesian methods that approximate an intractable Bayesian posterior with a member of a tractable family of distributions, the member chosen to minimize the $\alpha$-R\'enyi divergence from the true posterior.
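The minimization step can be sketched with Monte Carlo, using the identity $D_\alpha(p\|q) = \frac{1}{\alpha-1}\log E_q[(p/q)^\alpha]$, which only needs samples from the tractable family $q$. The Student-t "posterior" and the mean-zero Gaussian family below are illustrative choices, not the paper's setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical intractable "posterior" p: a Student-t with 5 df.
# Tractable family: N(0, s^2), indexed by the scale s.
alpha = 0.5          # alpha < 1 keeps the Monte Carlo estimate well behaved
p = stats.t(df=5)

def renyi(scale, n=50_000):
    """Monte Carlo estimate of D_alpha(p || N(0, scale^2))."""
    x = rng.normal(0.0, scale, size=n)
    log_ratio = p.logpdf(x) - stats.norm.logpdf(x, 0.0, scale)
    # log-mean-exp for numerical stability
    m = (alpha * log_ratio).max()
    return (m + np.log(np.exp(alpha * log_ratio - m).mean())) / (alpha - 1)

# Pick the family member minimizing the estimated divergence.
scales = np.linspace(0.8, 2.0, 25)
best = min(scales, key=renyi)
```

For $\alpha < 1$ the objective is mass-covering, so the selected scale sits near the heavy-tailed target's spread; $\alpha > 1$ would instead penalize the approximation for placing mass where the target has little.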
no code implementations • NeurIPS 2017 • Boqian Zhang, Jiangwei Pan, Vinayak A. Rao
Markov jump processes are continuous-time stochastic processes widely used in statistical applications in the natural sciences, and more recently in machine learning.
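A Markov jump process can be simulated exactly with the standard hold-and-jump scheme (exponential holding times, then a jump drawn from the normalized off-diagonal rates); the 3-state generator below is a made-up example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 3-state generator matrix Q: rows sum to zero, off-diagonal
# entries are jump rates, -Q[i, i] is the total exit rate from state i.
Q = np.array([[-1.0, 0.6, 0.4],
              [0.3, -0.8, 0.5],
              [0.2, 0.7, -0.9]])

def simulate(q, state, t_end):
    """Return [(jump_time, state), ...] for one trajectory on [0, t_end)."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -q[state, state]
        t += rng.exponential(1.0 / rate)        # exponential holding time
        if t >= t_end:
            return path
        probs = q[state].clip(min=0.0) / rate   # normalized jump rates
        state = rng.choice(len(q), p=probs)
        path.append((t, state))

path = simulate(Q, 0, 10.0)
```

Inference for such models (as in the paper) typically runs this generative process in reverse, e.g. by sampling trajectories conditioned on observations.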