All in the (Exponential) Family: Information Geometry and Thermodynamic Variational Inference

While the Evidence Lower Bound (ELBO) has become a ubiquitous objective for variational inference, the recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a tighter and more general family of bounds. In previous work, the tightness of these bounds was not known, grid search was used to choose a "schedule" of intermediate distributions, and model learning suffered with ostensibly tighter bounds. We interpret the geometric mixture curve common to the TVO and related path sampling methods using the geometry of exponential families, which allows us to characterize the gap in TVO bounds as a sum of KL divergences along a given path. Further, we propose a principled technique for choosing intermediate distributions using equal spacing in the moment parameters of our exponential family. We demonstrate that this scheduling approach adapts to the shape of the integrand defining the TVO objective and improves overall performance. Additionally, we derive a reparameterized gradient estimator which empirically allows the TVO to benefit from additional, well-chosen partitions. Finally, we provide a unified framework for understanding thermodynamic integration and the TVO in terms of Taylor series remainders.
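To make the setup concrete, the following is a minimal sketch (not the paper's implementation) of the TVO bounds on a toy discrete model, where the geometric mixture path pi_beta(z) ∝ q(z|x)^(1-beta) p(x,z)^beta is tractable. The left and right Riemann sums of the thermodynamic integrand lower- and upper-bound log p(x); the integrand itself equals the moment parameter of the path's exponential family, which is what the proposed schedule spaces evenly. All variable names here are illustrative assumptions.

```python
import numpy as np

# Toy discrete latent: z has finite support, so everything is exact.
rng = np.random.default_rng(0)
K = 10
q = rng.dirichlet(np.ones(K))          # variational distribution q(z|x)
logw = rng.normal(size=K)              # log p(x,z), the unnormalized joint
log_px = np.log(np.exp(logw).sum())    # true log evidence log p(x)

def integrand(beta):
    """E_{pi_beta}[log p(x,z) - log q(z|x)]: the TVO integrand,
    i.e. d/dbeta log Z(beta) along the geometric mixture path."""
    logits = (1 - beta) * np.log(q) + beta * logw
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                      # normalize pi_beta
    return pi @ (logw - np.log(q))

# Discretize beta in [0, 1]; the integrand is non-decreasing in beta
# (its derivative is a variance), so the left Riemann sum is a lower
# bound on log p(x) and the right Riemann sum is an upper bound.
betas = np.linspace(0.0, 1.0, 51)
f = np.array([integrand(b) for b in betas])
db = np.diff(betas)
tvo_lower = np.sum(f[:-1] * db)        # left Riemann sum (TVO lower bound)
tvo_upper = np.sum(f[1:] * db)         # right Riemann sum (TVO upper bound)

assert tvo_lower <= log_px <= tvo_upper
print(tvo_lower, log_px, tvo_upper)
```

Note that with a single partition the lower bound reduces to the integrand at beta = 0, which is exactly the ELBO; adding well-placed partitions tightens the bound toward log p(x).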
