Variational Inference via Resolution of Singularities

29 Sep 2021 · Susan Wei

Predicated on the premise that neural networks are best viewed as singular statistical models, we propose a new variational approximation for Bayesian neural networks. The approximation relies on a central result from singular learning theory: after an algebraic-geometrical transformation known as a desingularization map, the posterior distribution over the parameters of a singular model is asymptotically a mixture of standard forms. We then demonstrate that a generalized gamma mean-field variational family, following desingularization, can recover the leading-order term of the model evidence. Affine coupling layers are employed to learn the unknown desingularization map, effectively rendering the proposed methodology a normalizing flow with the generalized gamma as the source distribution.
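The construction described above can be illustrated with a minimal sketch: draw base samples coordinate-wise from a generalized gamma distribution (mirroring the mean-field structure) and push them through one affine coupling layer, whose triangular Jacobian makes the log-determinant cheap to compute. The linear "networks" `scale_net` and `shift_net` below are hypothetical stand-ins for the small neural nets a real implementation would train; parameter values are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(0)

def affine_coupling(x, scale_net, shift_net):
    """One affine coupling layer: split x into halves (x1, x2);
    x1 passes through unchanged, x2 is affinely transformed with
    parameters computed from x1. The Jacobian is triangular, so
    log|det J| is just the sum of the log-scales."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = scale_net(x1)            # log-scale, conditioned on x1
    t = shift_net(x1)            # shift, conditioned on x1
    y2 = x2 * np.exp(s) + t
    log_det = s.sum(axis=-1)
    return np.concatenate([x1, y2], axis=-1), log_det

# Hypothetical linear maps standing in for trainable conditioner nets.
W_s = rng.normal(size=(2, 2)) * 0.1
W_t = rng.normal(size=(2, 2)) * 0.1
scale_net = lambda h: h @ W_s
shift_net = lambda h: h @ W_t

# Base (source) samples: independent generalized gamma draws per
# coordinate, as in a mean-field variational family.
z = gengamma(a=2.0, c=1.5).rvs(size=(5, 4), random_state=0)

theta, log_det = affine_coupling(z, scale_net, shift_net)
print(theta.shape, log_det.shape)   # (5, 4) (5,)
```

In a full flow, several such layers would be composed (permuting which coordinates pass through unchanged), and the accumulated `log_det` terms would enter the variational objective via the change-of-variables formula.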


