1 code implementation • NeurIPS 2020 • Michael C. Brennan, Daniele Bigoni, Olivier Zahm, Alessio Spantini, Youssef Marzouk
We prove weak convergence of the generated sequence of distributions to the posterior, and we demonstrate the benefits of the framework on challenging inference problems in machine learning and differential equations, using inverse autoregressive flows and polynomial maps as examples of the underlying density estimators.
no code implementations • 17 Mar 2017 • Alessio Spantini, Daniele Bigoni, Youssef Marzouk
In the context of statistics and machine learning, these transformations can be used to couple a tractable "reference" measure (e.g., a standard Gaussian) with a target measure of interest.
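As a minimal sketch of this coupling idea (not code from the paper): in one dimension, the map T = F_target⁻¹ ∘ Φ, where Φ is the standard normal CDF, pushes a standard Gaussian reference forward onto any target with CDF F_target. The Gamma target below is an arbitrary illustrative choice.

```python
import numpy as np
from scipy.stats import norm, gamma

# Reference samples z ~ N(0, 1), a tractable "reference" measure.
rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)

# Transport map T = F_target^{-1} o Phi: Phi(z) is uniform on (0, 1),
# and the target's inverse CDF (ppf) turns uniforms into target samples.
# Target chosen here for illustration: Gamma(shape=3), mean = var = 3.
x = gamma(a=3.0).ppf(norm.cdf(z))

# Empirical moments of the pushforward should match the target's.
print(x.mean(), x.var())
```

In practice the paper's setting replaces this closed-form one-dimensional map with parameterized multivariate maps (e.g., inverse autoregressive flows or polynomial maps) fit from samples, but the coupling principle is the same.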