no code implementations • 1 May 2023 • Akash Srivastava, Seungwook Han, Kai Xu, Benjamin Rhodes, Michael U. Gutmann
We show that if these auxiliary densities are constructed such that they overlap with $p$ and $q$, then a multi-class logistic regression allows for estimating $\log p/q$ on the domain of any of the $K+2$ distributions and resolves the distribution shift problems of the current state-of-the-art methods.
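A minimal sketch of the multi-class idea described above, using toy 1-D Gaussians and a single auxiliary "bridge" density (all densities, feature choices, and names here are illustrative assumptions, not the paper's implementation): fit one softmax classifier over samples from $p$, $q$, and the auxiliary distribution; differences of its logits then estimate $\log p(x)/q(x)$ even where $p$ and $q$ themselves barely overlap.

```python
# Sketch: multi-class logistic regression for density-ratio estimation
# with an auxiliary bridge density. Toy setup (hypothetical, for illustration):
# p and q are well-separated Gaussians; m is a broad auxiliary density
# overlapping both, so the classifier can "route" information between them.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
p = rng.normal(-2.0, 0.5, size=(n, 1))   # p = N(-2, 0.5^2)
q = rng.normal(+2.0, 0.5, size=(n, 1))   # q = N(+2, 0.5^2), little overlap with p
m = rng.normal(0.0, 2.0, size=(n, 1))    # auxiliary bridge density N(0, 2^2)

X = np.vstack([p, q, m])
y = np.repeat([0, 1, 2], n)              # class 0 = p, 1 = q, 2 = auxiliary

# quadratic features so softmax logits can represent Gaussian log-densities
Phi = np.hstack([X, X**2])
clf = LogisticRegression(max_iter=1000).fit(Phi, y)

def log_ratio(x):
    """Estimate log p(x)/q(x) as a difference of class logits."""
    z = clf.decision_function(np.hstack([x, x**2]))
    return z[:, 0] - z[:, 1]

# true log p/q is -16x for this toy pair, so it should be near 0 at x = 0
print(log_ratio(np.array([[0.0]])))
```

Note the design point: the binary classifier $p$-vs-$q$ alone would see almost no overlapping samples, whereas the bridge class gives the softmax a well-estimated path between the two.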
no code implementations • 29 Jul 2022 • Benjamin Rhodes, Michael Gutmann
The recent introduction of gradient-based MCMC for discrete spaces holds great promise, and comes with the tantalising possibility of new discrete counterparts to celebrated continuous methods such as MALA and HMC.
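To illustrate the gradient-based discrete MCMC idea, here is a hedged sketch of a "Gibbs with Gradients"-style sampler on a toy quadratic energy over binary vectors (the energy, constants, and helper names are assumptions for illustration, not this paper's code): the gradient of the continuous relaxation scores every single-bit flip at once, giving an informed proposal that a Metropolis–Hastings step then corrects.

```python
# Sketch: gradient-informed proposals for binary-variable MCMC,
# targeting pi(x) ∝ exp(f(x)) with f quadratic (toy example).
import itertools
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.normal(size=(d, d)); W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(size=d)

def f(x):
    # log of the unnormalised target over x in {0,1}^d
    return 0.5 * x @ W @ x + b @ x

def flip_gains(x):
    # change in f from flipping each bit, read off the gradient of the
    # continuous relaxation; exact here because diag(W) = 0
    return (1.0 - 2.0 * x) * (W @ x + b)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def step(x):
    q_x = softmax(flip_gains(x) / 2.0)   # gradient-informed proposal over bits
    i = rng.choice(d, p=q_x)
    y = x.copy(); y[i] = 1.0 - y[i]
    q_y = softmax(flip_gains(y) / 2.0)
    # Metropolis-Hastings correction for the asymmetric proposal
    if rng.random() < np.exp(f(y) - f(x)) * q_y[i] / q_x[i]:
        return y
    return x

# sanity check: compare chain marginals to exact enumeration (d is tiny)
states = np.array(list(itertools.product([0.0, 1.0], repeat=d)))
probs = np.exp([f(s) for s in states]); probs /= probs.sum()
exact = probs @ states

x, total = np.zeros(d), np.zeros(d)
n_steps, burn = 30_000, 2_000
for t in range(n_steps):
    x = step(x)
    if t >= burn:
        total += x
print(np.abs(total / (n_steps - burn) - exact).max())  # small
```

The key contrast with plain Gibbs is that all $d$ flips are scored in one gradient evaluation rather than $d$ separate energy evaluations.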
no code implementations • 29 Apr 2022 • Michael U. Gutmann, Steven Kleinegesse, Benjamin Rhodes
The likelihood function plays a crucial role in statistical inference and experimental design.
1 code implementation • NeurIPS 2023 • Vaidotas Simkus, Benjamin Rhodes, Michael U. Gutmann
We address this gap by introducing variational Gibbs inference (VGI), a new general-purpose method to estimate the parameters of statistical models from incomplete data.
no code implementations • 29 Sep 2021 • Akash Srivastava, Seungwook Han, Benjamin Rhodes, Kai Xu, Michael U. Gutmann
As such, accurately estimating density ratios using only samples from $p$ and $q$ is of great practical importance and has led to a flurry of recent work in this direction.
1 code implementation • NeurIPS 2020 • Benjamin Rhodes, Kai Xu, Michael U. Gutmann
Density-ratio estimation via classification is a cornerstone of unsupervised learning.
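The classification trick referred to here can be sketched in a few lines (toy 1-D Gaussians, chosen for illustration): train a binary classifier to separate samples from $p$ (label 1) and $q$ (label 0); for equal-sized samples, the Bayes-optimal classifier's logit equals $\log p(x) - \log q(x)$.

```python
# Sketch: density-ratio estimation via binary classification.
# For p = N(1, 1) and q = N(0, 1), the true log-ratio is x - 0.5,
# which a logistic regression on the raw feature can represent exactly.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000
xp = rng.normal(1.0, 1.0, size=(n, 1))   # samples from p
xq = rng.normal(0.0, 1.0, size=(n, 1))   # samples from q

X = np.vstack([xp, xq])
y = np.concatenate([np.ones(n), np.zeros(n)])

clf = LogisticRegression().fit(X, y)

# the logit at x = 0.5 estimates log p(0.5)/q(0.5), which is 0 here
logit = clf.decision_function(np.array([[0.5]]))[0]
print(logit)
```

The fitted slope should be close to 1 and the intercept close to -0.5, matching the analytic log-ratio $x - 0.5$.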
1 code implementation • 18 Oct 2018 • Benjamin Rhodes, Michael Gutmann
The core idea is to use a variational lower bound to the NCE objective function, which can be optimised in the same fashion as the evidence lower bound (ELBO) in standard variational inference (VI).
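For background on the objective being bounded, here is a sketch of plain NCE on a fully observed toy model (this is *not* the paper's variational method, which targets latent-variable models; the model family, noise choice, and optimiser here are illustrative assumptions): data versus known noise is cast as logistic classification, and minimising the classification loss recovers both the model parameters and the normalising constant.

```python
# Sketch: noise-contrastive estimation (NCE) of an unnormalised
# 1-D Gaussian, log phi(x) = -(x - mu)^2 / 2 + c, against noise N(0, 2^2).
# NCE self-normalises: c should converge to the true log-normaliser.
import numpy as np
from scipy.optimize import minimize
from scipy.special import log_expit  # numerically stable log-sigmoid

rng = np.random.default_rng(0)
n = 20_000
xd = rng.normal(1.0, 1.0, size=n)    # data from N(1, 1)
xn = rng.normal(0.0, 2.0, size=n)    # noise from N(0, 2^2), known density

def log_noise(x):
    return -0.5 * np.log(2 * np.pi * 4.0) - x**2 / 8.0

def nce_loss(theta):
    mu, c = theta
    # classification logit: log phi(x; theta) - log p_noise(x)
    G = lambda x: (-(x - mu)**2 / 2.0 + c) - log_noise(x)
    # logistic loss for "data vs noise" with equal sample sizes
    return -(log_expit(G(xd)).mean() + log_expit(-G(xn)).mean())

res = minimize(nce_loss, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, c_hat = res.x
print(mu_hat, c_hat)  # mu_hat should approach 1, c_hat -> -0.5*log(2*pi)
```

The paper's contribution, per the excerpt above, is a variational lower bound on this kind of objective so it remains tractable when the model has latent variables, optimised in the same style as an ELBO.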