no code implementations • 4 Aug 2022 • Afonso Eduardo, Michael U. Gutmann
Bayesian Optimization is a methodology for global optimization of unknown and expensive objectives.
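To make the setting concrete, here is a minimal sketch of a generic Bayesian optimization loop, not the method of this paper: a Gaussian-process surrogate is refit after each evaluation, and the next query point maximizes expected improvement over a candidate grid. The objective `f` and all settings are hypothetical stand-ins for an expensive black-box function.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.1 * x**2       # hypothetical expensive objective (to minimize)

X = rng.uniform(-3, 3, size=(4, 1))            # a few initial evaluations
y = f(X).ravel()
candidates = np.linspace(-3, 3, 500).reshape(-1, 1)

for _ in range(15):
    # refit the GP surrogate to all evaluations made so far
    gp = GaussianProcessRegressor(kernel=RBF(), alpha=1e-6, normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # expected improvement (for minimization) over the candidate grid
    z = (y.min() - mu) / np.maximum(sigma, 1e-9)
    ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = candidates[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best f(x):", y.min())
```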
1 code implementation • 27 Jun 2022 • Michael U. Gutmann
This is a collection of (mostly) pen-and-paper exercises in machine learning.
no code implementations • 29 Apr 2022 • Michael U. Gutmann, Steven Kleinegesse, Benjamin Rhodes
The likelihood function plays a crucial role in statistical inference and experimental design.
1 code implementation • 25 Nov 2021 • Vaidotas Simkus, Benjamin Rhodes, Michael U. Gutmann
We address this gap by introducing variational Gibbs inference (VGI), a new general-purpose method to estimate the parameters of statistical models from incomplete data.
1 code implementation • NeurIPS 2021 • Desi R. Ivanova, Adam Foster, Steven Kleinegesse, Michael U. Gutmann, Tom Rainforth
We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models.
no code implementations • NeurIPS Workshop AI4Science 2021 • Simon Valentin, Steven Kleinegesse, Neil R. Bramley, Michael U. Gutmann, Christopher G. Lucas
Bayesian optimal experimental design (BOED) is a methodology to identify experiments that are expected to yield informative data.
no code implementations • 29 Sep 2021 • Akash Srivastava, Seungwook Han, Benjamin Rhodes, Kai Xu, Michael U. Gutmann
As such, estimating density ratios accurately using only samples from $p$ and $q$ is of great practical importance and has spurred a flurry of recent work in this direction.
1 code implementation • 10 May 2021 • Steven Kleinegesse, Michael U. Gutmann
We introduce a framework for Bayesian experimental design (BED) with implicit models, where the data-generating distribution is intractable but sampling from it is still possible.
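As a hedged illustration of the general idea only, not this paper's estimator, the sketch below scores candidate designs for a made-up implicit model by estimating the mutual information between parameters and data with the density-ratio trick: a classifier separates joint samples $(\theta, y)$ from shuffled pairs, and the average of its logits over joint samples gives a Monte Carlo estimate of the MI.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def simulate(theta, d):
    # toy "implicit" model: only forward sampling is assumed available
    return theta * d + rng.normal(0, 1, size=theta.shape)

def feats(t, y):
    # quadratic features so a linear classifier can represent the Gaussian log-ratio
    return np.column_stack([t, y, t * y, t**2, y**2])

def mi_estimate(d, n=5000):
    theta = rng.normal(0, 1, size=n)
    y = simulate(theta, d)
    joint = feats(theta, y)
    marg = feats(theta, rng.permutation(y))      # shuffling y breaks the dependence
    X = np.vstack([joint, marg])
    labels = np.concatenate([np.ones(n), np.zeros(n)])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    # with balanced classes, the logit approximates log p(theta, y) / p(theta)p(y)
    return clf.decision_function(joint).mean()   # MI estimate in nats

designs = [0.1, 0.5, 1.0, 2.0]
scores = [mi_estimate(d) for d in designs]
for d, s in zip(designs, scores):
    print(f"design {d}: estimated MI ≈ {s:.3f}")
print("best design:", designs[int(np.argmax(scores))])
```

For this toy model the true MI is $\tfrac{1}{2}\log(1 + d^2)$, so the largest design should win, which the estimates reproduce.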
no code implementations • ICLR 2021 • Yanzhi Chen, Dinghuai Zhang, Michael U. Gutmann, Aaron Courville, Zhanxing Zhu
We consider the fundamental problem of how to automatically construct summary statistics for likelihood-free inference, where evaluating the likelihood function is intractable but sampling or simulating data from the model is possible.
1 code implementation • NeurIPS 2020 • Benjamin Rhodes, Kai Xu, Michael U. Gutmann
Density-ratio estimation via classification is a cornerstone of unsupervised learning.
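For background, here is a minimal sketch of that cornerstone: a probabilistic classifier trained to separate samples from $p$ and $q$ recovers the log density ratio from its logits (with balanced classes). The Gaussian toy densities are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
xp = rng.normal(1.0, 1.0, size=(5000, 1))    # samples from p = N(1, 1)
xq = rng.normal(0.0, 1.0, size=(5000, 1))    # samples from q = N(0, 1)

X = np.vstack([xp, xq])
y = np.concatenate([np.ones(len(xp)), np.zeros(len(xq))])
feats = np.column_stack([X, X**2])            # quadratic features: exact for Gaussians
clf = LogisticRegression(max_iter=1000).fit(feats, y)

x = np.array([[0.5]])
log_ratio = clf.decision_function(np.column_stack([x, x**2]))
print("estimated log p/q at 0.5:", log_ratio[0])
# true log ratio is x - 0.5, i.e. exactly 0 at x = 0.5
print("true log p/q at 0.5:", 0.5 - 0.5)
```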
1 code implementation • 20 Mar 2020 • Steven Kleinegesse, Christopher Drovandi, Michael U. Gutmann
We address this gap in the literature by devising a sequential design framework for parameter estimation that uses the mutual information (MI) between model parameters and simulated data as a utility function for finding optimal experimental designs, which had not previously been done for implicit models.
1 code implementation • ICML 2020 • Steven Kleinegesse, Michael U. Gutmann
A fundamental question is how to design the experiments so that the collected data are most useful.
2 code implementations • 1 Apr 2019 • Borislav Ikonomov, Michael U. Gutmann
We demonstrate the effectiveness of the proposed Robust Optimisation Monte Carlo (OMC) on toy examples and on tasks in inverse graphics, where we perform Bayesian inference with a complex image renderer.
no code implementations • 27 Feb 2019 • Yanzhi Chen, Michael U. Gutmann
Approximate Bayesian computation (ABC) is a set of techniques for Bayesian inference when the likelihood is intractable but sampling from the model is possible.
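A minimal rejection-ABC sketch on a made-up toy problem (inferring a Gaussian mean from its sample mean) illustrates the idea: draw parameters from the prior, simulate, and keep the draws whose simulated summary lands within $\epsilon$ of the observed one.

```python
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.normal(1.5, 1.0, size=100)    # "observed" data, true mean 1.5
s_obs = y_obs.mean()                      # summary statistic: the sample mean

n, eps = 100_000, 0.05
theta = rng.normal(0, 2, size=n)          # draws from the prior N(0, 4)
# shortcut: the sample mean of 100 unit-variance draws is N(theta, 1/100),
# so we simulate the summary statistic directly instead of full datasets
s_sim = rng.normal(theta, 1.0 / np.sqrt(100))
posterior = theta[np.abs(s_sim - s_obs) < eps]

print(f"accepted {len(posterior)} of {n}; posterior mean ≈ {posterior.mean():.2f}")
```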
no code implementations • 23 Oct 2018 • Traiko Dinev, Michael U. Gutmann
While several methods for choosing summary statistics have been proposed for ABC, the literature on synthetic likelihood and LFIRE (likelihood-free inference by ratio estimation) is very thin to date.
no code implementations • ICML 2018 • Ciwan Ceylan, Michael U. Gutmann
Examples of unnormalised models are Gibbs distributions, Markov random fields, and neural network models in unsupervised deep learning.
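For background, the sketch below shows plain noise-contrastive estimation (NCE), a standard way of fitting such unnormalised models via logistic regression against noise samples; it is a generic illustration, not necessarily the method proposed in this paper. The toy model, a Gaussian with unknown precision and unknown log-normaliser, is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)       # data from N(0, 1)
noise = rng.normal(0.0, 2.0, size=2000)   # noise from N(0, 4), density known

def log_noise(u):
    return -0.125 * u**2 - 0.5 * np.log(8 * np.pi)   # log N(u; 0, 4)

def nce_loss(params):
    lam, c = params                                   # precision and log-normaliser
    log_model = lambda u: -0.5 * lam * u**2 + c       # unnormalised log-density plus c
    g_data = log_model(x) - log_noise(x)              # log-ratio on data samples
    g_noise = log_model(noise) - log_noise(noise)     # log-ratio on noise samples
    # logistic loss for classifying data vs noise, written overflow-safely
    return np.mean(np.logaddexp(0, -g_data)) + np.mean(np.logaddexp(0, g_noise))

est = minimize(nce_loss, x0=[0.5, -1.0], method="Nelder-Mead")
lam, c = est.x
print(f"precision ≈ {lam:.2f} (true 1.00), log-normaliser ≈ {c:.2f} "
      f"(true {-0.5 * np.log(2 * np.pi):.2f})")
```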
no code implementations • ICLR 2020 • Akash Srivastava, Kai Xu, Michael U. Gutmann, Charles Sutton
In this work, we take their insight of using kernels as fixed adversaries further and present a novel method for training deep generative models that does not involve saddlepoint optimization.
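As a hedged illustration of a fixed-kernel adversary, the sketch below computes the unbiased squared maximum mean discrepancy (MMD) between two sample sets with an RBF kernel; a generator could be trained by minimizing such a quantity, with no saddlepoint optimization involved. This is generic background, not the paper's exact objective.

```python
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Unbiased estimate of squared MMD between sample sets x and y (RBF kernel)."""
    def k(a, b):
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d / (2 * bandwidth**2))
    kxx, kyy, kxy = k(x, x), k(y, y), k(x, y)
    n, m = len(x), len(y)
    return ((kxx.sum() - np.trace(kxx)) / (n * (n - 1))
            + (kyy.sum() - np.trace(kyy)) / (m * (m - 1))
            - 2 * kxy.mean())

rng = np.random.default_rng(0)
real = rng.normal(0, 1, size=(500, 2))
fake_bad = rng.normal(2, 1, size=(500, 2))    # mismatched "generator" samples
fake_good = rng.normal(0, 1, size=(500, 2))   # well-matched samples
print("MMD^2 vs shifted samples:", mmd2(real, fake_bad))
print("MMD^2 vs matched samples:", mmd2(real, fake_good))
```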
2 code implementations • 2 Aug 2017 • Jarno Lintusaari, Henri Vuollekoski, Antti Kangasrääsiö, Kusti Skytén, Marko Järvenpää, Pekka Marttinen, Michael U. Gutmann, Aki Vehtari, Jukka Corander, Samuel Kaski
The stand-alone ELFI graph can be used with any of the available inference methods without modifications.
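A minimal usage sketch in the spirit of the ELFI quickstart is shown below; the toy Gaussian model and all numbers are made up, and exact signatures may differ across ELFI versions.

```python
import numpy as np
import scipy.stats as ss
import elfi

def simulator(mu, batch_size=1, random_state=None):
    # vectorized over the batch, as ELFI expects
    mu = np.atleast_1d(mu)
    return ss.norm.rvs(mu[:, None], 1, size=(batch_size, 50), random_state=random_state)

def mean(y):
    return np.mean(y, axis=1)

y_obs = ss.norm.rvs(1.5, 1, size=(1, 50), random_state=np.random.RandomState(0))

mu = elfi.Prior('uniform', -2, 4)            # prior over the unknown mean
sim = elfi.Simulator(simulator, mu, observed=y_obs)
S = elfi.Summary(mean, sim)
d = elfi.Distance('euclidean', S)

rej = elfi.Rejection(d, batch_size=1000, seed=0)
result = rej.sample(500, quantile=0.01)      # keep the best 1% of simulations
print(result)
```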
1 code implementation • NeurIPS 2017 • Akash Srivastava, Lazar Valkov, Chris Russell, Michael U. Gutmann, Charles Sutton
Deep generative models provide powerful tools for learning distributions over complicated manifolds, such as those of natural images.
no code implementations • 3 Apr 2017 • Marko Järvenpää, Michael U. Gutmann, Arijus Pleska, Aki Vehtari, Pekka Marttinen
We propose to compute the uncertainty in the ABC posterior density that is due to the limited number of simulations available for estimating it accurately, and we define a loss function that measures this uncertainty.
1 code implementation • 30 Nov 2016 • Owen Thomas, Ritabrata Dutta, Jukka Corander, Samuel Kaski, Michael U. Gutmann
The popular synthetic likelihood approach infers the parameters by modelling summary statistics of the data with a Gaussian probability distribution.
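A minimal sketch of that recipe, with a hypothetical simulator and hand-picked summaries: simulate repeatedly at a candidate $\theta$, fit a Gaussian to the resulting summary statistics, and evaluate its density at the observed summaries.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

def summaries(y):
    return np.array([y.mean(), y.std()])      # hand-picked summary statistics

def simulate(theta, n=100):
    return rng.normal(theta, 1.0, size=n)     # hypothetical simulator

y_obs = rng.normal(1.2, 1.0, size=100)
s_obs = summaries(y_obs)

def synthetic_loglik(theta, n_sims=200):
    # fit a Gaussian to the summaries of repeated simulations at theta
    S = np.array([summaries(simulate(theta)) for _ in range(n_sims)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal(mu, cov).logpdf(s_obs)

grid = np.linspace(0, 2.5, 26)
ll = [synthetic_loglik(t) for t in grid]
print("synthetic-likelihood MLE ≈", grid[int(np.argmax(ll))])
```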
no code implementations • 18 Jun 2015 • Hiroaki Sasaki, Michael U. Gutmann, Hayaru Shouno, Aapo Hyvärinen
The precision matrix of the linear components is assumed to be randomly generated by a higher-order process and explicitly parametrized by a parameter matrix.
no code implementations • 19 Feb 2015 • Michael U. Gutmann, Jukka Corander, Ritabrata Dutta, Samuel Kaski
This approach faces at least two major difficulties: the first is the choice of the discrepancy measure used to judge whether the simulated data resemble the observed data.
no code implementations • 14 Jan 2015 • Michael U. Gutmann, Jukka Corander
The strategy is implemented using Bayesian optimization and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.
no code implementations • 18 Jul 2014 • Michael U. Gutmann, Ritabrata Dutta, Samuel Kaski, Jukka Corander
Increasingly complex generative models are being used across disciplines because they allow for a realistic characterization of data. A common difficulty with them, however, is the prohibitively large computational cost of evaluating the likelihood function, and thus of performing likelihood-based statistical inference.
no code implementations • 25 Apr 2013 • Song Liu, John A. Quinn, Michael U. Gutmann, Taiji Suzuki, Masashi Sugiyama
We propose a new method for detecting changes in Markov network structure between two sets of samples.