Search Results for author: Michael U. Gutmann

Found 26 papers, 11 papers with code

Bayesian Optimization with Informative Covariance

no code implementations • 4 Aug 2022 • Afonso Eduardo, Michael U. Gutmann

Bayesian Optimization is a methodology for the global optimization of objectives that are unknown and expensive to evaluate.

Gaussian Processes • Regression

Pen and Paper Exercises in Machine Learning

1 code implementation • 27 Jun 2022 • Michael U. Gutmann

This is a collection of (mostly) pen-and-paper exercises in machine learning.

BIG-bench Machine Learning

Variational Gibbs inference for statistical model estimation from incomplete data

1 code implementation • 25 Nov 2021 • Vaidotas Simkus, Benjamin Rhodes, Michael U. Gutmann

We address this gap by introducing variational Gibbs inference (VGI), a new general-purpose method to estimate the parameters of statistical models from incomplete data.

BIG-bench Machine Learning • Normalising Flows +1

Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods

1 code implementation • NeurIPS 2021 • Desi R. Ivanova, Adam Foster, Steven Kleinegesse, Michael U. Gutmann, Tom Rainforth

We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models.

Experimental Design

Scaling Densities For Improved Density Ratio Estimation

no code implementations • 29 Sep 2021 • Akash Srivastava, Seungwook Han, Benjamin Rhodes, Kai Xu, Michael U. Gutmann

Estimating density ratios accurately using only samples from $p$ and $q$ is therefore of high significance and has led to a flurry of recent work in this direction.

Density Ratio Estimation
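One standard route to sample-based density ratio estimation is the classifier trick: train a probabilistic classifier to distinguish samples from $p$ (label 1) and $q$ (label 0); the fitted logit then approximates $\log p(x)/q(x)$. The sketch below is a generic illustration of that trick, not the method of this paper, on 1-D Gaussians where the true log-ratio is $x - 0.5$:

```python
import math
import random

rng = random.Random(0)

# Samples from p = N(1, 1) and q = N(0, 1); the true log-ratio is
# log p(x)/q(x) = x - 0.5, so the logit should recover w ~ 1, b ~ -0.5.
n = 2000
labelled = [(rng.gauss(1.0, 1.0), 1.0) for _ in range(n)] + \
           [(rng.gauss(0.0, 1.0), 0.0) for _ in range(n)]

# Logistic regression f(x) = w*x + b trained by full-batch gradient ascent
# on the log-likelihood of the labels.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(300):
    gw = gb = 0.0
    for x, y in labelled:
        pred = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (y - pred) * x
        gb += y - pred
    w += lr * gw / (2 * n)
    b += lr * gb / (2 * n)

def log_ratio(x):
    """Estimated log p(x)/q(x) via the classifier logit."""
    return w * x + b
```

With enough samples the learned weights approach the analytic values; the same recipe carries over to high-dimensional $x$ with a more flexible classifier.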

Gradient-based Bayesian Experimental Design for Implicit Models using Mutual Information Lower Bounds

1 code implementation • 10 May 2021 • Steven Kleinegesse, Michael U. Gutmann

We introduce a framework for Bayesian experimental design (BED) with implicit models, where the data-generating distribution is intractable but sampling from it is still possible.

Epidemiology • Experimental Design

Neural Approximate Sufficient Statistics for Likelihood-free Inference

no code implementations • ICLR 2021 • Yanzhi Chen, Dinghuai Zhang, Michael U. Gutmann, Aaron Courville, Zhanxing Zhu

We consider the fundamental problem of how to automatically construct summary statistics for likelihood-free inference, where evaluating the likelihood function is intractable but sampling or simulating data from the model is possible.

Sequential Bayesian Experimental Design for Implicit Models via Mutual Information

1 code implementation • 20 Mar 2020 • Steven Kleinegesse, Christopher Drovandi, Michael U. Gutmann

We address this gap in the literature by devising a novel sequential design framework for parameter estimation that uses the mutual information (MI) between model parameters and simulated data as a utility function to find optimal experimental designs; this had not previously been done for implicit models.

Bayesian Optimisation • Decision Making +2
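The MI utility can be illustrated on a tractable stand-in. The sketch below uses a hypothetical linear-Gaussian toy simulator $y = d\theta + \varepsilon$ (not the implicit models treated in the paper), estimates $I(\theta; y \mid d)$ by nested Monte Carlo, and picks the design with the largest estimate; the closed form here is $\tfrac{1}{2}\log(1 + d^2)$, so larger $|d|$ should win:

```python
import math
import random

rng = random.Random(1)

def lik(y, theta, d):
    # Gaussian likelihood of y under y = d*theta + eps, eps ~ N(0, 1)
    return math.exp(-0.5 * (y - d * theta) ** 2) / math.sqrt(2 * math.pi)

def mi_estimate(d, n_outer=200, n_inner=200):
    # Nested Monte Carlo estimate of I(theta; y | d): average of
    # log p(y|theta,d) - log p(y|d), with the marginal p(y|d)
    # approximated by an inner average over fresh prior draws.
    inner = [rng.gauss(0.0, 1.0) for _ in range(n_inner)]
    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)          # prior draw
        y = d * theta + rng.gauss(0.0, 1.0)  # simulated observation
        marginal = sum(lik(y, t, d) for t in inner) / n_inner
        total += math.log(lik(y, theta, d)) - math.log(marginal)
    return total / n_outer

# Pick the design with the largest estimated MI.
designs = [0.1, 0.5, 2.0]
best = max(designs, key=mi_estimate)
```

For implicit models the likelihood inside the estimator is unavailable, which is exactly why the papers above resort to likelihood-free MI approximations.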

Robust Optimisation Monte Carlo

2 code implementations • 1 Apr 2019 • Borislav Ikonomov, Michael U. Gutmann

We demonstrate the effectiveness of the proposed Robust OMC on toy examples and tasks in inverse-graphics where we perform Bayesian inference with a complex image renderer.

Bayesian Inference

Adaptive Gaussian Copula ABC

no code implementations • 27 Feb 2019 • Yanzhi Chen, Michael U. Gutmann

Approximate Bayesian computation (ABC) is a set of techniques for Bayesian inference when the likelihood is intractable but sampling from the model is possible.

Bayesian Inference
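For reference, the plain rejection-ABC algorithm that this line of work refines can be sketched in a few lines. This is the textbook version on a hypothetical Gaussian toy model, not the adaptive copula method of the paper:

```python
import random

rng = random.Random(2)

# Toy model: data y_i ~ N(theta, 1) with true theta = 3,
# summary statistic s = sample mean, prior theta ~ Uniform(-10, 10).
y_obs = [rng.gauss(3.0, 1.0) for _ in range(50)]
s_obs = sum(y_obs) / len(y_obs)

accepted, eps = [], 0.2
for _ in range(20000):
    theta = rng.uniform(-10.0, 10.0)              # candidate from the prior
    sim = [rng.gauss(theta, 1.0) for _ in range(50)]
    s_sim = sum(sim) / len(sim)
    if abs(s_sim - s_obs) < eps:                  # keep theta when summaries agree
        accepted.append(theta)

post_mean = sum(accepted) / len(accepted)         # approximate posterior mean
```

The accepted parameters form samples from an approximate posterior; methods like adaptive Gaussian copula ABC aim to get sharper posteriors from far fewer simulations.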

Dynamic Likelihood-free Inference via Ratio Estimation (DIRE)

no code implementations • 23 Oct 2018 • Traiko Dinev, Michael U. Gutmann

While several methods for choosing summary statistics have been proposed for ABC, the literature for synthetic likelihood and LFIRE is very thin to date.

Bayesian Inference • Time Series Analysis

Conditional Noise-Contrastive Estimation of Unnormalised Models

no code implementations • ICML 2018 • Ciwan Ceylan, Michael U. Gutmann

Examples of unnormalised models are Gibbs distributions, Markov random fields, and neural network models in unsupervised deep learning.

Density Estimation
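Noise-contrastive estimation, which this paper builds on, fits an unnormalised model by logistic classification of data against samples from a known noise distribution, treating the log normalising constant as a free parameter. A sketch on a toy unnormalised Gaussian follows (illustrative of plain NCE, not the conditional variant proposed in the paper):

```python
import math
import random

rng = random.Random(3)

# Data from N(0, 1); unnormalised model log phi(x) = -0.5*lam*x^2 + c,
# where c stands in for the unknown log normalising constant.
n = 1500
data = [rng.gauss(0.0, 1.0) for _ in range(n)]
noise = [rng.gauss(0.0, 2.0) for _ in range(n)]   # noise with known density

def log_noise(x):
    # log density of N(0, 4)
    return -0.5 * (x / 2.0) ** 2 - math.log(2.0 * math.sqrt(2.0 * math.pi))

# Gradient ascent on the NCE (logistic) objective over (lam, c);
# the optimum is lam = 1, c = log(1/sqrt(2*pi)).
lam, c, lr = 0.5, 0.0, 0.1
for _ in range(300):
    glam = gc = 0.0
    for x, y in [(x, 1.0) for x in data] + [(x, 0.0) for x in noise]:
        G = (-0.5 * lam * x * x + c) - log_noise(x)   # log phi - log p_noise
        pred = 1.0 / (1.0 + math.exp(-G))
        glam += (y - pred) * (-0.5 * x * x)
        gc += y - pred
    lam += lr * glam / (2 * n)
    c += lr * gc / (2 * n)
```

Because the classifier sees the log-normaliser only through `c`, NCE estimates it jointly with the model parameters instead of computing an intractable integral.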

Generative Ratio Matching Networks

no code implementations • ICLR 2020 • Akash Srivastava, Kai Xu, Michael U. Gutmann, Charles Sutton

In this work, we take their insight of using kernels as fixed adversaries further and present a novel method for training deep generative models that does not involve saddlepoint optimization.

VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning

1 code implementation • NeurIPS 2017 • Akash Srivastava, Lazar Valkov, Chris Russell, Michael U. Gutmann, Charles Sutton

Deep generative models provide powerful tools for modelling distributions over complicated manifolds, such as those of natural images.

Efficient acquisition rules for model-based approximate Bayesian computation

no code implementations • 3 Apr 2017 • Marko Järvenpää, Michael U. Gutmann, Arijus Pleska, Aki Vehtari, Pekka Marttinen

We propose to compute the uncertainty in the ABC posterior density, which arises from having too few simulations to estimate this quantity accurately, and define a loss function that measures this uncertainty.

Bayesian Inference • Bayesian Optimisation +1

Likelihood-free inference by ratio estimation

1 code implementation • 30 Nov 2016 • Owen Thomas, Ritabrata Dutta, Jukka Corander, Samuel Kaski, Michael U. Gutmann

The popular synthetic likelihood approach infers the parameters by modelling summary statistics of the data by a Gaussian probability distribution.
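The synthetic likelihood idea mentioned here can be sketched directly: simulate many summary statistics at a candidate parameter, fit a Gaussian to them, and score the observed summary under that Gaussian. The toy simulator and summary below are chosen purely for illustration:

```python
import math
import random

rng = random.Random(4)

def summary(theta, n=30):
    # Toy simulator: n draws from N(theta, 1); summary = sample mean
    return sum(rng.gauss(theta, 1.0) for _ in range(n)) / n

s_obs = summary(2.0)    # "observed" summary, generated at true theta = 2

def synthetic_loglik(theta, m=200):
    # Fit a Gaussian to m simulated summaries and score s_obs under it
    sims = [summary(theta) for _ in range(m)]
    mu = sum(sims) / m
    var = sum((s - mu) ** 2 for s in sims) / (m - 1)
    return -0.5 * math.log(2 * math.pi * var) - 0.5 * (s_obs - mu) ** 2 / var

# A crude grid search over the parameter recovers the data-generating value.
grid = [0.0, 1.0, 2.0, 3.0, 4.0]
best = max(grid, key=synthetic_loglik)
```

Ratio-estimation approaches such as LFIRE avoid the Gaussianity assumption on the summaries that this construction bakes in.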

Simultaneous Estimation of Non-Gaussian Components and their Correlation Structure

no code implementations • 18 Jun 2015 • Hiroaki Sasaki, Michael U. Gutmann, Hayaru Shouno, Aapo Hyvärinen

The precision matrix of the linear components is assumed to be randomly generated by a higher-order process and explicitly parametrized by a parameter matrix.

Classification and Bayesian Optimization for Likelihood-Free Inference

no code implementations • 19 Feb 2015 • Michael U. Gutmann, Jukka Corander, Ritabrata Dutta, Samuel Kaski

This approach faces at least two major difficulties: the first is the choice of the discrepancy measure, which is used to judge whether the simulated data resemble the observed data.

Classification • General Classification

Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models

no code implementations • 14 Jan 2015 • Michael U. Gutmann, Jukka Corander

The strategy is implemented using Bayesian optimization and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.

Likelihood-free inference via classification

no code implementations • 18 Jul 2014 • Michael U. Gutmann, Ritabrata Dutta, Samuel Kaski, Jukka Corander

Increasingly complex generative models are being used across disciplines because they allow for realistic characterization of data. A common difficulty with them, however, is the prohibitively large computational cost of evaluating the likelihood function and thus of performing likelihood-based statistical inference.

Bayesian Inference • Classification +1
