Search Results for author: Michael U. Gutmann

Found 30 papers, 13 papers with code

Pen and Paper Exercises in Machine Learning

1 code implementation • 27 Jun 2022 • Michael U. Gutmann

This is a collection of (mostly) pen-and-paper exercises in machine learning.

BIG-bench Machine Learning

Robust Optimisation Monte Carlo

2 code implementations • 1 Apr 2019 • Borislav Ikonomov, Michael U. Gutmann

We demonstrate the effectiveness of the proposed Robust OMC on toy examples and on tasks in inverse graphics, where we perform Bayesian inference with a complex image renderer.

Bayesian Inference

Implicit Deep Adaptive Design: Policy-Based Experimental Design without Likelihoods

1 code implementation • NeurIPS 2021 • Desi R. Ivanova, Adam Foster, Steven Kleinegesse, Michael U. Gutmann, Tom Rainforth

We introduce implicit Deep Adaptive Design (iDAD), a new method for performing adaptive experiments in real-time with implicit models.

Experimental Design
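
For readers who want the gist of the policy-based approach, here is a minimal PyTorch sketch (hypothetical architecture and shapes; the paper's actual network and training objective differ in detail). The point it illustrates: once trained, the policy proposes the next design with a single forward pass over the experiment history, which is what makes real-time adaptive experiments possible.

```python
import torch
import torch.nn as nn

class DesignPolicy(nn.Module):
    """Map a history of (design, outcome) pairs to the next design.

    Hypothetical sketch: iDAD trains such a policy offline against a
    mutual-information lower bound; only the forward pass is shown here.
    """

    def __init__(self, design_dim=1, outcome_dim=1, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(  # encodes one (design, outcome) pair
            nn.Linear(design_dim + outcome_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        self.head = nn.Sequential(  # maps the pooled history to the next design
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, design_dim),
        )

    def forward(self, designs, outcomes):
        # designs: (T, design_dim), outcomes: (T, outcome_dim)
        if designs.shape[0] == 0:  # no history yet: pool over nothing
            pooled = torch.zeros(self.head[0].in_features)
        else:  # permutation-invariant sum pooling over the history
            pooled = self.encoder(torch.cat([designs, outcomes], dim=-1)).sum(0)
        return self.head(pooled)  # next design from a single forward pass

policy = DesignPolicy()
next_design = policy(torch.randn(3, 1), torch.randn(3, 1))
```

In iDAD the policy is trained offline by maximising a likelihood-free lower bound on the expected information gain; the sketch omits that training loop.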

Gradient-based Bayesian Experimental Design for Implicit Models using Mutual Information Lower Bounds

1 code implementation • 10 May 2021 • Steven Kleinegesse, Michael U. Gutmann

We introduce a framework for Bayesian experimental design (BED) with implicit models, where the data-generating distribution is intractable but sampling from it is still possible.

Epidemiology Experimental Design
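
A minimal sketch of the general recipe behind gradient-based BED with implicit models (hypothetical simulator and critic; an InfoNCE-style bound stands in here for the specific MI lower bounds studied in the paper): simulate data at the current design, estimate a variational lower bound on the mutual information between parameters and data with a critic network, and ascend its gradient with respect to both the critic and the design.

```python
import math
import torch
import torch.nn as nn

def simulator(theta, design):
    # Hypothetical differentiable simulator: y = theta * design + noise.
    return theta * design + 0.1 * torch.randn_like(theta)

critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
design = torch.tensor([0.5], requires_grad=True)
opt = torch.optim.Adam(list(critic.parameters()) + [design], lr=1e-2)

for step in range(1000):
    theta = torch.randn(128, 1)              # draw parameters from the prior
    y = simulator(theta, design)             # simulate outcomes at the current design
    N = theta.shape[0]
    ti = theta.unsqueeze(1).expand(N, N, 1)  # pair every theta_i with every y_j
    yj = y.unsqueeze(0).expand(N, N, 1)
    scores = critic(torch.cat([ti, yj], dim=-1)).squeeze(-1)
    # InfoNCE lower bound on I(theta; y | design); matched pairs lie on the diagonal.
    mi_bound = scores.diag().mean() - torch.logsumexp(scores, dim=1).mean() + math.log(N)
    (-mi_bound).backward()                   # ascend the bound in critic and design
    opt.step(); opt.zero_grad()
```

Because the simulator is written with reparameterised noise, the MI estimate is differentiable in the design, so design optimisation and bound tightening happen in one loop.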

Sequential Bayesian Experimental Design for Implicit Models via Mutual Information

1 code implementation • 20 Mar 2020 • Steven Kleinegesse, Christopher Drovandi, Michael U. Gutmann

We address this gap in the literature by devising a novel sequential design framework for parameter estimation. It uses the mutual information (MI) between model parameters and simulated data as a utility function to find optimal experimental designs, which had not previously been done for implicit models.

Bayesian Optimisation Decision Making +2

Likelihood-free inference by ratio estimation

1 code implementation • 30 Nov 2016 • Owen Thomas, Ritabrata Dutta, Jukka Corander, Samuel Kaski, Michael U. Gutmann

The popular synthetic likelihood approach infers the parameters by modelling summary statistics of the data with a Gaussian probability distribution.
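
To make the contrast concrete, here is a minimal numpy sketch of the synthetic likelihood baseline the snippet describes (hypothetical simulator and summary statistics): simulate repeatedly at a candidate parameter, fit a Gaussian to the simulated summaries, and score the observed summaries under it. LFIRE instead estimates the likelihood-to-marginal ratio directly via classification, avoiding the Gaussian assumption.

```python
import numpy as np
from scipy.stats import multivariate_normal

def simulator(theta, rng, n=100):
    # Hypothetical simulator: n draws from N(theta, 1).
    return rng.normal(theta, 1.0, size=n)

def summaries(x):
    # Hypothetical summary statistics: sample mean and log-variance.
    return np.array([x.mean(), np.log(x.var())])

def synthetic_loglik(theta, s_obs, rng, n_sim=200):
    # Fit a Gaussian to summaries of repeated simulations at theta,
    # then evaluate the observed summaries under that Gaussian.
    S = np.stack([summaries(simulator(theta, rng)) for _ in range(n_sim)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov)

rng = np.random.default_rng(0)
s_obs = summaries(simulator(1.5, rng))        # pretend this is the observed data
grid = np.linspace(0.0, 3.0, 31)
ll = [synthetic_loglik(t, s_obs, rng) for t in grid]
print("synthetic-likelihood estimate ~", grid[int(np.argmax(ll))])
```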

Designing Optimal Behavioral Experiments Using Machine Learning

1 code implementation • 12 May 2023 • Simon Valentin, Steven Kleinegesse, Neil R. Bramley, Peggy Seriès, Michael U. Gutmann, Christopher G. Lucas

Compared to experimental designs commonly used in the literature, we show that our optimal designs more efficiently determine which of a set of models best accounts for individual human behavior, and more efficiently characterize behavior given a preferred model.

Decision Making Experimental Design

Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling

1 code implementation • 17 Aug 2023 • Vaidotas Simkus, Michael U. Gutmann

Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing data imputation, but is computationally intractable.

Imputation
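
For orientation, a minimal sketch of the pseudo-Gibbs baseline that this line of work analyses (the encode/decode interfaces are hypothetical, and this is the standard baseline, not the paper's proposed sampler): alternately encode the current imputation and decode a fresh one, overwriting only the missing entries.

```python
import torch

@torch.no_grad()
def pseudo_gibbs_impute(encode, decode, x, mask, n_iters=100):
    """Approximate conditional sampling from a trained VAE.

    encode(x) -> (mu, std) of q(z | x); decode(z) -> mean of p(x | z).
    mask is 1 where x is observed, 0 where it is missing.
    Hypothetical interfaces; a sketch of the pseudo-Gibbs baseline.
    """
    x = torch.where(mask.bool(), x, torch.zeros_like(x))  # initialise missing entries
    for _ in range(n_iters):
        mu, std = encode(x)
        z = mu + std * torch.randn_like(std)    # sample z ~ q(z | x)
        x_hat = decode(z)                        # reconstruct x from z
        x = torch.where(mask.bool(), x, x_hat)   # keep observed, update missing
    return x
```

The paper studies samplers of this iterated kind and how to make the resulting conditional samples more accurate.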

Variational Gibbs Inference for Statistical Model Estimation from Incomplete Data

1 code implementation • NeurIPS 2023 • Vaidotas Simkus, Benjamin Rhodes, Michael U. Gutmann

We address this gap by introducing variational Gibbs inference (VGI), a new general-purpose method to estimate the parameters of statistical models from incomplete data.

BIG-bench Machine Learning Normalising Flows +1

Conditional Noise-Contrastive Estimation of Unnormalised Models

no code implementations • ICML 2018 • Ciwan Ceylan, Michael U. Gutmann

Examples of unnormalised models are Gibbs distributions, Markov random fields, and neural network models in unsupervised deep learning.

Density Estimation Open-Ended Question Answering
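
For background, a minimal sketch of plain (unconditional) noise-contrastive estimation, which the paper generalises (hypothetical one-dimensional unnormalised model): fit the model, including its normalising constant, by logistic regression between the data and samples from a noise distribution with known density.

```python
import torch
import torch.nn.functional as F

# Hypothetical unnormalised 1-D Gaussian model:
# log phi(x) = -0.5 * prec * (x - m)^2 + c, where c is a learned
# parameter playing the role of the log normalising constant.
params = torch.tensor([0.0, 1.0, 0.0], requires_grad=True)  # m, log prec, c

def log_model(x):
    m, log_prec, c = params
    return -0.5 * torch.exp(log_prec) * (x - m) ** 2 + c

def log_noise(x):  # known noise density: standard normal
    return -0.5 * x ** 2 - 0.5 * torch.log(torch.tensor(2 * torch.pi))

opt = torch.optim.Adam([params], lr=1e-2)
x_data = 1.0 + 0.5 * torch.randn(1000)        # data ~ N(1, 0.5^2)
for step in range(500):
    x_noise = torch.randn(1000)               # equally many noise samples
    # NCE: classify data vs noise, with the log density ratio as the logit.
    logit_data = log_model(x_data) - log_noise(x_data)
    logit_noise = log_model(x_noise) - log_noise(x_noise)
    loss = F.binary_cross_entropy_with_logits(logit_data, torch.ones_like(logit_data)) \
         + F.binary_cross_entropy_with_logits(logit_noise, torch.zeros_like(logit_noise))
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the logistic loss is only minimised when the logit equals the true log ratio, the normalising constant is estimated along with the other parameters rather than computed by integration.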

Generative Ratio Matching Networks

no code implementations • ICLR 2020 • Akash Srivastava, Kai Xu, Michael U. Gutmann, Charles Sutton

In this work, we take their insight of using kernels as fixed adversaries further and present a novel method for training deep generative models that does not involve saddlepoint optimization.

VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning

1 code implementation • NeurIPS 2017 • Akash Srivastava, Lazar Valkov, Chris Russell, Michael U. Gutmann, Charles Sutton

Deep generative models provide powerful tools for modelling distributions over complicated manifolds, such as those of natural images.

Efficient acquisition rules for model-based approximate Bayesian computation

no code implementations • 3 Apr 2017 • Marko Järvenpää, Michael U. Gutmann, Arijus Pleska, Aki Vehtari, Pekka Marttinen

We propose to compute the uncertainty in the ABC posterior density that is due to having too few simulations to estimate this quantity accurately, and define a loss function that measures this uncertainty.

Bayesian Inference Bayesian Optimisation +1

Simultaneous Estimation of Non-Gaussian Components and their Correlation Structure

no code implementations • 18 Jun 2015 • Hiroaki Sasaki, Michael U. Gutmann, Hayaru Shouno, Aapo Hyvärinen

The precision matrix of the linear components is assumed to be randomly generated by a higher-order process and explicitly parametrized by a parameter matrix.

Likelihood-free inference via classification

no code implementations • 18 Jul 2014 • Michael U. Gutmann, Ritabrata Dutta, Samuel Kaski, Jukka Corander

Increasingly complex generative models are being used across disciplines because they allow for realistic characterization of data, but a common difficulty with them is the prohibitively large computational cost of evaluating the likelihood function and thus of performing likelihood-based statistical inference.

Bayesian Inference Classification +1
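
A minimal sketch of the classification idea (hypothetical simulator): the harder it is for a classifier to tell observed from simulated data, the better the candidate parameter, so cross-validated classification accuracy can serve as the discrepancy that drives likelihood-free inference.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulator(theta, n=200):
    # Hypothetical simulator: 2-D Gaussian with mean (theta, theta).
    return rng.normal(theta, 1.0, size=(n, 2))

x_obs = simulator(1.0)  # pretend this is the observed data

def classifier_discrepancy(theta):
    x_sim = simulator(theta)
    X = np.vstack([x_obs, x_sim])
    y = np.r_[np.ones(len(x_obs)), np.zeros(len(x_sim))]
    # Cross-validated accuracy: 0.5 = indistinguishable, 1.0 = fully separable.
    return cross_val_score(LogisticRegression(), X, y, cv=5).mean()

grid = np.linspace(-1.0, 3.0, 21)
disc = [classifier_discrepancy(t) for t in grid]
print("best theta ~", grid[int(np.argmin(disc))])  # accuracy closest to chance
```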

Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models

no code implementations • 14 Jan 2015 • Michael U. Gutmann, Jukka Corander

The strategy is implemented using Bayesian optimization and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.

Bayesian Optimization
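
A simplified sketch of the strategy (a plain GP surrogate with a lower-confidence-bound rule stands in for the paper's full BOLFI machinery; simulator and discrepancy are hypothetical): model the discrepancy as a function of the parameter with a Gaussian process and spend each new simulation where the surrogate suggests the discrepancy is low.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
x_obs = rng.normal(1.0, 1.0, size=100)               # pretend observed data

def discrepancy(theta):
    # Hypothetical simulator + discrepancy: distance between sample means.
    x_sim = rng.normal(theta, 1.0, size=100)
    return abs(x_sim.mean() - x_obs.mean())

thetas = list(rng.uniform(-2, 4, size=5))            # initial design points
ds = [discrepancy(t) for t in thetas]
grid = np.linspace(-2, 4, 200).reshape(-1, 1)

for _ in range(20):                                  # tens of simulations, not millions
    gp = GaussianProcessRegressor(RBF(length_scale=1.0), normalize_y=True)
    gp.fit(np.reshape(thetas, (-1, 1)), ds)
    mu, sd = gp.predict(grid, return_std=True)
    t_next = float(grid[np.argmin(mu - 1.96 * sd)])  # lower confidence bound
    thetas.append(t_next)
    ds.append(discrepancy(t_next))

print("estimated minimiser ~", thetas[int(np.argmin(ds))])
```

The surrogate is what buys the orders-of-magnitude reduction in simulations: each simulator call is placed where it is most informative about the low-discrepancy region, rather than drawn blindly.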

Classification and Bayesian Optimization for Likelihood-Free Inference

no code implementations • 19 Feb 2015 • Michael U. Gutmann, Jukka Corander, Ritabrata Dutta, Samuel Kaski

This approach faces at least two major difficulties: the first is the choice of the discrepancy measure that is used to judge whether the simulated data resemble the observed data.

Bayesian Optimization Classification +1

Dynamic Likelihood-free Inference via Ratio Estimation (DIRE)

no code implementations • 23 Oct 2018 • Traiko Dinev, Michael U. Gutmann

While several methods for choosing summary statistics have been proposed for ABC, the literature for synthetic likelihood and LFIRE is very thin to date.

Bayesian Inference Time Series +1

Adaptive Gaussian Copula ABC

no code implementations • 27 Feb 2019 • Yanzhi Chen, Michael U. Gutmann

Approximate Bayesian computation (ABC) is a set of techniques for Bayesian inference when the likelihood is intractable but sampling from the model is possible.

Bayesian Inference
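
For context, a minimal numpy sketch of plain rejection ABC, the baseline that adaptive copula variants build on (hypothetical simulator, prior, and tolerance): draw parameters from the prior, simulate, and keep the draws whose simulated summaries land within a tolerance of the observed ones.

```python
import numpy as np

rng = np.random.default_rng(0)
x_obs = rng.normal(1.5, 1.0, size=100)             # pretend observed data
s_obs = x_obs.mean()                                # summary statistic

def abc_rejection(n_draws=50_000, eps=0.05):
    theta = rng.normal(0.0, 2.0, size=n_draws)      # prior: N(0, 2^2)
    x_sim = rng.normal(theta[:, None], 1.0, size=(n_draws, 100))
    s_sim = x_sim.mean(axis=1)                      # summary of each simulation
    return theta[np.abs(s_sim - s_obs) < eps]       # keep draws with close summaries

post = abc_rejection()
print(f"accepted {len(post)} draws, posterior mean ~ {post.mean():.2f}")
```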

Neural Approximate Sufficient Statistics for Likelihood-free Inference

no code implementations • ICLR 2021 • Yanzhi Chen, Dinghuai Zhang, Michael U. Gutmann, Aaron Courville, Zhanxing Zhu

We consider the fundamental problem of how to automatically construct summary statistics for likelihood-free inference, where evaluation of the likelihood function is intractable but sampling/simulating data from the model is possible.

Scaling Densities For Improved Density Ratio Estimation

no code implementations • 29 Sep 2021 • Akash Srivastava, Seungwook Han, Benjamin Rhodes, Kai Xu, Michael U. Gutmann

As such, estimating density ratios accurately using only samples from $p$ and $q$ is of high significance and has led to a flurry of recent work in this direction.

Binary Classification Density Ratio Estimation

Bayesian Optimization with Informative Covariance

no code implementations • 4 Aug 2022 • Afonso Eduardo, Michael U. Gutmann

Bayesian optimization is a methodology for global optimization of unknown and expensive objectives.

Bayesian Optimization Gaussian Processes +1

Estimating the Density Ratio between Distributions with High Discrepancy using Multinomial Logistic Regression

no code implementations • 1 May 2023 • Akash Srivastava, Seungwook Han, Kai Xu, Benjamin Rhodes, Michael U. Gutmann

We show that if these auxiliary densities are constructed such that they overlap with $p$ and $q$, then a multi-class logistic regression allows for estimating $\log p/q$ on the domain of any of the $K+2$ distributions and resolves the distribution shift problems of the current state-of-the-art methods.

Binary Classification Density Ratio Estimation +4
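
A minimal sketch of the logit-difference idea with a single auxiliary density (hypothetical densities; the paper constructs $K$ auxiliary densities and analyses their placement): train a multinomial logistic regression over samples from $p$, $q$, and a bridging density, then read off $\log p/q$ as the difference of the corresponding log class probabilities.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
xs = np.concatenate([
    rng.normal(-3, 1, n),    # class 0: samples from p = N(-3, 1)
    rng.normal(3, 1, n),     # class 1: samples from q = N(3, 1)
    rng.normal(0, 3, n),     # class 2: auxiliary density overlapping both
])
ys = np.repeat([0, 1, 2], n)
X = np.c_[xs, xs**2]         # quadratic features: exact for Gaussian log ratios

clf = LogisticRegression(max_iter=1000).fit(X, ys)

def log_ratio(x):
    # With equal class counts, log p(x)/q(x) = log P(y=0|x) - log P(y=1|x).
    lp = clf.predict_log_proba(np.c_[x, x**2])
    return lp[:, 0] - lp[:, 1]

x = np.array([1.0])
print(log_ratio(x))          # estimate of log p/q at x = 1
print(-6.0 * x)              # ground truth: log N(x; -3, 1) - log N(x; 3, 1) = -6x
```

A plain binary classifier between $p$ and $q$ would see almost no overlapping samples here; the bridging class supplies training data across the gap, which is the distribution-shift problem the multi-class construction addresses.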

Improving Variational Autoencoder Estimation from Incomplete Data with Mixture Variational Families

no code implementations • 5 Mar 2024 • Vaidotas Simkus, Michael U. Gutmann

The increased complexity may adversely affect the fit of the model due to a mismatch between the variational and model posterior distributions.

Imputation
