Search Results for author: Iain Murray

Found 37 papers, 21 papers with code

Regularising Fisher Information Improves Cross-lingual Generalisation

no code implementations EMNLP (MRL) 2021 Asa Cooper Stickland, Iain Murray

Many recent works use ‘consistency regularisation’ to improve the generalisation of fine-tuned pre-trained models, both multilingual and English-only.

Maximum Likelihood Training of Score-Based Diffusion Models

2 code implementations NeurIPS 2021 Yang Song, Conor Durkan, Iain Murray, Stefano Ermon

Score-based diffusion models synthesize samples by reversing a stochastic process that diffuses data to noise, and are trained by minimizing a weighted combination of score matching losses.

Ranked #5 on Image Generation on ImageNet 32x32 (bpd metric)

Data Augmentation Image Generation

Diverse Ensembles Improve Calibration

no code implementations 8 Jul 2020 Asa Cooper Stickland, Iain Murray

Modern deep neural networks can produce badly calibrated predictions, especially when train and test distributions are mismatched.

Data Augmentation

Density Deconvolution with Normalizing Flows

1 code implementation 16 Jun 2020 Tim Dockhorn, James A. Ritchie, Yao-Liang Yu, Iain Murray

Density deconvolution is the task of estimating a probability density function given only noise-corrupted samples.

Density Estimation Variational Inference

Ordering Dimensions with Nested Dropout Normalizing Flows

1 code implementation 15 Jun 2020 Artur Bekasov, Iain Murray

Like in PCA, the leading latent dimensions define a sequence of manifolds that lie close to the data.

On Contrastive Learning for Likelihood-free Inference

5 code implementations ICML 2020 Conor Durkan, Iain Murray, George Papamakarios

Likelihood-free methods perform parameter inference in stochastic simulator models where evaluating the likelihood is intractable but sampling synthetic data is possible.

Contrastive Learning

Scalable Extreme Deconvolution

1 code implementation 26 Nov 2019 James A. Ritchie, Iain Murray

The Extreme Deconvolution method fits a probability density to a dataset where each observation has Gaussian noise added with a known sample-specific covariance, originally intended for use with astronomical datasets.

CloudLSTM: A Recurrent Neural Model for Spatiotemporal Point-cloud Stream Forecasting

no code implementations 29 Jul 2019 Chaoyun Zhang, Marco Fiore, Iain Murray, Paul Patras

This paper introduces CloudLSTM, a new branch of recurrent neural models tailored to forecasting over data streams generated by geospatial point-cloud sources.

Neural Spline Flows

7 code implementations NeurIPS 2019 Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios

A normalizing flow models a complex probability density as an invertible transformation of a simple base density.

Density Estimation Variational Inference
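The one-line summary above is the change-of-variables rule. A minimal NumPy sketch, using a plain affine transform as a stand-in for the paper's monotonic spline transforms (the `scale` and `shift` parameters are illustrative, not part of the paper's model):

```python
import numpy as np

def flow_log_prob(x, scale, shift):
    """Log-density of x under an affine flow x = z * scale + shift,
    with a standard normal base density on z.

    A toy stand-in for the spline transforms in Neural Spline Flows;
    only the change-of-variables logic is shown."""
    z = (x - shift) / scale                        # invert the transform
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))   # standard normal log-density
    log_det = -np.log(np.abs(scale))               # log |dz/dx|
    return log_base + log_det
```

Composing several such invertible layers with learned parameters is what gives a flow its flexibility; spline layers make each step far more expressive than an affine map while keeping the inverse and Jacobian cheap.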

Cubic-Spline Flows

no code implementations 5 Jun 2019 Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios

A normalizing flow models a complex probability density as an invertible transformation of a simple density.

Density Estimation

Dynamic Evaluation of Transformer Language Models

1 code implementation 17 Apr 2019 Ben Krause, Emmanuel Kahembwe, Iain Murray, Steve Renals

This research note combines two methods that have recently improved the state of the art in language modeling: Transformers and dynamic evaluation.

Language Modelling

Bayesian Adversarial Spheres: Bayesian Inference and Adversarial Examples in a Noiseless Setting

no code implementations 29 Nov 2018 Artur Bekasov, Iain Murray

Modern deep neural network models suffer from adversarial examples, i.e. confidently misclassified points in the input space.

Bayesian Inference

Sequential Neural Methods for Likelihood-free Inference

no code implementations 21 Nov 2018 Conor Durkan, George Papamakarios, Iain Murray

Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators.

Mode Normalization

2 code implementations ICLR 2019 Lucas Deecke, Iain Murray, Hakan Bilen

Normalization methods are a central building block in the deep learning toolbox.

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

10 code implementations 18 May 2018 George Papamakarios, David C. Sterratt, Iain Murray

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible.

Bayesian Inference
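For context on "likelihood intractable but simulating possible": the classical baseline these sequential neural methods aim to improve on is rejection ABC, which never evaluates a likelihood at all. A toy sketch (the helper names and scalar-data setting are illustrative, not the paper's method):

```python
import numpy as np

def rejection_abc(simulate, prior_sample, x_obs, eps, n, rng):
    """Baseline rejection ABC: keep parameter draws whose simulated
    data lands within eps of the observed data. Wasteful when eps is
    small, which is the inefficiency SNL-style methods address."""
    accepted = []
    while len(accepted) < n:
        theta = prior_sample(rng)      # draw parameters from the prior
        x = simulate(theta, rng)       # run the simulator
        if abs(x - x_obs) < eps:       # crude distance-based accept
            accepted.append(theta)
    return np.array(accepted)
```

SNL instead fits a neural conditional density estimate of the likelihood from simulated (parameter, data) pairs and uses it inside standard MCMC, as the paper describes.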

Model Criticism in Latent Space

1 code implementation 13 Nov 2017 Sohan Seth, Iain Murray, Christopher K. I. Williams

Model criticism is usually carried out by assessing if replicated data generated under the fitted model looks similar to the observed data, see e.g. Gelman, Carlin, Stern, and Rubin [2004, p. 165].

Gaussian Processes

Masked Autoregressive Flow for Density Estimation

13 code implementations NeurIPS 2017 George Papamakarios, Theo Pavlakou, Iain Murray

By constructing a stack of autoregressive models, each modelling the random numbers of the next model in the stack, we obtain a type of normalizing flow suitable for density estimation, which we call Masked Autoregressive Flow.

Density Estimation
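The stacking idea is easiest to see in one layer: each dimension is shifted and scaled by functions of the earlier dimensions, so the Jacobian is triangular and the density is cheap. A 2-D sketch with hypothetical linear conditioners standing in for the paper's masked networks:

```python
import numpy as np

def maf_log_prob(x, w=0.5, a=0.1):
    """Toy 2-D Masked Autoregressive Flow density.
    Dimension 1 is standard normal; dimension 2 uses (hypothetical)
    linear conditioners of x1:
        mu2 = w * x1,   log_sigma2 = a * x1,
        u2  = (x2 - mu2) * exp(-log_sigma2).
    In the paper these conditioners are masked networks (MADE);
    linear functions keep the sketch short."""
    u1 = x[0]
    log_sigma2 = a * x[0]
    u2 = (x[1] - w * x[0]) * np.exp(-log_sigma2)
    log_base = -0.5 * (u1**2 + u2**2 + 2.0 * np.log(2.0 * np.pi))
    log_det = -log_sigma2   # Jacobian du/dx is triangular, unit first entry
    return log_base + log_det
```

Stacking several layers, each modelling the "random numbers" u of the next, is exactly the construction in the abstract.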

Aye or naw, whit dae ye hink? Scottish independence and linguistic identity on social media

no code implementations EACL 2017 Philippa Shoemark, Debnil Sur, Luke Shrimpton, Iain Murray, Sharon Goldwater

Political surveys have indicated a relationship between a sense of Scottish identity and voting decisions in the 2014 Scottish Independence Referendum.

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

1 code implementation NeurIPS 2016 George Papamakarios, Iain Murray

In some cases, learning an accurate parametric representation of the entire true posterior distribution requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.

Density Estimation

Markov Chain Truncation for Doubly-Intractable Inference

no code implementations 15 Oct 2016 Colin Wei, Iain Murray

Computing partition functions, the normalizing constants of probability distributions, is often hard.

Multiplicative LSTM for sequence modelling

1 code implementation 26 Sep 2016 Ben Krause, Liang Lu, Iain Murray, Steve Renals

We introduce multiplicative LSTM (mLSTM), a recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures.

Density Estimation Language Modelling
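A sketch of the combination, assuming the formulation in which an input-dependent intermediate state m_t = (W_mx x_t) ⊙ (W_mh h_{t-1}) replaces h_{t-1} in the standard LSTM gate equations (weight names are illustrative; biases omitted for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlstm_step(x, h, c, p):
    """One multiplicative-LSTM step: the multiplicative state m gives
    each input its own, input-dependent recurrent transition, then the
    usual LSTM gating proceeds with m in place of h."""
    m = (p['Wmx'] @ x) * (p['Wmh'] @ h)        # multiplicative state
    i = sigmoid(p['Wix'] @ x + p['Wim'] @ m)   # input gate
    f = sigmoid(p['Wfx'] @ x + p['Wfm'] @ m)   # forget gate
    o = sigmoid(p['Wox'] @ x + p['Wom'] @ m)   # output gate
    g = np.tanh(p['Wgx'] @ x + p['Wgm'] @ m)   # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```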

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

1 code implementation 20 May 2016 George Papamakarios, Iain Murray

In some cases, learning an accurate parametric representation of the entire true posterior distribution requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.

Density Estimation

Neural Autoregressive Distribution Estimation

3 code implementations 7 May 2016 Benigno Uria, Marc-Alexandre Côté, Karol Gregor, Iain Murray, Hugo Larochelle

We present Neural Autoregressive Distribution Estimation (NADE) models, which are neural network architectures applied to the problem of unsupervised distribution and density estimation.

Density Estimation Image Generation
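The autoregressive factorisation p(x) = ∏_d p(x_d | x_<d) with tied weights can be evaluated in O(DH) by reusing a running hidden pre-activation. A sketch for binary data (parameter names follow common NADE notation; an illustration, not the paper's full family of models):

```python
import numpy as np

def nade_log_prob(x, W, V, b, c):
    """Log-likelihood of a binary vector under a NADE-style model.
    The hidden pre-activation for dimension d is the running sum
    a_d = c + sum_{j<d} W[:, j] * x[j], updated in O(H) per step."""
    a = c.copy()
    ll = 0.0
    for d in range(len(x)):
        h = 1.0 / (1.0 + np.exp(-a))                   # sigmoid hiddens
        p = 1.0 / (1.0 + np.exp(-(b[d] + V[d] @ h)))   # p(x_d=1 | x_<d)
        ll += np.log(p) if x[d] == 1 else np.log(1.0 - p)
        a += W[:, d] * x[d]                            # extend the context
    return ll
```

Because each factor is a valid conditional, the probabilities of all binary vectors sum to one by construction.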

Differentiation of the Cholesky decomposition

4 code implementations 24 Feb 2016 Iain Murray

We review strategies for differentiating matrix-based computations, and derive symbolic and algorithmic update rules for differentiating expressions containing the Cholesky decomposition.

Computation Mathematical Software
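One widely used identity in this area is the forward-mode rule dL = L Φ(L⁻¹ dΣ L⁻ᵀ), where Φ keeps the lower triangle and halves the diagonal. A NumPy sketch of that identity (a dense illustration, not the blocked in-place routines the paper derives):

```python
import numpy as np

def chol_diff(L, dS):
    """Forward-mode derivative of the Cholesky decomposition.
    Given L with S = L @ L.T and a symmetric perturbation dS, return
    dL such that cholesky(S + dS) ~= L + dL to first order, via
    dL = L @ Phi(L^{-1} dS L^{-T})."""
    M = np.linalg.solve(L, np.linalg.solve(L, dS).T).T  # L^{-1} dS L^{-T}
    Phi = np.tril(M)
    Phi[np.diag_indices_from(Phi)] *= 0.5               # halve the diagonal
    return L @ Phi
```

A finite-difference comparison is a quick sanity check on such rules before wiring them into a larger autodiff system.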

MADE: Masked Autoencoder for Distribution Estimation

15 code implementations 12 Feb 2015 Mathieu Germain, Karol Gregor, Iain Murray, Hugo Larochelle

There has been a lot of recent interest in designing neural network models to estimate a distribution from a set of examples.

Density Estimation Image Generation
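MADE's key device is a pair of binary masks that let output d see only inputs with lower "degree", so the autoencoder's outputs define valid autoregressive conditionals. A sketch for one hidden layer (layer sizes and the random degree assignment are illustrative):

```python
import numpy as np

def made_masks(D, H, rng):
    """Binary masks for a one-hidden-layer MADE.
    Input d has degree d (1..D); each hidden unit draws a degree in
    1..D-1. A hidden unit may see inputs with degree <= its own, and
    output d may see hidden units with degree < d."""
    m = rng.integers(1, D, size=H)                        # hidden degrees
    deg_in = np.arange(1, D + 1)
    M1 = (m[:, None] >= deg_in[None, :]).astype(float)    # H x D
    M2 = (deg_in[:, None] > m[None, :]).astype(float)     # D x H
    return M1, M2

rng = np.random.default_rng(0)
D, H = 5, 16
M1, M2 = made_masks(D, H, rng)
W1 = rng.standard_normal((H, D)) * M1   # masked weights
W2 = rng.standard_normal((D, H)) * M2

def forward(x):
    h = np.tanh(W1 @ x)
    return W2 @ h    # output d depends only on x[:d]
```

The dependency structure can be verified directly: perturbing input j must leave outputs 0..j unchanged.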

Incorporating Side Information in Probabilistic Matrix Factorization with Gaussian Processes

no code implementations 9 Aug 2014 Ryan Prescott Adams, George E. Dahl, Iain Murray

Probabilistic matrix factorization (PMF) is a powerful method for modeling data associated with pairwise relationships, finding use in collaborative filtering, computational biology, and document analysis, among other areas.

Collaborative Filtering Gaussian Processes

A Deep and Tractable Density Estimator

no code implementations 7 Oct 2013 Benigno Uria, Iain Murray, Hugo Larochelle

We can thus use the most convenient model for each inference task at hand, and ensembles of such models with different orderings are immediately available.

Density Estimation Image Generation

Parallel MCMC with Generalized Elliptical Slice Sampling

no code implementations 28 Oct 2012 Robert Nishihara, Iain Murray, Ryan P. Adams

Probabilistic models are conceptually powerful tools for finding structure in data, but their practical effectiveness is often limited by our ability to perform inference in them.

How biased are maximum entropy models?

no code implementations NeurIPS 2011 Jakob H. Macke, Iain Murray, Peter E. Latham

However, maximum entropy models fit to small data sets can be subject to sampling bias; i.e. the true entropy of the data can be severely underestimated.

Small Data Image Classification

Slice sampling covariance hyperparameters of latent Gaussian models

no code implementations NeurIPS 2010 Iain Murray, Ryan P. Adams

The Gaussian process (GP) is a popular way to specify dependencies between random variables in a probabilistic model.

Elliptical slice sampling

1 code implementation 31 Dec 2009 Iain Murray, Ryan Prescott Adams, David J. C. MacKay

Many probabilistic models introduce strong dependencies between variables using a latent multivariate Gaussian distribution or a Gaussian process.
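The resulting sampler is short enough to state in full. A sketch for a state with a zero-mean Gaussian prior and no tuning parameters (the helper-function names are illustrative):

```python
import numpy as np

def elliptical_slice_step(f, log_lik, sample_prior, rng):
    """One transition of elliptical slice sampling.
    `sample_prior` draws from the zero-mean Gaussian prior; `log_lik`
    evaluates the log-likelihood. Proposals lie on an ellipse through
    the current state f and an auxiliary prior draw, with the angle
    bracket shrunk on rejection, so every step eventually accepts."""
    nu = sample_prior(rng)                        # auxiliary ellipse point
    log_y = log_lik(f) + np.log(rng.uniform())    # slice height
    theta = rng.uniform(0.0, 2.0 * np.pi)
    theta_min, theta_max = theta - 2.0 * np.pi, theta
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new                          # on the slice: accept
        if theta < 0.0:                           # shrink the bracket
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

With a flat likelihood the update reduces to a rotation between the state and a fresh prior draw, so the chain leaves the prior invariant, a handy correctness check.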

The Gaussian Process Density Sampler

no code implementations NeurIPS 2008 Iain Murray, David MacKay, Ryan P. Adams

Samples drawn from the GPDS are consistent with exact, independent samples from a fixed density function that is a transformation of a function drawn from a Gaussian process prior.

Density Estimation

Evaluating probabilities under high-dimensional latent variable models

no code implementations NeurIPS 2008 Iain Murray, Ruslan R. Salakhutdinov

We present a simple new Monte Carlo algorithm for evaluating probabilities of observations in complex latent variable models, such as Deep Belief Networks.
