Search Results for author: Frank Wood

Found 76 papers, 27 papers with code

All in the (Exponential) Family: Information Geometry and Thermodynamic Variational Inference

no code implementations ICML 2020 Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan

While the Evidence Lower Bound (ELBO) has become a ubiquitous objective for variational inference, the recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a tighter and more general family of bounds.

Variational Inference
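For context, the TVO family is built on the thermodynamic integration identity, which writes the log evidence as a one-dimensional integral along a geometric path between the variational distribution and the unnormalized posterior. A standard statement of this identity (not specific to this paper's information-geometric extensions):

```latex
% Geometric path between variational distribution and unnormalized posterior:
%   \pi_\beta(z) \propto q(z \mid x)^{1-\beta} \, p(x, z)^{\beta}
\log p(x) = \int_0^1 \mathbb{E}_{\pi_\beta}\!\left[ \log \frac{p(x, z)}{q(z \mid x)} \right] \mathrm{d}\beta
```

Left Riemann sums of this integral yield lower bounds (the ELBO is the single-point sum at β = 0), while right Riemann sums yield upper bounds.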

q-Paths: Generalizing the Geometric Annealing Path using Power Means

1 code implementation 1 Jul 2021 Vaden Masrani, Rob Brekelmans, Thang Bui, Frank Nielsen, Aram Galstyan, Greg Ver Steeg, Frank Wood

Many common machine learning methods involve the geometric annealing path, a sequence of intermediate densities between two distributions of interest constructed using the geometric average.

Bayesian Inference
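The power-mean generalization in the title can be sketched as follows; the geometric path is recovered from the q-path family in the limit q → 1 (notation follows the abstract; a hedged paraphrase, not a full statement of the paper's results):

```latex
% Geometric annealing path (the q -> 1 limit of the q-path):
\pi_\beta(z) \propto \pi_0(z)^{1-\beta} \, \pi_1(z)^{\beta}
% q-path: power mean of the endpoint densities with exponent 1 - q:
\pi_\beta^{(q)}(z) \propto \left[ (1-\beta)\, \pi_0(z)^{1-q} + \beta\, \pi_1(z)^{1-q} \right]^{\frac{1}{1-q}}
```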

Differentiable Particle Filtering without Modifying the Forward Pass

no code implementations 18 Jun 2021 Adam Ścibior, Vaden Masrani, Frank Wood

In recent years, particle filters have been used as components in systems optimized end-to-end with gradient descent.

Image Completion via Inference in Deep Generative Models

no code implementations 24 Feb 2021 William Harvey, Saeid Naderiparizi, Frank Wood

We consider image completion from the perspective of amortized inference in an image generative model.

Near-Optimal Glimpse Sequences for Training Hard Attention Neural Networks

no code implementations 1 Jan 2021 William Harvey, Michael Teng, Frank Wood

We introduce methodology from the BOED literature to approximate this optimal behaviour, and use it to generate "near-optimal" sequences of attention locations.

General Classification Image Classification

Robust Asymmetric Learning in POMDPs

1 code implementation 31 Dec 2020 Andrew Warrington, J. Wilder Lavington, Adam Ścibior, Mark Schmidt, Frank Wood

Policies for partially observed Markov decision processes can be efficiently learned by imitating policies for the corresponding fully observed Markov decision processes.

Imitation Learning

Annealed Importance Sampling with q-Paths

1 code implementation 14 Dec 2020 Rob Brekelmans, Vaden Masrani, Thang Bui, Frank Wood, Aram Galstyan, Greg Ver Steeg, Frank Nielsen

Annealed importance sampling (AIS) is the gold standard for estimating partition functions or marginal likelihoods, corresponding to importance sampling over a path of distributions between a tractable base and an unnormalized target.
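As a minimal illustration of the AIS recipe described here, the sketch below does importance sampling over a geometric path with simple Metropolis transitions; the base, target, schedule, and step size are illustrative assumptions, not the paper's q-path construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_base(x):            # tractable base density (unnormalized): N(0, 1)
    return -0.5 * x**2

def log_target(x):          # unnormalized target (illustrative): N(3, 0.25)
    return -0.5 * (x - 3.0)**2 / 0.25

def log_gamma(x, beta):     # geometric path between base and target
    return (1.0 - beta) * log_base(x) + beta * log_target(x)

def ais(n_chains=1000, n_steps=100, step=0.5):
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal(n_chains)       # exact samples from the base
    log_w = np.zeros(n_chains)
    for b_prev, b_next in zip(betas[:-1], betas[1:]):
        # accumulate the incremental importance weight, then move the chains
        log_w += log_gamma(x, b_next) - log_gamma(x, b_prev)
        prop = x + step * rng.standard_normal(n_chains)
        accept = np.log(rng.random(n_chains)) < log_gamma(prop, b_next) - log_gamma(x, b_next)
        x = np.where(accept, prop, x)
    # log-mean-exp of the weights estimates log(Z_target / Z_base)
    return np.logaddexp.reduce(log_w) - np.log(n_chains)

print(ais())   # true value here is 0.5 * log(0.25), roughly -0.69
```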

Ensemble Squared: A Meta AutoML System

no code implementations 10 Dec 2020 Jason Yoo, Tony Joseph, Dylan Yung, S. Ali Nasseri, Frank Wood

There are currently many barriers that prevent non-experts from exploiting machine learning solutions ranging from the lack of intuition on statistical learning techniques to the trickiness of hyperparameter tuning.

AutoML

Gaussian Process Bandit Optimization of the Thermodynamic Variational Objective

1 code implementation NeurIPS 2020 Vu Nguyen, Vaden Masrani, Rob Brekelmans, Michael A. Osborne, Frank Wood

Achieving the full promise of the Thermodynamic Variational Objective (TVO), a recently proposed variational lower bound on the log evidence involving a one-dimensional Riemann integral approximation, requires choosing a "schedule" of sorted discretization points.

Uncertainty in Neural Processes

no code implementations 8 Oct 2020 Saeid Naderiparizi, Kenny Chiu, Benjamin Bloem-Reddy, Frank Wood

We intend this work as a counterpoint to a recent trend in the literature that stresses achieving good samples when the amount of conditioning data is large.

Assisting the Adversary to Improve GAN Training

no code implementations 3 Oct 2020 Andreas Munk, William Harvey, Frank Wood

Some of the most popular methods for improving the stability and performance of GANs involve constraining or regularizing the discriminator.

All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference

1 code implementation 1 Jul 2020 Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan

We propose to choose intermediate distributions using equal spacing in the moment parameters of our exponential family, which matches grid search performance and allows the schedule to adaptively update over the course of training.

Variational Inference

Semi-supervised Sequential Generative Models

no code implementations 30 Jun 2020 Michael Teng, Tuan Anh Le, Adam Scibior, Frank Wood

We introduce a novel objective for training deep generative time-series models with discrete latent variables for which supervision is only sparsely available.

Time Series

Enhancing Few-Shot Image Classification with Unlabelled Examples

no code implementations 17 Jun 2020 Peyman Bateni, Jarred Barber, Jan-Willem van de Meent, Frank Wood

We develop a transductive meta-learning method that uses unlabelled instances to improve few-shot image classification performance.

Classification Few-Shot Image Classification +1

Planning as Inference in Epidemiological Models

1 code implementation 30 Mar 2020 Frank Wood, Andrew Warrington, Saeid Naderiparizi, Christian Weilbach, Vaden Masrani, William Harvey, Adam Scibior, Boyan Beronov, John Grefenstette, Duncan Campbell, Ali Nasseri

In this work we demonstrate how to automate parts of the infectious disease-control policy-making process by performing inference in existing epidemiological models.

Probabilistic Programming

Coping With Simulators That Don't Always Return

1 code implementation 28 Mar 2020 Andrew Warrington, Saeid Naderiparizi, Frank Wood

Deterministic models are approximations of reality that are easy to interpret and often easier to build than stochastic alternatives.

Improved Few-Shot Visual Classification

1 code implementation CVPR 2020 Peyman Bateni, Raghav Goyal, Vaden Masrani, Frank Wood, Leonid Sigal

Few-shot learning is a fundamental task in computer vision that carries the promise of alleviating the need for exhaustively labeled data.

Classification Few-Shot Image Classification +2

Attention for Inference Compilation

no code implementations 25 Oct 2019 William Harvey, Andreas Munk, Atılım Güneş Baydin, Alexander Bergholm, Frank Wood

We present a new approach to automatic amortized inference in universal probabilistic programs which improves performance compared to current methods.

Deep Probabilistic Surrogate Networks for Universal Simulator Approximation

no code implementations 25 Oct 2019 Andreas Munk, Adam Ścibior, Atılım Güneş Baydin, Andrew Stewart, Goran Fernlund, Anoush Poursartip, Frank Wood

We present a framework for automatically structuring and training fast, approximate, deep neural surrogates of existing stochastic simulators.

Probabilistic Programming

Safer End-to-End Autonomous Driving via Conditional Imitation Learning and Command Augmentation

no code implementations 20 Sep 2019 Renhao Wang, Adam Scibior, Frank Wood

On top of that, we extend our model with an additional latent variable and augment the dataset to train a controller that is robust to unsafe commands, such as asking it to turn into a wall.

Autonomous Driving Imitation Learning

The Virtual Patch Clamp: Imputing C. elegans Membrane Potentials from Calcium Imaging

no code implementations 24 Jul 2019 Andrew Warrington, Arthur Spencer, Frank Wood

We develop a stochastic whole-brain and body simulator of the nematode roundworm Caenorhabditis elegans (C. elegans) and show that it is sufficiently regularizing to allow imputation of latent membrane potentials from partial calcium fluorescence imaging observations.

Imputation

Amortized Monte Carlo Integration

1 code implementation 18 Jul 2019 Adam Goliński, Frank Wood, Tom Rainforth

At runtime, samples are produced separately from each amortized proposal, before being combined into an overall estimate of the expectation.

Bayesian Inference

The Thermodynamic Variational Objective

1 code implementation NeurIPS 2019 Vaden Masrani, Tuan Anh Le, Frank Wood

We introduce the thermodynamic variational objective (TVO) for learning in both continuous and discrete deep generative models.

Variational Inference

Near-Optimal Glimpse Sequences for Improved Hard Attention Neural Network Training

no code implementations 13 Jun 2019 William Harvey, Michael Teng, Frank Wood

We introduce methodology from the BOED literature to approximate this optimal behaviour, and use it to generate "near-optimal" sequences of attention locations.

General Classification Image Classification

Revisiting Reweighted Wake-Sleep

no code implementations ICLR 2019 Tuan Anh Le, Adam R. Kosiorek, N. Siddharth, Yee Whye Teh, Frank Wood

Discrete latent-variable models, while applicable in a variety of settings, can often be difficult to learn.

Latent Variable Models

Imitation Learning of Factored Multi-agent Reactive Models

no code implementations 12 Mar 2019 Michael Teng, Tuan Anh Le, Adam Scibior, Frank Wood

We apply recent advances in deep generative modeling to the task of imitation learning from biological agents.

Imitation Learning

LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models

1 code implementation 6 Mar 2019 Yuan Zhou, Bradley J. Gram-Hansen, Tobias Kohn, Tom Rainforth, Hongseok Yang, Frank Wood

We develop a new Low-level, First-order Probabilistic Programming Language (LF-PPL) suited for models containing a mix of continuous, discrete, and/or piecewise-continuous variables.

Probabilistic Programming

Bayesian Distributed Stochastic Gradient Descent

no code implementations NeurIPS 2018 Michael Teng, Frank Wood

We introduce Bayesian distributed stochastic gradient descent (BDSGD), a high-throughput algorithm for training deep neural networks on parallel clusters.

An Introduction to Probabilistic Programming

2 code implementations 27 Sep 2018 Jan-Willem van de Meent, Brooks Paige, Hongseok Yang, Frank Wood

In the context of this restricted PPL we introduce fundamental inference algorithms and describe how they can be implemented in the context of models denoted by probabilistic programs.

Probabilistic Programming

Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model

2 code implementations NeurIPS 2019 Atılım Güneş Baydin, Lukas Heinrich, Wahid Bhimji, Lei Shao, Saeid Naderiparizi, Andreas Munk, Jialin Liu, Bradley Gram-Hansen, Gilles Louppe, Lawrence Meadows, Philip Torr, Victor Lee, Prabhat, Kyle Cranmer, Frank Wood

We present a novel probabilistic programming framework that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose inference engines to record and control random number draws within simulators in a language-agnostic way.

Probabilistic Programming

Inference Trees: Adaptive Inference with Exploration

no code implementations 25 Jun 2018 Tom Rainforth, Yuan Zhou, Xiaoyu Lu, Yee Whye Teh, Frank Wood, Hongseok Yang, Jan-Willem van de Meent

We introduce inference trees (ITs), a new class of inference methods that build on ideas from Monte Carlo tree search to perform adaptive sampling in a manner that balances exploration with exploitation, ensures consistency, and alleviates pathologies in existing adaptive methods.

Deep Variational Reinforcement Learning for POMDPs

1 code implementation ICML 2018 Maximilian Igl, Luisa Zintgraf, Tuan Anh Le, Frank Wood, Shimon Whiteson

Many real-world sequential decision making problems are partially observable by nature, and the environment model is typically unknown.

Decision Making

Revisiting Reweighted Wake-Sleep for Models with Stochastic Control Flow

1 code implementation ICLR 2019 Tuan Anh Le, Adam R. Kosiorek, N. Siddharth, Yee Whye Teh, Frank Wood

Stochastic control-flow models (SCFMs) are a class of generative models that involve branching on choices from discrete random variables.

Hamiltonian Monte Carlo for Probabilistic Programs with Discontinuities

1 code implementation 7 Apr 2018 Bradley Gram-Hansen, Yuan Zhou, Tobias Kohn, Tom Rainforth, Hongseok Yang, Frank Wood

Hamiltonian Monte Carlo (HMC) is arguably the dominant statistical inference algorithm used in most popular "first-order differentiable" Probabilistic Programming Languages (PPLs).

Probabilistic Programming
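For orientation, each HMC proposal simulates Hamiltonian dynamics with a leapfrog integrator, and it is precisely this gradient-based step that first-order PPLs rely on (and that discontinuities break). Below is a minimal sketch of a standard leapfrog HMC transition on an illustrative smooth target, not the paper's discontinuity-aware variant:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p(x):                 # illustrative smooth target: standard normal
    return -0.5 * np.dot(x, x)

def grad_log_p(x):
    return -x

def hmc_step(x, eps=0.1, n_leapfrog=20):
    p = rng.standard_normal(x.shape)                    # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new = p_new + 0.5 * eps * grad_log_p(x_new)       # half momentum step
    for _ in range(n_leapfrog - 1):
        x_new = x_new + eps * p_new                     # full position step
        p_new = p_new + eps * grad_log_p(x_new)         # full momentum step
    x_new = x_new + eps * p_new
    p_new = p_new + 0.5 * eps * grad_log_p(x_new)       # final half momentum step
    # Metropolis accept/reject on the joint (position, momentum) log-density
    log_alpha = (log_p(x_new) - 0.5 * np.dot(p_new, p_new)) \
              - (log_p(x) - 0.5 * np.dot(p, p))
    return x_new if np.log(rng.random()) < log_alpha else x

x, samples = np.zeros(2), []
for _ in range(1000):
    x = hmc_step(x)
    samples.append(x)
```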

High Throughput Synchronous Distributed Stochastic Gradient Descent

no code implementations 12 Mar 2018 Michael Teng, Frank Wood

We introduce a new, high-throughput, synchronous, distributed, data-parallel, stochastic-gradient-descent learning algorithm.

Tighter Variational Bounds are Not Necessarily Better

1 code implementation ICML 2018 Tom Rainforth, Adam R. Kosiorek, Tuan Anh Le, Chris J. Maddison, Maximilian Igl, Frank Wood, Yee Whye Teh

We provide theoretical and empirical evidence that using tighter evidence lower bounds (ELBOs) can be detrimental to the process of learning an inference network by reducing the signal-to-noise ratio of the gradient estimator.
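The analysis can be summarized, in hedged form, by the paper's signal-to-noise-ratio scalings for the K-sample importance-weighted bound estimated with M Monte Carlo samples: the generative parameters θ gain gradient signal with K while the inference-network parameters φ lose it:

```latex
\mathrm{SNR}_\theta = O\!\left(\sqrt{MK}\right), \qquad
\mathrm{SNR}_\phi = O\!\left(\sqrt{M/K}\right)
```

Hence tightening the bound by raising K can starve the inference network of gradient signal.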

Towards a Testable Notion of Generalization for Generative Adversarial Networks

no code implementations ICLR 2018 Robert Cornish, Hongseok Yang, Frank Wood

We consider the question of how to assess generative adversarial networks, in particular with respect to whether or not they generalise beyond memorising the training data.

Faithful Inversion of Generative Models for Effective Amortized Inference

1 code implementation NeurIPS 2018 Stefan Webb, Adam Golinski, Robert Zinkov, N. Siddharth, Tom Rainforth, Yee Whye Teh, Frank Wood

Inference amortization methods share information across multiple posterior-inference problems, allowing each to be carried out more efficiently.

Updating the VESICLE-CNN Synapse Detector

1 code implementation 31 Oct 2017 Andrew Warrington, Frank Wood

The original implementation makes use of a patch-based approach.

On Nesting Monte Carlo Estimators

no code implementations ICML 2018 Tom Rainforth, Robert Cornish, Hongseok Yang, Andrew Warrington, Frank Wood

Many problems in machine learning and statistics involve nested expectations and thus do not permit conventional Monte Carlo (MC) estimation.
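For orientation, a nested expectation and its naive nested Monte Carlo estimator take the standard form:

```latex
I = \mathbb{E}_{p(y)}\!\left[ f\!\left( y, \; \mathbb{E}_{p(z \mid y)}[g(y, z)] \right) \right],
\qquad
\hat{I}_{N,M} = \frac{1}{N} \sum_{n=1}^{N} f\!\left( y_n, \; \frac{1}{M} \sum_{m=1}^{M} g(y_n, z_{n,m}) \right)
```

The estimator is biased for finite M because f is applied to a noisy inner estimate; under the smoothness conditions analyzed in this line of work, the mean squared error behaves like O(1/N + 1/M²).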

Bayesian Optimization for Probabilistic Programs

2 code implementations NeurIPS 2016 Tom Rainforth, Tuan Anh Le, Jan-Willem van de Meent, Michael A. Osborne, Frank Wood

We present the first general purpose framework for marginal maximum a posteriori estimation of probabilistic program variables.

Learning Disentangled Representations with Semi-Supervised Deep Generative Models

1 code implementation NeurIPS 2017 N. Siddharth, Brooks Paige, Jan-Willem van de Meent, Alban Desmaison, Noah D. Goodman, Pushmeet Kohli, Frank Wood, Philip H. S. Torr

We propose to learn such representations using model architectures that generalise from standard VAEs, employing a general graphical model structure in the encoder and decoder.

Representation Learning

Auto-Encoding Sequential Monte Carlo

1 code implementation ICLR 2018 Tuan Anh Le, Maximilian Igl, Tom Rainforth, Tom Jin, Frank Wood

We build on auto-encoding sequential Monte Carlo (AESMC): a method for model and proposal learning based on maximizing the lower bound to the log marginal likelihood in a broad family of structured probabilistic models.
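The lower bound being maximized follows from the unbiasedness of the SMC marginal-likelihood estimate, here written Ẑ_SMC, together with Jensen's inequality:

```latex
\mathcal{L}_{\mathrm{AESMC}} = \mathbb{E}\!\left[ \log \hat{Z}_{\mathrm{SMC}} \right]
\le \log \mathbb{E}\!\left[ \hat{Z}_{\mathrm{SMC}} \right] = \log p(x)
```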

Online Learning Rate Adaptation with Hypergradient Descent

3 code implementations ICLR 2018 Atilim Gunes Baydin, Robert Cornish, David Martinez Rubio, Mark Schmidt, Frank Wood

We introduce a general method for improving the convergence rate of gradient-based optimizers that is easy to implement and works well in practice.

Stochastic Optimization
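The method (hypergradient descent) adapts the learning rate online by gradient descent on the objective with respect to the rate itself; that "hypergradient" reduces to the dot product of successive gradients. A minimal sketch of the SGD variant on an illustrative quadratic (the objective and constants are assumptions for demonstration):

```python
import numpy as np

def f_grad(theta):                       # gradient of an illustrative quadratic objective
    return 2.0 * (theta - 1.0)

theta = np.array([5.0, -3.0])
alpha, beta = 0.01, 0.001                # initial learning rate, hyper-learning rate
g_prev = np.zeros_like(theta)

for _ in range(200):
    g = f_grad(theta)
    alpha += beta * np.dot(g, g_prev)    # hypergradient update of the learning rate
    theta -= alpha * g                   # ordinary SGD step with the adapted rate
    g_prev = g

print(theta, alpha)                      # theta approaches the optimum at 1
```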

Using Synthetic Data to Train Neural Networks is Model-Based Reasoning

no code implementations 2 Mar 2017 Tuan Anh Le, Atilim Gunes Baydin, Robert Zinkov, Frank Wood

We draw a formal connection between using synthetic training data to optimize neural network parameters and approximate, Bayesian, model-based reasoning.

On the Pitfalls of Nested Monte Carlo

no code implementations 3 Dec 2016 Tom Rainforth, Robert Cornish, Hongseok Yang, Frank Wood

In this paper, we analyse the behaviour of nested Monte Carlo (NMC) schemes, for which classical convergence proofs are insufficient.

Inducing Interpretable Representations with Variational Autoencoders

no code implementations 22 Nov 2016 N. Siddharth, Brooks Paige, Alban Desmaison, Jan-Willem van de Meent, Frank Wood, Noah D. Goodman, Pushmeet Kohli, Philip H. S. Torr

We develop a framework for incorporating structured graphical models in the encoders of variational autoencoders (VAEs) that allows us to induce interpretable representations through approximate variational inference.

General Classification Variational Inference

Probabilistic structure discovery in time series data

no code implementations 21 Nov 2016 David Janz, Brooks Paige, Tom Rainforth, Jan-Willem van de Meent, Frank Wood

Existing methods for structure discovery in time series data construct interpretable, compositional kernels for Gaussian process regression models.

Time Series

Inference Compilation and Universal Probabilistic Programming

3 code implementations 31 Oct 2016 Tuan Anh Le, Atilim Gunes Baydin, Frank Wood

We introduce a method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages, establishing a framework that combines the strengths of probabilistic programming and deep learning methods.

Probabilistic Programming
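The amortization objective used here can be summarized as training the proposal network on samples from the model's joint distribution, which minimizes the expected forward KL divergence from posterior to proposal:

```latex
\mathcal{L}(\phi)
= \mathbb{E}_{p(x)}\!\left[ \mathrm{KL}\!\left( p(z \mid x) \,\|\, q(z \mid x; \phi) \right) \right]
= \mathbb{E}_{p(x, z)}\!\left[ -\log q(z \mid x; \phi) \right] + \mathrm{const}
```

Training therefore requires only the ability to simulate (x, z) pairs from the program, not to evaluate the posterior itself.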

Spreadsheet Probabilistic Programming

no code implementations 14 Jun 2016 Mike Wu, Yura Perov, Frank Wood, Hongseok Yang

We demonstrate this by developing a native Excel implementation of both a particle Markov Chain Monte Carlo variant and black-box variational inference for spreadsheet probabilistic programming.

Decision Making Decision Making Under Uncertainty +2

Inference Networks for Sequential Monte Carlo in Graphical Models

1 code implementation 22 Feb 2016 Brooks Paige, Frank Wood

We introduce a new approach for amortizing inference in directed graphical models by learning heuristic approximations to stochastic inverses, designed specifically for use as proposal distributions in sequential Monte Carlo methods.

Interacting Particle Markov Chain Monte Carlo

no code implementations 16 Feb 2016 Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood

We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.

Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints

no code implementations 19 Jan 2016 Sam Staton, Hongseok Yang, Chris Heunen, Ohad Kammar, Frank Wood

We study the semantic foundation of expressive probabilistic programming languages that support higher-order functions, continuous distributions, and soft constraints (such as Anglican, Church, and Venture).

Probabilistic Programming

Data-driven Sequential Monte Carlo in Probabilistic Programming

no code implementations 14 Dec 2015 Yura N. Perov, Tuan Anh Le, Frank Wood

Most Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) algorithms in existing probabilistic programming systems suboptimally use only model priors as proposal distributions.

Probabilistic Programming

Canonical Correlation Forests

3 code implementations 20 Jul 2015 Tom Rainforth, Frank Wood

We introduce canonical correlation forests (CCFs), a new decision tree ensemble method for classification and regression.

General Classification

Black-Box Policy Search with Probabilistic Programs

no code implementations 16 Jul 2015 Jan-Willem van de Meent, Brooks Paige, David Tolpin, Frank Wood

In this work, we explore how probabilistic programs can be used to represent policies in sequential decision problems.

A New Approach to Probabilistic Programming Inference

no code implementations 3 Jul 2015 Frank Wood, Jan Willem van de Meent, Vikash Mansinghka

We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo.

Probabilistic Programming

Maximum a Posteriori Estimation by Search in Probabilistic Programs

no code implementations 26 Apr 2015 David Tolpin, Frank Wood

We introduce an approximate search algorithm for fast maximum a posteriori probability estimation in probabilistic programs, which we call Bayesian ascent Monte Carlo (BaMC).

Path Finding under Uncertainty through Probabilistic Inference

no code implementations 25 Feb 2015 David Tolpin, Brooks Paige, Jan Willem van de Meent, Frank Wood

We introduce a new approach to solving path-finding problems under uncertainty by representing them as probabilistic models and applying domain-independent inference algorithms to the models.

Particle Gibbs with Ancestor Sampling for Probabilistic Programs

no code implementations 27 Jan 2015 Jan-Willem van de Meent, Hongseok Yang, Vikash Mansinghka, Frank Wood

Particle Markov chain Monte Carlo techniques rank among current state-of-the-art methods for probabilistic program inference.

Probabilistic Programming

Output-Sensitive Adaptive Metropolis-Hastings for Probabilistic Programs

no code implementations 22 Jan 2015 David Tolpin, Jan Willem van de Meent, Brooks Paige, Frank Wood

We introduce an adaptive output-sensitive Metropolis-Hastings algorithm for probabilistic models expressed as programs, Adaptive Lightweight Metropolis-Hastings (AdLMH).

Asynchronous Anytime Sequential Monte Carlo

no code implementations NeurIPS 2014 Brooks Paige, Frank Wood, Arnaud Doucet, Yee Whye Teh

We introduce a new sequential Monte Carlo algorithm we call the particle cascade.

Infinite Structured Hidden Semi-Markov Models

no code implementations 30 Jun 2014 Jonathan H. Huggins, Frank Wood

This paper reviews recent advances in Bayesian nonparametric techniques for constructing and performing inference in infinite hidden Markov models.

A Compilation Target for Probabilistic Programming Languages

no code implementations 3 Mar 2014 Brooks Paige, Frank Wood

Forward inference techniques such as sequential Monte Carlo and particle Markov chain Monte Carlo for probabilistic programming can be implemented in any programming language by creative use of standardized operating system functionality including processes, forking, mutexes, and shared memory.

Probabilistic Programming

Tempering by Subsampling

no code implementations 28 Jan 2014 Jan-Willem van de Meent, Brooks Paige, Frank Wood

In this paper we demonstrate that tempering Markov chain Monte Carlo samplers for Bayesian models by recursively subsampling observations without replacement can improve the performance of baseline samplers in terms of effective sample size per computation.
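As a hedged sketch of the construction this abstract describes, one can build a tempering ladder by recursively subsampling the observations without replacement: a posterior conditioned on fewer observations is flatter ("hotter") and easier to sample. The model and subsampling schedule below are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative model: unknown mean, unit-variance Gaussian likelihood.
data = rng.normal(2.0, 1.0, size=1024)

def log_prior(theta):
    return -0.5 * theta**2

def log_lik(theta, x):
    return -0.5 * np.sum((x - theta)**2)

# Recursively subsample without replacement: each level halves the data,
# yielding a ladder of increasingly flat posteriors analogous to tempering.
levels = [data]
while len(levels[-1]) > 32:
    levels.append(rng.choice(levels[-1], size=len(levels[-1]) // 2, replace=False))

def log_post(theta, k):
    # level k = 0 is the full-data ("cold") posterior; larger k is hotter
    return log_prior(theta) + log_lik(theta, levels[k])
```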

Hierarchically-coupled hidden Markov models for learning kinetic rates from single-molecule data

no code implementations 15 May 2013 Jan-Willem van de Meent, Jonathan E. Bronson, Frank Wood, Ruben L. Gonzalez Jr., Chris H. Wiggins

We address the problem of analyzing sets of noisy time-varying signals that all report on the same process but confound straightforward analyses due to complex inter-signal heterogeneities and measurement artifacts.

Time Series

Inferring Team Strengths Using a Discrete Markov Random Field

no code implementations 9 May 2013 John Zech, Frank Wood

We propose an original model for inferring team strengths using a Markov Random Field, which can be used to generate historical estimates of the offensive and defensive strengths of a team over time.

Hierarchically Supervised Latent Dirichlet Allocation

no code implementations NeurIPS 2011 Adler J. Perotte, Frank Wood, Noemie Elhadad, Nicholas Bartlett

We introduce hierarchically supervised latent Dirichlet allocation (HSLDA), a model for hierarchically and multiply labeled bag-of-word data.

Product Categorization

Probabilistic Deterministic Infinite Automata

no code implementations NeurIPS 2010 David Pfau, Nicholas Bartlett, Frank Wood

We suggest that our method for averaging over PDFAs is a novel approach to predictive distribution smoothing.

Dependent Dirichlet Process Spike Sorting

no code implementations NeurIPS 2008 Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh

In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle "appearance" and "disappearance" of neurons.

Spike Sorting

Characterizing neural dependencies with copula models

no code implementations NeurIPS 2008 Pietro Berkes, Frank Wood, Jonathan W. Pillow

The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses.
