Search Results for author: Brooks Paige

Found 44 papers, 20 papers with code

Right Now, Wrong Then: Non-Stationary Direct Preference Optimization under Preference Drift

no code implementations · 26 Jul 2024 · Seongho Son, William Bankes, Sayak Ray Chowdhury, Brooks Paige, Ilija Bogunovic

We theoretically analyse the convergence of NS-DPO in the offline setting, providing upper bounds on the estimation error caused by non-stationary preferences.
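
The non-stationary setting can be illustrated by down-weighting stale preference pairs in the standard DPO objective. A minimal PyTorch sketch, assuming an exponential discount factor gamma; the function name ns_dpo_loss and the toy inputs are illustrative assumptions, not the paper's exact estimator:

```python
import torch
import torch.nn.functional as F

def ns_dpo_loss(logp_chosen, logp_rejected, ref_logp_chosen, ref_logp_rejected,
                t, t_now, beta=0.1, gamma=0.95):
    """Recency-weighted DPO loss (illustrative sketch, not the paper's estimator).

    logp_*     : log-probs of chosen/rejected responses under the policy
    ref_logp_* : the same quantities under the frozen reference policy
    t          : timestep at which each preference pair was collected
    gamma      : exponential discount that down-weights stale preferences
    """
    margin = beta * ((logp_chosen - ref_logp_chosen)
                     - (logp_rejected - ref_logp_rejected))
    weights = gamma ** (t_now - t).float()   # older pairs count for less
    per_pair = -F.logsigmoid(margin)         # standard DPO objective per pair
    return (weights * per_pair).sum() / weights.sum()

# toy usage: four preference pairs collected at timesteps 0..3
lp_c, lp_r = torch.randn(4), torch.randn(4)
loss = ns_dpo_loss(lp_c, lp_r, lp_c.detach(), lp_r.detach(),
                   t=torch.arange(4), t_now=torch.tensor(3))
print(loss.item())
```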

AsEP: Benchmarking Deep Learning Methods for Antibody-specific Epitope Prediction

1 code implementation · 25 Jul 2024 · Chunan Liu, Lilian Denzler, Yihong Chen, Andrew Martin, Brooks Paige

To address this, we propose a novel method, WALLE, which leverages both unstructured modeling from protein language models and structural modeling from graph neural networks.

Benchmarking · Deep Learning +1

Analyzing the Generalization and Reliability of Steering Vectors

no code implementations · 17 Jul 2024 · Daniel Tan, David Chanin, Aengus Lynch, Dimitrios Kanoulas, Brooks Paige, Adria Garriga-Alonso, Robert Kirk

In this work, we rigorously investigate these properties, and show that steering vectors have substantial limitations both in- and out-of-distribution.

Language Modelling
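
Steering vectors are commonly computed as the difference of mean residual-stream activations over two contrastive prompt sets, then added back into the residual stream at inference time. A minimal NumPy sketch of that recipe, with random stand-ins for the language-model activations used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for layer-l residual-stream activations collected on two
# contrastive prompt sets (e.g. prompts with and without a target behaviour).
acts_positive = rng.normal(loc=0.5, size=(128, 64))   # (n_prompts, d_model)
acts_negative = rng.normal(loc=-0.5, size=(128, 64))

# The steering vector is the difference of mean activations.
steering_vec = acts_positive.mean(axis=0) - acts_negative.mean(axis=0)

def apply_steering(resid, alpha=1.0):
    """Add the (scaled) steering vector to the residual stream at inference."""
    return resid + alpha * steering_vec

steered = apply_steering(rng.normal(size=(10, 64)), alpha=2.0)
print(steered.shape)  # (10, 64)
```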

Can a Confident Prior Replace a Cold Posterior?

1 code implementation · 2 Mar 2024 · Martin Marek, Brooks Paige, Pavel Izmailov

First, we introduce a "DirClip" prior that is practical to sample and nearly matches the performance of a cold posterior.

Image Classification

Diffusive Gibbs Sampling

1 code implementation · 5 Feb 2024 · Wenlin Chen, Mingtian Zhang, Brooks Paige, José Miguel Hernández-Lobato, David Barber

The inadequate mixing of conventional Markov Chain Monte Carlo (MCMC) methods for multi-modal distributions presents a significant challenge in practical applications such as Bayesian inference and molecular dynamics.

Bayesian Inference · Denoising
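
The sampler alternates a diffusion step, which corrupts the current state with Gaussian noise, with a denoising step targeting the conditional of the clean state given the noisy one. A minimal one-dimensional sketch on a bimodal target, assuming a single independent Metropolis-Hastings step as the denoiser:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p(x):  # bimodal target: mixture of two well-separated Gaussians
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

sigma, x, samples = 2.0, 0.0, []
for _ in range(5000):
    x_tilde = x + sigma * rng.normal()      # diffuse: x~ ~ N(x, sigma^2)
    # denoise: independent MH targeting p(x | x~) ∝ p(x) N(x~; x, sigma^2),
    # proposing from N(x~, sigma^2) so the Gaussian factors cancel exactly
    prop = x_tilde + sigma * rng.normal()
    if np.log(rng.uniform()) < log_p(prop) - log_p(x):
        x = prop
    samples.append(x)

print(np.mean(np.array(samples) > 0))  # both modes are visited, ≈ 0.5
```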

Gaussian Processes on Cellular Complexes

no code implementations · 2 Nov 2023 · Mathieu Alain, So Takao, Brooks Paige, Marc Peter Deisenroth

In this paper, we go beyond this dyadic setting and consider polyadic relations that include interactions between vertices, edges and one of their generalisations, known as cells.

Gaussian Processes

Moment Matching Denoising Gibbs Sampling

1 code implementation · NeurIPS 2023 · Mingtian Zhang, Alex Hawkins-Hooker, Brooks Paige, David Barber

Energy-Based Models (EBMs) offer a versatile framework for modeling complex data distributions.

Denoising

Towards Healing the Blindness of Score Matching

no code implementations · 15 Sep 2022 · Mingtian Zhang, Oscar Key, Peter Hayes, David Barber, Brooks Paige, François-Xavier Briol

Score-based divergences have been widely used in machine learning and statistics applications.

Density Estimation
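
One way to see the blindness the title refers to: the score of a mixture of well-separated modes is nearly independent of the mixture weights, so score-based divergences barely penalize getting the relative mass of the modes wrong. A small numerical illustration with a two-component Gaussian mixture:

```python
import numpy as np

def mixture_score(x, w, mu1=-10.0, mu2=10.0):
    """Score d/dx log p(x) of w * N(mu1, 1) + (1 - w) * N(mu2, 1)."""
    l1 = np.log(w) - 0.5 * (x - mu1) ** 2
    l2 = np.log(1 - w) - 0.5 * (x - mu2) ** 2
    r = 1.0 / (1.0 + np.exp(np.clip(l2 - l1, -50, 50)))  # resp. of mode 1
    return r * (mu1 - x) + (1 - r) * (mu2 - x)

xs = np.concatenate([np.linspace(-13, -7, 100), np.linspace(7, 13, 100)])
# Moving the mixing weight from 0.5 to 0.01 barely changes the score
gap = np.max(np.abs(mixture_score(xs, 0.5) - mixture_score(xs, 0.01)))
print(f"max score difference near the modes: {gap:.2e}")  # ~ 0
```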

Improving VAE-based Representation Learning

no code implementations · 28 May 2022 · Mingtian Zhang, Tim Z. Xiao, Brooks Paige, David Barber

Latent variable models like the Variational Auto-Encoder (VAE) are commonly used to learn representations of images.

Decoder · Representation Learning

Fast and Scalable Spike and Slab Variable Selection in High-Dimensional Gaussian Processes

1 code implementation · 8 Nov 2021 · Hugh Dance, Brooks Paige

Variable selection in Gaussian processes (GPs) is typically undertaken by thresholding the inverse lengthscales of automatic relevance determination kernels, but in high-dimensional datasets this approach can be unreliable.

Gaussian Processes · Variable Selection +1
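
For contrast, the thresholding baseline the abstract describes looks roughly like this in scikit-learn: fit a GP with an automatic relevance determination kernel (one lengthscale per dimension) and keep the dimensions whose inverse lengthscales exceed an ad-hoc cut-off. The threshold 0.1 below is arbitrary, which is part of why the paper proposes a spike-and-slab alternative:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=80)  # dims 2-4 irrelevant

# ARD kernel: one lengthscale per input dimension, tuned by marginal likelihood
gp = GaussianProcessRegressor(kernel=RBF(length_scale=np.ones(5)), alpha=1e-2)
gp.fit(X, y)

inv_ls = 1.0 / gp.kernel_.length_scale   # large value => dimension matters
selected = np.where(inv_ls > 0.1)[0]     # ad-hoc threshold
print("inverse lengthscales:", np.round(inv_ls, 3))
print("selected dimensions:", selected)  # typically {0, 1}
```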

I Don't Need u: Identifiable Non-Linear ICA Without Side Information

1 code implementation · 9 Jun 2021 · Matthew Willetts, Brooks Paige

Surprisingly, we discover side information is not necessary for algorithmic stability: using standard quantitative measures of identifiability, we find deep generative models with latent clusterings are empirically identifiable to the same degree as models which rely on auxiliary labels.

Clustering · Representation Learning

Barking up the right tree: an approach to search over molecule synthesis DAGs

1 code implementation · NeurIPS 2020 · John Bradshaw, Brooks Paige, Matt J. Kusner, Marwin H. S. Segler, José Miguel Hernández-Lobato

When designing new molecules with particular properties, it is not only important what to make but crucially how to make it.

Bayesian Graph Neural Networks for Molecular Property Prediction

1 code implementation · 25 Nov 2020 · George Lamb, Brooks Paige

Graph neural networks for molecular property prediction are frequently underspecified by data and fail to generalise to new scaffolds at test time.

Molecular Property Prediction · Property Prediction +1

Making Graph Neural Networks Worth It for Low-Data Molecular Machine Learning

no code implementations · 24 Nov 2020 · Aneesh Pappu, Brooks Paige

When we find that they are not, we explore pretraining and the meta-learning method MAML (and variants FO-MAML and ANIL) for improving graph neural network performance by transfer learning from related tasks.

BIG-bench Machine Learning · Graph Neural Network +2
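
For reference, first-order MAML (FO-MAML) adapts a copy of the parameters on a task's support set, then uses the query-set gradient with respect to the adapted parameters directly as the meta-gradient, skipping second derivatives. A minimal PyTorch sketch for a linear model; the shapes, learning rates, and squared loss are illustrative assumptions:

```python
import torch

def fo_maml_step(params, tasks, inner_lr=0.01, meta_lr=0.001, inner_steps=1):
    """One first-order MAML meta-update for a linear model (illustrative).

    params: [weight (1, d), bias (1,)] meta-parameters, updated in place.
    tasks:  list of ((X_support, y_support), (X_query, y_query)) tuples.
    """
    def predict(ps, X):
        return X @ ps[0].t() + ps[1]

    meta_grads = [torch.zeros_like(p) for p in params]
    for (Xs, ys), (Xq, yq) in tasks:
        fast = [p.detach().clone().requires_grad_(True) for p in params]
        for _ in range(inner_steps):                     # inner-loop adaptation
            loss = ((predict(fast, Xs) - ys) ** 2).mean()
            grads = torch.autograd.grad(loss, fast)
            fast = [(p - inner_lr * g).detach().requires_grad_(True)
                    for p, g in zip(fast, grads)]
        q_loss = ((predict(fast, Xq) - yq) ** 2).mean()  # query-set loss
        for mg, g in zip(meta_grads, torch.autograd.grad(q_loss, fast)):
            mg += g / len(tasks)  # FO-MAML: query grads of adapted params
    with torch.no_grad():                                # outer meta-update
        for p, mg in zip(params, meta_grads):
            p -= meta_lr * mg

# toy usage: one regression task in d = 3 dimensions
params = [torch.zeros(1, 3), torch.zeros(1)]
Xs, ys = torch.randn(10, 3), torch.randn(10, 1)
fo_maml_step(params, [((Xs, ys), (Xs, ys))])
```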

Goal-directed Generation of Discrete Structures with Conditional Generative Models

no code implementations · NeurIPS 2020 · Amina Mollaysa, Brooks Paige, Alexandros Kalousis

Unfortunately, maximum likelihood training of such models often fails, with the samples from the generative model inadequately respecting the input properties.

Program Synthesis · reinforcement-learning +2

Relating by Contrasting: A Data-efficient Framework for Multimodal Generative Models

no code implementations · ICLR 2021 · Yuge Shi, Brooks Paige, Philip H. S. Torr, N. Siddharth

Multimodal learning for generative models often refers to the learning of abstract concepts from the commonality of information in multiple modalities, such as vision and language.

Learning Bijective Feature Maps for Linear ICA

no code implementations · 18 Feb 2020 · Alexander Camuto, Matthew Willetts, Brooks Paige, Chris Holmes, Stephen Roberts

Separating high-dimensional data like images into independent latent factors, i.e. independent component analysis (ICA), remains an open research problem.

Variational Mixture-of-Experts Autoencoders for Multi-Modal Deep Generative Models

3 code implementations · NeurIPS 2019 · Yuge Shi, N. Siddharth, Brooks Paige, Philip H. S. Torr

In this work, we characterise successful learning of such models as the fulfillment of four criteria: i) implicit latent decomposition into shared and private subspaces, ii) coherent joint generation over all modalities, iii) coherent cross-generation across individual modalities, and iv) improved model learning for individual modalities through multi-modal integration.
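
The model's variational family is a mixture of experts over the unimodal encoders, q(z | x1, x2) = ½ q(z | x1) + ½ q(z | x2), so sampling reduces to picking one modality's encoder at random and reparameterizing. A minimal sketch assuming Gaussian encoder outputs as stand-ins:

```python
import torch

def moe_posterior_sample(mu1, logvar1, mu2, logvar2):
    """Sample z from q(z | x1, x2) = 0.5 q(z | x1) + 0.5 q(z | x2).

    Each (mu_i, logvar_i) is a modality encoder's Gaussian output (sketch).
    """
    pick = torch.randint(0, 2, (mu1.shape[0], 1)).bool()  # expert per item
    mu = torch.where(pick, mu1, mu2)
    std = torch.exp(0.5 * torch.where(pick, logvar1, logvar2))
    return mu + std * torch.randn_like(mu)                # reparameterized

z = moe_posterior_sample(torch.zeros(4, 8), torch.zeros(4, 8),
                         torch.ones(4, 8), torch.zeros(4, 8))
print(z.shape)  # (4, 8)
```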

Data Generation for Neural Programming by Example

1 code implementation · 6 Nov 2019 · Judith Clymo, Haik Manukian, Nathanaël Fijalkow, Adrià Gascón, Brooks Paige

A particular challenge lies in generating meaningful sets of inputs and outputs, which characterize a given program well and accurately demonstrate its behavior.

BIG-bench Machine Learning · Synthetic Data Generation

Conditional generation of molecules from disentangled representations

no code implementations · 25 Sep 2019 · Amina Mollaysa, Brooks Paige, Alexandros Kalousis

Though machine learning approaches have shown great success in estimating properties of small molecules, the inverse problem of generating molecules with desired properties remains challenging.

Style Transfer

A Model to Search for Synthesizable Molecules

1 code implementation · NeurIPS 2019 · John Bradshaw, Brooks Paige, Matt J. Kusner, Marwin H. S. Segler, José Miguel Hernández-Lobato

Deep generative models are able to suggest new organic molecules by generating strings, trees, and graphs representing their structure.

Retrosynthesis · valid

Generating Molecules via Chemical Reactions

no code implementations · ICLR Workshop DeepGenStruct 2019 · John Bradshaw, Matt J. Kusner, Brooks Paige, Marwin H. S. Segler, José Miguel Hernández-Lobato

We therefore propose a new molecule generation model, mirroring a more realistic real-world process, where reactants are selected and combined to form more complex molecules.

Retrosynthesis · valid

An Introduction to Probabilistic Programming

3 code implementations · 27 Sep 2018 · Jan-Willem van de Meent, Brooks Paige, Hongseok Yang, Frank Wood

We start with a discussion of model-based reasoning and explain why conditioning is a foundational computation central to the fields of probabilistic machine learning and artificial intelligence.

Probabilistic Programming
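
The simplest computational reading of conditioning is likelihood weighting: sample latents from the prior, weight each sample by the likelihood of the observed data, and take expectations under the normalized weights. A self-contained sketch on a conjugate Gaussian model, where the posterior mean is known analytically to be y_obs / 2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Model: mu ~ N(0, 1), then y ~ N(mu, 1); we condition on y = 2.0.
def log_likelihood(mu, y_obs):
    return -0.5 * (y_obs - mu) ** 2

y_obs = 2.0
mus = rng.normal(0.0, 1.0, size=100_000)   # draws from the prior
logw = log_likelihood(mus, y_obs)          # weight by the likelihood
w = np.exp(logw - logw.max())
posterior_mean = np.sum(w * mus) / np.sum(w)
print(posterior_mean)  # ≈ 1.0, matching the analytic posterior N(1, 1/2)
```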

Take a Look Around: Using Street View and Satellite Images to Estimate House Prices

no code implementations · 18 Jul 2018 · Stephen Law, Brooks Paige, Chris Russell

Not only are there few quantitative methods that can measure the urban environment, but the collection of such data is also costly and subjective.

Learning a Generative Model for Validity in Complex Discrete Structures

1 code implementation · ICLR 2018 · David Janz, Jos van der Westhuizen, Brooks Paige, Matt J. Kusner, José Miguel Hernández-Lobato

This validator provides insight as to how individual sequence elements influence the validity of the overall sequence, and can be used to constrain sequence based models to generate valid sequences -- and thus faithfully model discrete objects.

Reinforcement Learning · valid
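
One way such a validator constrains generation is by masking next tokens whose extended prefix it rejects, then renormalizing. A toy sketch with balanced parentheses, where a hand-written prefix oracle stands in for the paper's learned validator and a uniform distribution stands in for the sequence model:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["(", ")"]

def is_valid_prefix(seq):
    """Oracle validator: nesting depth must never go negative."""
    depth = 0
    for tok in seq:
        depth += 1 if tok == "(" else -1
        if depth < 0:
            return False
    return True

def sample_constrained(model_probs, length=10):
    """Sample a sequence, masking continuations the validator rejects."""
    seq = []
    for _ in range(length):
        probs = np.array([model_probs[t] for t in VOCAB])
        mask = np.array([is_valid_prefix(seq + [t]) for t in VOCAB])
        probs = probs * mask
        seq.append(rng.choice(VOCAB, p=probs / probs.sum()))
    return "".join(seq)

print(sample_constrained({"(": 0.5, ")": 0.5}))  # depth never goes negative
```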

Learning Disentangled Representations with Semi-Supervised Deep Generative Models

1 code implementation · NeurIPS 2017 · N. Siddharth, Brooks Paige, Jan-Willem van de Meent, Alban Desmaison, Noah D. Goodman, Pushmeet Kohli, Frank Wood, Philip H. S. Torr

We propose to learn such representations using model architectures that generalise from standard VAEs, employing a general graphical model structure in the encoder and decoder.

Decoder · Representation Learning

Inducing Interpretable Representations with Variational Autoencoders

no code implementations · 22 Nov 2016 · N. Siddharth, Brooks Paige, Alban Desmaison, Jan-Willem van de Meent, Frank Wood, Noah D. Goodman, Pushmeet Kohli, Philip H. S. Torr

We develop a framework for incorporating structured graphical models in the encoders of variational autoencoders (VAEs) that allows us to induce interpretable representations through approximate variational inference.

General Classification · Variational Inference

Probabilistic structure discovery in time series data

no code implementations · 21 Nov 2016 · David Janz, Brooks Paige, Tom Rainforth, Jan-Willem van de Meent, Frank Wood

Existing methods for structure discovery in time series data construct interpretable, compositional kernels for Gaussian process regression models.

regression · Time Series +1
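
The compositional idea is that interpretable structures correspond to sums and products of base kernels, compared by marginal likelihood. A flat scikit-learn sketch that scores a few hand-picked combinations on periodic-plus-trend data; the paper automates the search over such kernel expressions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, RationalQuadratic

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 120)[:, None]
y = np.sin(2 * np.pi * t[:, 0]) + 0.1 * t[:, 0] + 0.05 * rng.normal(size=120)

candidates = {                      # candidate compositional structures
    "SE":       RBF(),
    "Per":      ExpSineSquared(),
    "SE + Per": RBF() + ExpSineSquared(),
    "SE * Per": RBF() * ExpSineSquared(),
    "RQ + Per": RationalQuadratic() + ExpSineSquared(),
}

# Score each structure by its optimized log marginal likelihood
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-3).fit(t, y)
    print(f"{name:9s} log evidence = {gp.log_marginal_likelihood_value_:8.1f}")
```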

Inference Networks for Sequential Monte Carlo in Graphical Models

1 code implementation · 22 Feb 2016 · Brooks Paige, Frank Wood

We introduce a new approach for amortizing inference in directed graphical models by learning heuristic approximations to stochastic inverses, designed specifically for use as proposal distributions in sequential Monte Carlo methods.
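
The stochastic-inverse idea can be sketched by drawing (latent, observation) pairs from the model's joint distribution and training a network to predict the latent from the observation, then using the network as an SMC proposal. A minimal PyTorch example on a conjugate Gaussian model whose true posterior is N(0.8 y, 0.2):

```python
import torch
import torch.nn as nn

# Model: z ~ N(0, 1), y ~ N(z, 0.5^2). Learn an amortized inverse q(z | y).
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    z = torch.randn(256, 1)               # sample latent from the prior
    y = z + 0.5 * torch.randn(256, 1)     # sample observation given latent
    mu, log_sigma = net(y).chunk(2, dim=1)
    # maximize log N(z; mu, sigma): fits q(z | y) to the true posterior
    loss = (log_sigma + 0.5 * ((z - mu) / log_sigma.exp()) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

mu, log_sigma = net(torch.tensor([[1.0]])).chunk(2, dim=1)
print(mu.item(), log_sigma.exp().item() ** 2)  # ≈ 0.8 and ≈ 0.2
```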

Interacting Particle Markov Chain Monte Carlo

1 code implementation · 16 Feb 2016 · Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood

We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.

Kernel Sequential Monte Carlo

1 code implementation · 11 Oct 2015 · Ingmar Schuster, Heiko Strathmann, Brooks Paige, Dino Sejdinovic

As KSMC does not require access to target gradients, it is particularly applicable on targets whose gradients are unknown or prohibitively expensive.

Black-Box Policy Search with Probabilistic Programs

1 code implementation · 16 Jul 2015 · Jan-Willem van de Meent, Brooks Paige, David Tolpin, Frank Wood

In this work, we explore how probabilistic programs can be used to represent policies in sequential decision problems.

Path Finding under Uncertainty through Probabilistic Inference

no code implementations · 25 Feb 2015 · David Tolpin, Brooks Paige, Jan-Willem van de Meent, Frank Wood

We introduce a new approach to solving path-finding problems under uncertainty by representing them as probabilistic models and applying domain-independent inference algorithms to the models.

Output-Sensitive Adaptive Metropolis-Hastings for Probabilistic Programs

1 code implementation · 22 Jan 2015 · David Tolpin, Jan-Willem van de Meent, Brooks Paige, Frank Wood

We introduce an adaptive output-sensitive Metropolis-Hastings algorithm for probabilistic models expressed as programs, Adaptive Lightweight Metropolis-Hastings (AdLMH).
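
AdLMH's output-sensitive adaptation is specific to probabilistic programs, so the sketch below shows only the generic ingredient: adapting a Metropolis-Hastings proposal on the fly, here with a standard Robbins-Monro update of the step size toward a target acceptance rate:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):                    # toy target: standard Gaussian
    return -0.5 * x ** 2

x, step, target_accept, samples = 0.0, 5.0, 0.44, []
for i in range(1, 20_001):
    prop = x + step * rng.normal()
    accept = np.log(rng.uniform()) < log_p(prop) - log_p(x)
    if accept:
        x = prop
    # Robbins-Monro: nudge the proposal scale toward the target acceptance
    # rate, with an adaptation rate that decays over iterations.
    step *= np.exp((float(accept) - target_accept) / np.sqrt(i))
    samples.append(x)

print(f"adapted step size: {step:.2f}, sample sd: {np.std(samples):.2f}")
```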

Asynchronous Anytime Sequential Monte Carlo

no code implementations · NeurIPS 2014 · Brooks Paige, Frank Wood, Arnaud Doucet, Yee Whye Teh

We introduce a new sequential Monte Carlo algorithm we call the particle cascade.

A Compilation Target for Probabilistic Programming Languages

no code implementations · 3 Mar 2014 · Brooks Paige, Frank Wood

Forward inference techniques such as sequential Monte Carlo and particle Markov chain Monte Carlo for probabilistic programming can be implemented in any programming language by creative use of standardized operating system functionality including processes, forking, mutexes, and shared memory.

Probabilistic Programming
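
The flavor of that implementation strategy, worker processes advancing particles independently, writing weights into shared memory under a mutex, and the parent performing resampling, can be sketched with Python's multiprocessing module as a portable stand-in for the raw OS primitives the abstract names:

```python
import multiprocessing as mp
import numpy as np

def simulate(i, weights, lock):
    """Each OS process advances one particle and records its weight."""
    rng = np.random.default_rng(i)
    x = rng.normal()                 # the particle's forward simulation
    logw = -0.5 * (2.0 - x) ** 2     # weight against an observation y = 2
    with lock:                       # mutex-guarded shared-memory write
        weights[i] = np.exp(logw)

if __name__ == "__main__":
    n = 8
    weights = mp.Array("d", n)       # shared-memory weight buffer
    lock = mp.Lock()
    procs = [mp.Process(target=simulate, args=(i, weights, lock))
             for i in range(n)]
    for p in procs: p.start()
    for p in procs: p.join()
    w = np.array(weights[:])
    # resampling step performed by the parent process
    ancestors = np.random.default_rng(0).choice(n, size=n, p=w / w.sum())
    print("normalized weights:", np.round(w / w.sum(), 3))
    print("resampled ancestors:", ancestors)
```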

Tempering by Subsampling

no code implementations · 28 Jan 2014 · Jan-Willem van de Meent, Brooks Paige, Frank Wood

In this paper we demonstrate that tempering Markov chain Monte Carlo samplers for Bayesian models by recursively subsampling observations without replacement can improve the performance of baseline samplers in terms of effective sample size per computation.
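
The key observation is that a posterior conditioned on fewer observations is flatter, so recursively halved subsamples play the role of a temperature ladder. A small sketch showing how the posterior of a Gaussian mean widens as observations are removed; the swap moves between ladder levels are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, size=400)   # observations with unknown mean

# Build a "temperature ladder" by recursively subsampling without replacement
ladder = [data]
while len(ladder[-1]) > 50:
    half = rng.choice(ladder[-1], size=len(ladder[-1]) // 2, replace=False)
    ladder.append(half)

# With a flat prior and unit noise variance, the posterior over the mean is
# N(sample mean, 1/n): fewer observations give a wider, easier target.
for level, obs in enumerate(ladder):
    print(f"level {level}: n = {len(obs):3d}, "
          f"posterior sd = {1 / np.sqrt(len(obs)):.3f}")
```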
