Search Results for author: Wessel P. Bruinsma

Found 13 papers, 9 papers with code

Autoregressive Conditional Neural Processes

1 code implementation • 25 Mar 2023 Wessel P. Bruinsma, Stratis Markou, James Requeima, Andrew Y. K. Foong, Tom R. Andersson, Anna Vaughan, Anthony Buonomo, J. Scott Hosking, Richard E. Turner

Our work provides an example of how ideas from neural distribution estimation can benefit neural processes, and motivates research into the AR deployment of other neural process models.

Meta-Learning
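The autoregressive (AR) deployment described above can be sketched numerically: predict at one target location, sample a value, feed that sample back into the context set, and move to the next target. The sketch below is a minimal illustration of the sampling loop only; `toy_cnp` is a hypothetical stand-in (a kernel smoother with a crude uncertainty), not the paper's trained neural network.

```python
import numpy as np

def toy_cnp(ctx_x, ctx_y, xt, lengthscale=0.5):
    """Hypothetical stand-in for a trained CNP: a kernel smoother that
    returns a predictive mean and a crude std at a single target xt."""
    w = np.exp(-0.5 * ((xt - ctx_x) / lengthscale) ** 2)
    w_sum = w.sum()
    mean = (w * ctx_y).sum() / (w_sum + 1e-12)
    std = 1.0 / (1.0 + w_sum)  # uncertainty shrinks near context points
    return mean, std

def ar_sample(ctx_x, ctx_y, target_x, rng):
    """AR deployment: sample targets one at a time, feeding each sample
    back into the context before predicting the next target."""
    ctx_x, ctx_y = list(ctx_x), list(ctx_y)
    samples = []
    for xt in target_x:
        mean, std = toy_cnp(np.array(ctx_x), np.array(ctx_y), xt)
        y = rng.normal(mean, std)
        samples.append(y)
        ctx_x.append(xt)  # the sample becomes context for later targets
        ctx_y.append(y)
    return np.array(samples)

rng = np.random.default_rng(0)
s = ar_sample([0.0, 1.0], [0.0, 1.0], np.linspace(0, 1, 5), rng)
print(s.shape)  # (5,)
```

Because each sample conditions on the previous ones, the joint sample over all targets exhibits dependencies even though each individual prediction is a factorised Gaussian.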

Sparse Gaussian Process Hyperparameters: Optimize or Integrate?

no code implementations • 4 Nov 2022 Vidhi Lalchand, Wessel P. Bruinsma, David R. Burt, Carl E. Rasmussen

In this work we propose an algorithm for sparse Gaussian process regression which leverages MCMC to sample from the hyperparameter posterior within the variational inducing point framework of Titsias (2009).

Model Selection
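The "integrate" alternative to optimising hyperparameters can be illustrated with a simple sketch: run MCMC over a GP hyperparameter using the model's marginal likelihood as the (unnormalised, flat-prior) posterior. This simplification uses an exact GP and random-walk Metropolis; the paper instead samples within the sparse variational inducing-point bound of Titsias (2009).

```python
import numpy as np

def gp_log_marginal(x, y, log_ls, noise=0.1):
    """Exact GP log marginal likelihood with an RBF kernel.
    (Simplified stand-in for the sparse variational objective.)"""
    ls = np.exp(log_ls)
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / ls) ** 2) + noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

def metropolis(x, y, n_steps=500, step=0.3, rng=None):
    """Random-walk Metropolis over the log-lengthscale (flat prior)."""
    if rng is None:
        rng = np.random.default_rng(0)
    cur, cur_lp = 0.0, gp_log_marginal(x, y, 0.0)
    chain = []
    for _ in range(n_steps):
        prop = cur + step * rng.normal()
        prop_lp = gp_log_marginal(x, y, prop)
        if np.log(rng.uniform()) < prop_lp - cur_lp:
            cur, cur_lp = prop, prop_lp
        chain.append(cur)
    return np.array(chain)

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 30)
y = np.sin(x) + 0.1 * rng.normal(size=30)
chain = metropolis(x, y, rng=rng)
print(np.exp(chain[-100:].mean()))  # posterior lengthscale estimate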

A Note on the Chernoff Bound for Random Variables in the Unit Interval

no code implementations • 15 May 2022 Andrew Y. K. Foong, Wessel P. Bruinsma, David R. Burt

The Chernoff bound is a well-known tool for obtaining a high probability bound on the expectation of a Bernoulli random variable in terms of its sample average.

Learning Theory
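As a concrete instance of the tool discussed above, the KL form of the Chernoff bound states that for the sample mean of n Bernoulli(p) draws, P(mean ≥ q) ≤ exp(−n · KL(q‖p)) for q ≥ p; inverting such a tail bound in the sample average yields the high-probability bound on the expectation mentioned in the abstract. The snippet below compares the bound against the exact binomial tail at an arbitrary illustrative choice of n, p, q.

```python
import math

def kl_bernoulli(q, p):
    """KL divergence KL(Ber(q) || Ber(p))."""
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def chernoff_upper_tail(n, p, q):
    """Chernoff bound: P(sample mean of n Ber(p) draws >= q) <= exp(-n KL(q||p))."""
    return math.exp(-n * kl_bernoulli(q, p))

def exact_upper_tail(n, p, q):
    """Exact binomial tail P(X >= ceil(n q)) for X ~ Bin(n, p)."""
    k0 = math.ceil(n * q)
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(k0, n + 1))

n, p, q = 100, 0.5, 0.6
print(chernoff_upper_tail(n, p, q))  # ~0.134, an upper bound on ...
print(exact_upper_tail(n, p, q))     # ... the exact tail probability
```

The bound is valid (never below the exact tail) but not tight, which is exactly the sort of gap the note studies for random variables in the unit interval.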

Practical Conditional Neural Processes Via Tractable Dependent Predictions

no code implementations • 16 Mar 2022 Stratis Markou, James Requeima, Wessel P. Bruinsma, Anna Vaughan, Richard E. Turner

Existing approaches which model output dependencies, such as Neural Processes (NPs; Garnelo et al., 2018b) or the FullConvGNP (Bruinsma et al., 2021), are either complicated to train or prohibitively expensive.

Decision Making Meta-Learning

Modelling Non-Smooth Signals with Complex Spectral Structure

1 code implementation • 14 Mar 2022 Wessel P. Bruinsma, Martin Tegnér, Richard E. Turner

The Gaussian Process Convolution Model (GPCM; Tobar et al., 2015a) is a model for signals with complex spectral structure.

Variational Inference
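The GPCM's generative model can be sketched on a grid: a signal is white noise convolved with a random filter drawn from a GP, with a decaying window so the filter is integrable. This discretised sketch only illustrates the prior; the filter lengthscale and window below are arbitrary choices, not the paper's parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised GPCM prior sample: signal = white noise * random GP filter.
tau = np.linspace(-2, 2, 81)
d = tau[:, None] - tau[None, :]
K = np.exp(-0.5 * (d / 0.3) ** 2)              # GP prior over the filter
window = np.exp(-tau**2)                       # decay for integrability
h = window * rng.multivariate_normal(
    np.zeros(len(tau)), K + 1e-8 * np.eye(len(tau)))

x = rng.normal(size=400)                       # white-noise excitation
f = np.convolve(x, h, mode="same")             # the modelled signal
print(f.shape)  # (400,)
```

Because the filter itself is random, the induced covariance of f is random too, which is what lets the model express uncertainty over the signal's spectral structure.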

Wide Mean-Field Bayesian Neural Networks Ignore the Data

1 code implementation • 23 Feb 2022 Beau Coker, Wessel P. Bruinsma, David R. Burt, Weiwei Pan, Finale Doshi-Velez

Finally, we show that the optimal approximate posterior need not tend to the prior if the activation function is not odd, showing that our statements cannot be generalized arbitrarily.

Variational Inference

How Tight Can PAC-Bayes be in the Small Data Regime?

1 code implementation • NeurIPS 2021 Andrew Y. K. Foong, Wessel P. Bruinsma, David R. Burt, Richard E. Turner

Interestingly, this lower bound recovers the Chernoff test set bound if the posterior is equal to the prior.

The Gaussian Neural Process

1 code implementation • AABI Symposium 2021 Wessel P. Bruinsma, James Requeima, Andrew Y. K. Foong, Jonathan Gordon, Richard E. Turner

Neural Processes (NPs; Garnelo et al., 2018a, b) are a rich class of models for meta-learning that map data sets directly to predictive stochastic processes.

Meta-Learning Translation

Scalable Exact Inference in Multi-Output Gaussian Processes

1 code implementation • ICML 2020 Wessel P. Bruinsma, Eric Perim, Will Tebbutt, J. Scott Hosking, Arno Solin, Richard E. Turner

Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while capturing structure across outputs, which is desirable, for example, in spatio-temporal modelling.

Gaussian Processes
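One common way MOGPs capture structure across outputs, and a simplified relative of the mixing-model construction this paper builds on, is to express p outputs as a fixed linear combination of m independent latent GPs, y(x) = H f(x). The sketch below draws correlated outputs from such a prior; the mixing matrix and lengthscales are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(x, ls):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# Instantaneous linear mixing: 3 outputs from 2 independent latent GPs.
x = np.linspace(0, 3, 50)
m, p = 2, 3
H = rng.normal(size=(p, m))                      # mixing matrix
latents = np.stack([
    rng.multivariate_normal(np.zeros(len(x)),
                            rbf(x, ls) + 1e-8 * np.eye(len(x)))
    for ls in (0.3, 1.0)
])                                               # (m, n) latent draws
y = H @ latents                                  # (p, n) correlated outputs
print(y.shape)  # (3, 50)
```

When m < p, inference can be reduced to m independent single-output GP problems under suitable projections, which is the kind of structure that makes exact inference scalable.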

Convolutional Conditional Neural Processes

3 code implementations • ICLR 2020 Jonathan Gordon, Wessel P. Bruinsma, Andrew Y. K. Foong, James Requeima, Yann Dubois, Richard E. Turner

We introduce the Convolutional Conditional Neural Process (ConvCNP), a new member of the Neural Process family that models translation equivariance in the data.

Inductive Bias Time Series +3
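The translation equivariance the ConvCNP builds in can be checked numerically with a minimal sketch of its set-convolution encoder, which embeds a context set as functions on a grid (a density channel and a data channel): shifting the context inputs shifts the embedding by the same amount. The encoder below is a bare-bones illustration, not the full model.

```python
import numpy as np

def set_conv(ctx_x, ctx_y, grid, ls=0.2):
    """Minimal SetConv-style encoder: embed a context set on a grid as a
    density channel (sum of kernels) and a data channel (kernel-weighted y)."""
    w = np.exp(-0.5 * ((grid[:, None] - ctx_x[None, :]) / ls) ** 2)
    density = w.sum(axis=1)
    signal = w @ ctx_y
    return density, signal

grid = np.linspace(-5, 5, 1001)              # grid step 0.01
ctx_x = np.array([-1.0, 0.5])
ctx_y = np.array([2.0, -1.0])
d0, s0 = set_conv(ctx_x, ctx_y, grid)
d1, s1 = set_conv(ctx_x + 1.0, ctx_y, grid)  # shift context inputs by 1.0

# Translation equivariance: the shifted embedding equals the original
# embedding shifted by 100 grid points (1.0 / 0.01).
print(np.allclose(d1[100:], d0[:-100]), np.allclose(s1[100:], s0[:-100]))
```

Applying an ordinary CNN to these channels preserves the equivariance, which is the inductive bias that lets the model generalise across shifted inputs.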
