Search Results for author: Carl E. Rasmussen

Found 11 papers, 4 papers with code

Sparse Gaussian Process Hyperparameters: Optimize or Integrate?

no code implementations • 4 Nov 2022 • Vidhi Lalchand, Wessel P. Bruinsma, David R. Burt, Carl E. Rasmussen

In this work we propose an algorithm for sparse Gaussian process regression which leverages MCMC to sample from the hyperparameter posterior within the variational inducing point framework of Titsias (2009).

Model Selection
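
The abstract above describes scoring hyperparameter samples with the collapsed variational bound of Titsias (2009). A minimal sketch of that idea, with an illustrative RBF kernel, fixed inducing inputs, a flat prior, and a random-walk Metropolis proposal (all assumptions, not the paper's exact algorithm):

```python
# Hedged sketch: sample sparse-GP hyperparameters with MCMC instead of
# optimising them, scoring each proposal with the Titsias (2009)
# collapsed bound. Kernel, prior, and proposal are illustrative choices.
import numpy as np

def rbf(X1, X2, log_l, log_s):
    # Squared-exponential kernel with log-lengthscale and log-scale.
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(2 * log_s) * np.exp(-0.5 * d2 * np.exp(-2 * log_l))

def titsias_bound(theta, X, y, Z, noise=0.1):
    # Collapsed bound: log N(y | 0, Q + s^2 I) - tr(K - Q) / (2 s^2),
    # with Q = K_nm K_mm^{-1} K_mn. Formed densely here for clarity;
    # Woodbury identities give the usual O(N M^2) cost in practice.
    log_l, log_s = theta
    Kmm = rbf(Z, Z, log_l, log_s) + 1e-6 * np.eye(len(Z))
    Knm = rbf(X, Z, log_l, log_s)
    Q = Knm @ np.linalg.solve(Kmm, Knm.T)
    C = Q + noise * np.eye(len(X))
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    lml = -0.5 * (logdet + quad + len(X) * np.log(2 * np.pi))
    trace_term = np.trace(rbf(X, X, log_l, log_s) - Q) / (2 * noise)
    return lml - trace_term

def sample_hypers(X, y, Z, n_steps=200, step=0.1, seed=0):
    # Random-walk Metropolis over log-hyperparameters (flat prior assumed).
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)
    logp = titsias_bound(theta, X, y, Z)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(2)
        logp_prop = titsias_bound(prop, X, y, Z)
        if np.log(rng.uniform()) < logp_prop - logp:  # accept/reject
            theta, logp = prop, logp_prop
        samples.append(theta.copy())
    return np.array(samples)

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
Z = np.linspace(-3, 3, 8)[:, None]  # fixed inducing inputs (M = 8)
samples = sample_hypers(X, y, Z)
```

The retained samples approximate the hyperparameter posterior, rather than collapsing it to the single point a type-II maximum-likelihood optimiser would return.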

The Promises and Pitfalls of Deep Kernel Learning

no code implementations • 24 Feb 2021 • Sebastian W. Ober, Carl E. Rasmussen, Mark van der Wilk

Through careful experimentation on the UCI, CIFAR-10, and the UTKFace datasets, we find that the overfitting from overparameterized maximum marginal likelihood, in which the model is "somewhat Bayesian", can in certain scenarios be worse than that from not being Bayesian at all.

Gaussian Processes

Deep Structured Mixtures of Gaussian Processes

1 code implementation • 10 Oct 2019 • Martin Trapp, Robert Peharz, Franz Pernkopf, Carl E. Rasmussen

Gaussian Processes (GPs) are powerful non-parametric Bayesian regression models that allow exact posterior inference, but exhibit high computational and memory costs.

Gaussian Processes
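
The costs the abstract refers to come from factorising the N x N kernel matrix. A minimal sketch of exact GP regression (illustrative RBF kernel and data, not the paper's code), where the Cholesky step is the O(N^3)-time, O(N^2)-memory bottleneck:

```python
# Minimal sketch of exact GP posterior inference. The Cholesky
# factorisation of the N x N kernel matrix is the O(N^3) step
# mentioned in the abstract.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # k(x, x') = s^2 exp(-|x - x'|^2 / (2 l^2))
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xstar, noise=0.1):
    K = rbf(X, X) + noise * np.eye(len(X))   # N x N kernel matrix
    L = np.linalg.cholesky(K)                # O(N^3) factorisation
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xstar)
    mean = Ks.T @ alpha                      # posterior mean at Xstar
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xstar, Xstar)) - np.sum(v**2, axis=0)
    return mean, var

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
mu, var = gp_posterior(X, y, np.array([[0.0]]))
```

The deep structured mixtures in the paper keep this exactness within each local expert while dividing the data, so no single factorisation sees all N points at once.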

Rates of Convergence for Sparse Variational Gaussian Process Regression

1 code implementation • 8 Mar 2019 • David R. Burt, Carl E. Rasmussen, Mark van der Wilk

Excellent variational approximations to Gaussian process posteriors have been developed which avoid the $\mathcal{O}\left(N^3\right)$ scaling with dataset size $N$.

Continual Learning • regression
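
The scaling the abstract mentions comes from routing all heavy algebra through M inducing points, so only M x M matrices are inverted (O(N M^2) overall instead of O(N^3)). A sketch of the standard sparse posterior mean under that scheme, with an illustrative kernel and dataset:

```python
# Sketch of why sparse variational GPs avoid the O(N^3) cost: with M
# inducing points, the only matrix inverted is M x M, giving O(N M^2)
# work overall. Kernel, data, and inducing locations are illustrative.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_mean(X, y, Z, Xstar, noise=0.1):
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(len(Z))   # M x M
    Kmn = rbf(Z, X)                           # M x N
    Ksm = rbf(Xstar, Z)
    # Sigma = (Kmm + Kmn Kmn^T / noise)^{-1}: an M x M inverse only.
    Sigma = np.linalg.inv(Kmm + Kmn @ Kmn.T / noise)
    return Ksm @ Sigma @ Kmn @ y / noise      # sparse posterior mean

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(200, 1))         # N = 200 data points
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 10)[:, None]           # M = 10 inducing points
mu = sparse_mean(X, y, Z, np.array([[0.0], [1.5]]))
```

The paper's contribution is to bound how fast such an M-point approximation converges to the exact posterior as M grows.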

Learning Deep Mixtures of Gaussian Process Experts Using Sum-Product Networks

1 code implementation • 12 Sep 2018 • Martin Trapp, Robert Peharz, Carl E. Rasmussen, Franz Pernkopf

In this paper, we introduce a natural and expressive way to tackle these problems, by incorporating GPs in sum-product networks (SPNs), a recently proposed tractable probabilistic model allowing exact and efficient inference.

Gaussian Processes • regression +1
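
In an SPN, a sum node computes a weighted mixture of its children's densities. A deliberately tiny illustration of that idea applied to GP experts, with one sum node over two experts fit to halves of the input space (a crude stand-in for the paper's learned network structure, not its construction):

```python
# Illustrative sketch only: a single SPN-style "sum node" mixing the
# predictive densities of two GP experts, each fit to half the data.
# The paper stacks many such sum and product nodes into a deep network.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, xstar, noise=0.1):
    # Exact GP predictive mean and variance for one expert.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, xstar)
    mean = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(rbf(xstar, xstar) - Ks.T @ np.linalg.solve(K, Ks)) + noise
    return mean, var

def gaussian_pdf(t, mean, var):
    return np.exp(-0.5 * (t - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

# Two experts over disjoint halves of the input space.
left, right = X[:, 0] < 0, X[:, 0] >= 0
experts = [(X[left], y[left]), (X[right], y[right])]
weights = [left.mean(), right.mean()]   # sum-node mixture weights

xstar, t = np.array([[0.5]]), np.sin(0.5)
density = 0.0
for w, (Xe, ye) in zip(weights, experts):
    m, v = gp_predict(Xe, ye, xstar)
    density += w * gaussian_pdf(t, m[0], v[0])  # sum node: weighted mix
```

Because sum and product nodes keep Gaussian mixtures closed under the network's operations, the full model retains the exact, efficient inference the abstract highlights.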

Variational Gaussian Process State-Space Models

no code implementations • NeurIPS 2014 • Roger Frigola, Yutian Chen, Carl E. Rasmussen

State-space models have been successfully used for more than fifty years in different areas of science and engineering.

Gaussian Processes • Time Series +2

Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

1 code implementation • NeurIPS 2014 • Yarin Gal, Mark van der Wilk, Carl E. Rasmussen

We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST).

Dimensionality Reduction • Gaussian Processes +2

Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM

no code implementations • 17 Dec 2013 • Roger Frigola, Fredrik Lindsten, Thomas B. Schön, Carl E. Rasmussen

Gaussian process state-space models (GP-SSMs) are a very flexible family of models of nonlinear dynamical systems.

Gaussian Process Training with Input Noise

no code implementations • NeurIPS 2011 • Andrew McHutchon, Carl E. Rasmussen

This allows the input noise to be recast as output noise proportional to the squared gradient of the GP posterior mean.

regression
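
The recasting in the abstract is, to first order, the identity sigma_eff^2(x) = sigma_y^2 + f'(x)^2 * sigma_x^2: steep regions of the posterior mean amplify input noise into extra output noise. A small numeric sketch, where the posterior-mean gradient is taken from a known test function for illustration rather than from a trained GP:

```python
# Sketch of the abstract's identity: input noise of variance sigma_x^2
# acts like extra output noise proportional to the squared slope of the
# GP posterior mean. The gradient of f(x) = sin(x) stands in for the
# posterior-mean gradient here.
import numpy as np

def effective_noise(grad_mean, sigma_y2, sigma_x2):
    # Heteroscedastic effective output-noise variance per input location.
    return sigma_y2 + grad_mean**2 * sigma_x2

x = np.linspace(0, np.pi, 5)
grad = np.cos(x)  # slope of f(x) = sin(x) at each input
eff = effective_noise(grad, sigma_y2=0.01, sigma_x2=0.04)
# Steep regions (x near 0 or pi) get more effective noise than the
# flat region at x = pi/2, where the input noise barely matters.
```

This is why the resulting model is heteroscedastic even though both noise sources are homoscedastic: the slope of the mean varies over the input space.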
