Search Results for author: Hugh Salimbeni

Found 8 papers, 6 papers with code

Stochastic Differential Equations with Variational Wishart Diffusions

1 code implementation ICML 2020 Martin Jørgensen, Marc Peter Deisenroth, Hugh Salimbeni

We present a Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling.

regression
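
A minimal sketch of the kind of process involved, not the paper's inference scheme: an Euler-Maruyama simulation of an SDE whose diffusion matrix is built from a single Wishart-distributed covariance. The function name `simulate_wishart_sde`, the drift, and all dimensions here are hypothetical; the paper itself treats the diffusion as a Wishart process and infers it variationally.

```python
import numpy as np

def simulate_wishart_sde(x0, drift, T=1.0, n_steps=200, nu=4, scale=None, seed=0):
    """Euler-Maruyama simulation of dx = drift(x) dt + L dW,
    where L L^T is drawn once from a Wishart distribution (illustrative only)."""
    rng = np.random.default_rng(seed)
    d = x0.shape[0]
    scale = np.eye(d) if scale is None else scale
    dt = T / n_steps
    # Sample a Wishart-distributed covariance: Sigma = A^T A with nu Gaussian rows
    A = rng.multivariate_normal(np.zeros(d), scale, size=nu)   # (nu, d)
    Sigma = A.T @ A / nu
    L = np.linalg.cholesky(Sigma + 1e-8 * np.eye(d))
    xs, x = [x0], x0.copy()
    for _ in range(n_steps):
        dW = rng.normal(size=d) * np.sqrt(dt)
        x = x + drift(x) * dt + L @ dW
        xs.append(x.copy())
    return np.stack(xs)

# Example: 2-D Ornstein-Uhlenbeck-style drift pulling the state toward the origin
path = simulate_wishart_sde(np.ones(2), drift=lambda x: -0.5 * x)
print(path.shape)  # (201, 2)
```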

Orthogonally Decoupled Variational Gaussian Processes

1 code implementation NeurIPS 2018 Hugh Salimbeni, Ching-An Cheng, Byron Boots, Marc Deisenroth

It adopts an orthogonal basis in the mean function to model the residues that cannot be learned by the standard coupled approach.

Gaussian Processes · Variational Inference
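
A simplified NumPy sketch of the decoupling described in the snippet above, assuming an RBF kernel: the predictive covariance uses the inducing points `Z`, while the mean gets an additional, cheaper basis `beta` with weights `a` to model the residual. The function names and shapes are illustrative, and the orthogonal projection that gives the paper its name is omitted.

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def decoupled_predict(Xs, Z, beta, m_u, S_u, a, jitter=1e-6):
    """Predictive mean/variance of a decoupled variational GP (simplified sketch)."""
    Kzz = rbf(Z, Z) + jitter * np.eye(len(Z))
    Kxz = rbf(Xs, Z)
    Kxb = rbf(Xs, beta)
    A = np.linalg.solve(Kzz, Kxz.T).T                  # K_xz K_zz^{-1}
    mean = Kxb @ a + A @ m_u                           # extra basis enters the mean only
    Kxx_diag = np.full(len(Xs), 1.0)                   # unit kernel variance
    var = Kxx_diag - np.sum(A * Kxz, axis=1) + np.sum((A @ S_u) * A, axis=1)
    return mean, var

Z, beta = np.linspace(-1, 1, 5)[:, None], np.linspace(-1, 1, 20)[:, None]
Xs = np.linspace(-1, 1, 50)[:, None]
mean, var = decoupled_predict(Xs, Z, beta, m_u=np.zeros(5), S_u=0.1 * np.eye(5), a=np.zeros(20))
```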

Natural Gradients in Practice: Non-Conjugate Variational Inference in Gaussian Process Models

no code implementations 24 Mar 2018 Hugh Salimbeni, Stefanos Eleftheriadis, James Hensman

The natural gradient method has been used effectively in conjugate Gaussian process models, but the non-conjugate case has been largely unexplored.

Variational Inference
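
A small worked example of the natural-gradient machinery in the conjugate Gaussian case mentioned in the snippet, using the standard identity that the natural gradient with respect to the natural parameters equals the ordinary gradient of the ELBO with respect to the expectation parameters. The closed-form gradient `eta_target - eta` holds only for this toy Gaussian target; in the non-conjugate setting the paper studies, that gradient would have to be estimated instead.

```python
import numpy as np

# Gaussian variational posterior q(theta) = N(m, v), parameterized two ways:
# natural parameters eta = (m/v, -1/(2v)) and expectation parameters mu = (m, m^2 + v).

def to_natural(m, v):
    return np.array([m / v, -0.5 / v])

def from_natural(eta):
    v = -0.5 / eta[1]
    return eta[0] * v, v

# Toy conjugate case: the target posterior is Gaussian N(m_star, v_star), so
# dELBO/dmu = eta_target - eta in closed form.
m_star, v_star = 2.0, 0.25
eta_target = to_natural(m_star, v_star)

m, v = 0.0, 1.0                      # initial variational parameters
eta = to_natural(m, v)
grad_mu = eta_target - eta           # gradient w.r.t. expectation parameters
eta = eta + 1.0 * grad_mu            # natural-gradient step with step size 1
print(from_natural(eta))             # (2.0, 0.25): optimum reached in one step
```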

Doubly Stochastic Variational Inference for Deep Gaussian Processes

8 code implementations NeurIPS 2017 Hugh Salimbeni, Marc Deisenroth

Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice.

Gaussian Processes · General Classification · +2
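
A minimal NumPy sketch of sampling through stacked sparse-GP layers, which is the flavour of posterior that doubly stochastic inference draws Monte Carlo samples from (the second source of stochasticity being minibatching). The per-layer inducing values here are random rather than learned, the kernel is a fixed RBF, and no training loop is shown; all names and sizes are illustrative.

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def layer_sample(X, Z, m_u, S_u, rng, jitter=1e-6):
    """Draw a sample from the sparse-GP marginal posterior at inputs X
    (one output dimension; simplified sketch of one DGP layer)."""
    Kzz = rbf(Z, Z) + jitter * np.eye(len(Z))
    Kxz = rbf(X, Z)
    A = np.linalg.solve(Kzz, Kxz.T).T
    mean = A @ m_u
    var = 1.0 - np.sum(A * Kxz, axis=1) + np.sum((A @ S_u) * A, axis=1)
    eps = rng.normal(size=len(X))                     # reparameterization trick
    return (mean + np.sqrt(np.maximum(var, 1e-12)) * eps)[:, None]

rng = np.random.default_rng(0)
F = rng.uniform(-1, 1, size=(8, 1))                   # a minibatch of inputs
for _ in range(3):                                    # three GP layers
    Z = rng.uniform(-1, 1, size=(5, F.shape[1]))      # per-layer inducing inputs
    m_u, S_u = rng.normal(size=5), 0.1 * np.eye(5)    # random (untrained) posteriors
    F = layer_sample(F, Z, m_u, S_u, rng)             # propagate a sample downstream
print(F.shape)                                        # (8, 1)
```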

Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders

3 code implementations 8 Nov 2016 Nat Dilokthanakul, Pedro A. M. Mediano, Marta Garnelo, Matthew C. H. Lee, Hugh Salimbeni, Kai Arulkumaran, Murray Shanahan

We study a variant of the variational autoencoder model (VAE) with a Gaussian mixture as a prior distribution, with the goal of performing unsupervised clustering through deep generative models.

Clustering · Human Pose Forecasting
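
A toy sketch of the generative side of such a model, assuming a random linear "decoder" and hypothetical dimensions: sample a cluster, sample a latent from that cluster's Gaussian, then decode. It illustrates the Gaussian-mixture prior only, not the paper's inference network or training objective.

```python
import numpy as np

def sample_gmvae_prior(n, K=10, latent_dim=2, data_dim=4, seed=0):
    """Sample from a toy generative model with a Gaussian-mixture prior
    (illustrative sketch only; the decoder is a random linear map)."""
    rng = np.random.default_rng(seed)
    mix_means = rng.normal(scale=3.0, size=(K, latent_dim))    # per-cluster prior means
    mix_scales = np.full((K, latent_dim), 0.5)                 # per-cluster prior scales
    W = rng.normal(size=(latent_dim, data_dim))                # "decoder" weights
    y = rng.integers(K, size=n)                                # cluster assignments
    z = mix_means[y] + mix_scales[y] * rng.normal(size=(n, latent_dim))
    x = z @ W + 0.1 * rng.normal(size=(n, data_dim))           # decoded observations
    return x, y, z

x, y, z = sample_gmvae_prior(100)
print(x.shape, np.bincount(y, minlength=10))
```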
