GPflux: A Library for Deep Gaussian Processes
1 code implementation • 12 Apr 2021 • Vincent Dutordoir, Hugh Salimbeni, Eric Hambro, John McLeod, Felix Leibfried, Artem Artemev, Mark van der Wilk, James Hensman, Marc P. Deisenroth, ST John
GPflux is compatible with, and built on top of, the Keras deep learning ecosystem.
Stochastic Differential Equations with Variational Wishart Diffusions
1 code implementation • ICML 2020 • Martin Jørgensen, Marc Peter Deisenroth, Hugh Salimbeni
We present a Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling.
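The paper infers the drift and diffusion of stochastic differential equations nonparametrically. As background for readers unfamiliar with SDE simulation (this is standard Euler–Maruyama discretisation, not the paper's inference method), a minimal sketch of simulating a mean-reverting Ornstein–Uhlenbeck process:

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, dt, n_steps, rng):
    """Simulate one path of dx = drift(x) dt + diffusion(x) dW."""
    path = np.empty(n_steps + 1)
    path[0] = x0
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        path[t + 1] = path[t] + drift(path[t]) * dt + diffusion(path[t]) * dw
    return path

# Ornstein-Uhlenbeck process: mean-reverts to mu at rate theta.
theta, mu, sigma = 2.0, 1.0, 0.3
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x: theta * (mu - x), lambda x: sigma,
                      x0=5.0, dt=0.01, n_steps=2000, rng=rng)
```

After the initial transient decays, the path fluctuates around the long-run mean `mu`; in the continuous-time modelling setting above, the drift and diffusion functions would be learned from data rather than fixed.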
Deep Gaussian Processes with Importance-Weighted Variational Inference
1 code implementation • 14 May 2019 • Hugh Salimbeni, Vincent Dutordoir, James Hensman, Marc Peter Deisenroth
Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings.
Gaussian Process Conditional Density Estimation
no code implementations • NeurIPS 2018 • Vincent Dutordoir, Hugh Salimbeni, Marc Deisenroth, James Hensman
Conditional Density Estimation (CDE) models estimate the full distribution of a response conditioned on inputs, rather than a point prediction.
Orthogonally Decoupled Variational Gaussian Processes
1 code implementation • NeurIPS 2018 • Hugh Salimbeni, Ching-An Cheng, Byron Boots, Marc Deisenroth
The method adopts an orthogonal basis in the mean function to model the residuals that cannot be captured by the standard coupled approach.
Natural Gradients in Practice: Non-Conjugate Variational Inference in Gaussian Process Models
no code implementations • 24 Mar 2018 • Hugh Salimbeni, Stefanos Eleftheriadis, James Hensman
The natural gradient method has been used effectively in conjugate Gaussian process models, but the non-conjugate case has been largely unexplored.
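In the conjugate Gaussian case the natural-gradient update has a well-known closed form: the natural gradient equals the ordinary gradient taken with respect to the expectation parameters, applied as a step in the natural parameters. As a minimal illustration of why this works so well in the conjugate setting (this is a textbook univariate demo, not the paper's non-conjugate algorithm), a single unit-length step minimising KL(q‖p) between two Gaussians recovers p exactly:

```python
import numpy as np

def natgrad_step(m, s, mu, sigma2, lr=1.0):
    """One natural-gradient step on KL(q || p), q = N(m, s), p = N(mu, sigma2).

    Natural parameters:  theta1 = m / s,  theta2 = -1 / (2 s)
    Expectation params:  eta1 = m,        eta2 = s + m**2
    """
    # Ordinary gradients of the KL w.r.t. the expectation parameters.
    d_eta1 = m / s - mu / sigma2
    d_eta2 = 0.5 * (1.0 / sigma2 - 1.0 / s)
    # Step in natural-parameter space.
    theta1 = m / s - lr * d_eta1
    theta2 = -0.5 / s - lr * d_eta2
    # Map back to (mean, variance).
    s_new = -0.5 / theta2
    m_new = theta1 * s_new
    return m_new, s_new

# Starting far from the target N(2, 0.5), one step with lr=1 lands exactly on it.
m, s = natgrad_step(0.0, 10.0, mu=2.0, sigma2=0.5)
```

The paper's contribution is showing that the same update remains practical and effective when the likelihood is non-conjugate, where no such one-step result holds.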
Doubly Stochastic Variational Inference for Deep Gaussian Processes
8 code implementations • NeurIPS 2017 • Hugh Salimbeni, Marc Deisenroth
Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice.
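The key idea is to propagate sampled (rather than independently approximated) function values through the layers, so that correlations survive the composition. As a toy sketch of sampling through stacked GP priors (the paper propagates samples through variational posteriors; the RBF kernel and lengthscale here are illustrative choices), in NumPy:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two 1-D input vectors."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sample_layer(x, rng, jitter=1e-6):
    """Draw one jointly correlated sample of a GP prior evaluated at inputs x."""
    k = rbf(x, x) + jitter * np.eye(len(x))  # jitter for numerical stability
    return np.linalg.cholesky(k) @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)
f1 = sample_layer(x, rng)    # layer-1 output: a correlated sample, not a mean
f2 = sample_layer(f1, rng)   # layer 2 is conditioned on the *sampled* layer-1 values
```

Feeding the sample `f1` (rather than a factorised point estimate) into the next layer is what makes the scheme "doubly stochastic": one source of randomness from minibatching, one from sampling the layer outputs.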
Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders
3 code implementations • 8 Nov 2016 • Nat Dilokthanakul, Pedro A. M. Mediano, Marta Garnelo, Matthew C. H. Lee, Hugh Salimbeni, Kai Arulkumaran, Murray Shanahan
We study a variant of the variational autoencoder model (VAE) with a Gaussian mixture as a prior distribution, with the goal of performing unsupervised clustering through deep generative models.
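Replacing the standard isotropic Gaussian prior with a mixture gives the latent space a discrete cluster structure. As a sketch of just the generative prior (the component means and scales below are made-up illustration values; the full model also has an encoder and decoder), ancestral sampling looks like:

```python
import numpy as np

def sample_gmm_prior(n, means, scales, rng):
    """Ancestral sampling from a Gaussian-mixture prior over latents z.

    Each sample first picks a mixture component (the 'cluster' label),
    then draws z from that component's diagonal Gaussian.
    """
    k = rng.integers(0, len(means), size=n)  # uniform component assignments
    z = means[k] + scales[k] * rng.standard_normal((n, means.shape[1]))
    return k, z

rng = np.random.default_rng(0)
means = np.array([[-3.0, 0.0], [3.0, 0.0]])   # two well-separated components
scales = np.array([[0.5, 0.5], [0.5, 0.5]])
k, z = sample_gmm_prior(1000, means, scales, rng)
```

In the full model a decoder network maps `z` to data space, and inference recovers the component assignment `k`, which is what yields unsupervised clustering.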