no code implementations • 30 Jan 2020 • Gonzalo Rios
Gaussian process (GP) priors are non-parametric generative models with appealing modelling properties for Bayesian inference: they can model non-linear relationships through noisy observations, have closed-form expressions for training and inference, and are governed by interpretable hyperparameters.
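The closed-form training and inference mentioned above can be illustrated with a minimal sketch of exact GP regression under Gaussian noise (standard equations, not code from the paper; the RBF kernel, lengthscale, and noise variance here are illustrative choices):

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sq_dist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
    """Closed-form GP posterior mean and covariance given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                # numerically stable solve
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                     # posterior mean at test inputs
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                     # posterior covariance
    return mean, cov

# Noisy observations of a non-linear function, recovered in closed form.
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * rng.normal(size=20)
mu, cov = gp_posterior(x, y, np.linspace(0, 5, 50))
```

The interpretable hyperparameters referred to in the abstract are quantities like the lengthscale and noise variance above, which control smoothness and observation noise directly.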
no code implementations • 23 Jun 2019 • Gonzalo Rios, Felipe Tobar
The Gaussian process (GP) is a nonparametric prior distribution over functions indexed by time, space, or some other (possibly high-dimensional) index set.
no code implementations • 28 May 2018 • Julio Backhoff-Veraguas, Joaquin Fontbona, Gonzalo Rios, Felipe Tobar
We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law: the Wasserstein population barycenter of the posterior law over models.
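As background for the barycenter idea, the univariate Gaussian case admits a simple closed form: the 2-Wasserstein barycenter of Gaussians N(mu_i, sigma_i^2) with weights w_i is again Gaussian, with mean and standard deviation given by the weighted averages. The sketch below shows only this standard special case, not the paper's population barycenter over posterior laws:

```python
import numpy as np

def w2_squared_gaussian_1d(mu1, s1, mu2, s2):
    """Squared 2-Wasserstein distance between two univariate Gaussians."""
    return (mu1 - mu2) ** 2 + (s1 - s2) ** 2

def w2_barycenter_1d(mus, sigmas, weights):
    """W2 barycenter of univariate Gaussians: Gaussian with the
    weighted-average mean and weighted-average standard deviation."""
    mus, sigmas, weights = map(np.asarray, (mus, sigmas, weights))
    weights = weights / weights.sum()
    return weights @ mus, weights @ sigmas

# Barycenter of N(0, 1) and N(2, 9) with equal weights: N(1, 4).
mu_b, s_b = w2_barycenter_1d([0.0, 2.0], [1.0, 3.0], [0.5, 0.5])
```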
no code implementations • 19 Mar 2018 • Gonzalo Rios, Felipe Tobar
Gaussian processes (GPs) are Bayesian nonparametric generative models that provide interpretability of hyperparameters, admit closed-form expressions for training and inference, and are able to accurately represent uncertainty.
no code implementations • 19 Jul 2017 • Felipe Tobar, Gonzalo Rios, Tomás Valdivia, Pablo Guerrero
The proposed model is validated on the recovery of three signals: a smooth synthetic signal, a real-world heart-rate time series, and a step function; in all three cases GPMM outperformed the standard GP in terms of estimation error, uncertainty representation, and recovery of the spectral content of the latent signal.