Marginalised Spectral Mixture Kernels with Nested Sampling

Gaussian process (GP) models define a rich distribution over functions, with inductive biases controlled by a kernel function. Learning typically proceeds by optimising the kernel hyperparameters with the marginal likelihood as the objective (ML-II). This work analyses the benefits of marginalising kernel hyperparameters using nested sampling (NS), a technique well suited to sampling from complex, multi-modal distributions. We benchmark against Hamiltonian Monte Carlo (HMC) on time-series regression tasks and find that a principled approach to quantifying hyperparameter uncertainty substantially improves the quality of prediction intervals.
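To illustrate the contrast between ML-II and marginalisation, the sketch below averages GP predictions over a kernel hyperparameter (an RBF lengthscale) weighted by the marginal likelihood. This is a minimal illustration only: it uses a crude grid over one hyperparameter rather than nested sampling over a spectral mixture kernel, and all function names and data here are hypothetical, not from the paper.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale, variance=1.0):
    """Squared-exponential kernel; a stand-in for the paper's spectral mixture kernel."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(x, y, lengthscale, noise=0.1):
    """GP log marginal likelihood, the quantity ML-II would maximise."""
    K = rbf_kernel(x, x, lengthscale) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

def predict_mean(x, y, x_star, lengthscale, noise=0.1):
    """GP posterior predictive mean at test inputs x_star."""
    K = rbf_kernel(x, x, lengthscale) + noise ** 2 * np.eye(len(x))
    K_star = rbf_kernel(x_star, x, lengthscale)
    return K_star @ np.linalg.solve(K, y)

# Toy 1-D regression data (hypothetical, for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)
x_star = np.linspace(0, 5, 50)

# Marginalise the lengthscale: weight each setting by its marginal likelihood
# (a grid approximation standing in for NS/HMC posterior samples).
lengthscales = np.linspace(0.2, 3.0, 30)
logw = np.array([log_marginal_likelihood(x, y, ls) for ls in lengthscales])
w = np.exp(logw - logw.max())
w /= w.sum()

# Marginalised prediction: posterior-weighted average over hyperparameters,
# instead of a single ML-II point estimate.
mean_marg = sum(wi * predict_mean(x, y, x_star, ls)
                for wi, ls in zip(w, lengthscales))
```

In practice the weighted grid would be replaced by posterior samples from nested sampling or HMC, and averaging full predictive distributions (not just means) is what widens the prediction intervals to reflect hyperparameter uncertainty.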
