Gaussian Processes

568 papers with code • 1 benchmark • 5 datasets

Gaussian processes are a powerful framework for several machine learning tasks such as regression, classification, and inference. Given a finite set of input-output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite number of jointly Gaussian random variables; conditioning on these variables yields the statistics (mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
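
For concreteness, the conditioning step described above can be sketched in a few lines of NumPy; the RBF kernel, noise level, and toy data below are illustrative choices rather than part of any particular library:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel k(x, x') = s^2 * exp(-|x - x'|^2 / (2 l^2))
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Joint Gaussian over (train, test) outputs -> condition on the training data.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                      # exact O(n^3) inference
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                           # posterior mean at test inputs
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)   # posterior variance at test inputs
    return mean, var

x_train = np.linspace(-3, 3, 20)
y_train = np.sin(x_train) + 0.1 * np.random.randn(20)
mean, var = gp_posterior(x_train, y_train, np.linspace(-4, 4, 100))
```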

Most implemented papers

Scalable Bayesian Optimization Using Deep Neural Networks

automl/pybnn 19 Feb 2015

Bayesian optimization is an effective methodology for the global optimization of functions with expensive evaluations.
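
For context, a generic Bayesian-optimization loop with a GP surrogate and an expected-improvement acquisition might look roughly like the sketch below (this uses scikit-learn's GP rather than the paper's neural-network surrogate; the objective, candidate grid, and kernel are placeholders):

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                                   # placeholder "expensive" black box
    return np.sin(3 * x) + 0.1 * x ** 2

X = np.random.uniform(-2, 2, size=(3, 1))           # initial design
y = objective(X).ravel()
candidates = np.linspace(-2, 2, 500).reshape(-1, 1)

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Expected improvement over the current best (minimization).
    imp = y.min() - mu
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).item())
```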

Convolutional Gaussian Processes

markvdw/convgp NeurIPS 2017

We present a practical way of introducing convolutional structure into Gaussian processes, making them more suited to high-dimensional inputs like images.

Probabilistic Recurrent State-Space Models

andreasdoerr/PR-SSM ICML 2018

State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification.

Differentiable Compositional Kernel Learning for Gaussian Processes

hughsalimbeni/bayesian_benchmarks ICML 2018

The NKN architecture is based on the composition rules for kernels, so that each unit of the network corresponds to a valid kernel.
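
The composition rules referred to here are the closure properties of kernels: sums and products of valid kernels are again valid kernels. A small illustration with scikit-learn kernels (not the NKN itself; the particular kernels and hyperparameters are arbitrary):

```python
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, ExpSineSquared, WhiteKernel, ConstantKernel as C)

# A locally periodic component plus a smooth trend plus observation noise,
# built entirely from sums and products of valid kernels.
kernel = (C(1.0) * RBF(length_scale=10.0) * ExpSineSquared(length_scale=1.0, periodicity=1.0)
          + C(1.0) * RBF(length_scale=50.0)
          + WhiteKernel(noise_level=0.1))
gp = GaussianProcessRegressor(kernel=kernel)  # hyperparameters are fit by marginal likelihood in .fit()
```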

GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration

cornellius-gp/gpytorch NeurIPS 2018

Despite advances in scalable models, the inference tools used for Gaussian processes (GPs) have yet to fully capitalize on developments in computing hardware.
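
A minimal sketch of GPyTorch's documented exact-GP workflow (the kernel, learning rate, and toy data are illustrative; moving the model, likelihood, and tensors to the GPU with `.cuda()` is what engages the GPU acceleration in the title):

```python
import torch
import gpytorch

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

train_x = torch.linspace(0, 1, 100)
train_y = torch.sin(train_x * 6.28) + 0.1 * torch.randn(100)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

model.train()
likelihood.train()
for _ in range(50):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)   # marginal log likelihood via matrix-matrix solves
    loss.backward()
    optimizer.step()
```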

Pre-trained Gaussian processes for Bayesian optimization

google-research/hyperbo 16 Sep 2021

Contrary to the common expectation that BO is suited to optimizing black-box functions, deploying it successfully in practice requires domain knowledge about those functions.

Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)

wjmaddox/online_gp 3 Mar 2015

We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs).
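
Assuming GPyTorch as the implementation (an assumption; it exposes SKI as a grid-interpolation kernel), the change amounts to wrapping a base kernel, roughly as follows:

```python
import gpytorch

# Wrapping the base kernel in a grid-interpolation (SKI) kernel replaces exact
# kernel evaluations with interpolation against m inducing points on a regular grid.
base = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
ski_kernel = gpytorch.kernels.GridInterpolationKernel(base, grid_size=400, num_dims=1)
# Drop this kernel into an exact-GP model (as in the GPyTorch sketch above) unchanged.
```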

Thoughts on Massively Scalable Gaussian Processes

alshedivat/keras-gp 5 Nov 2015

The approach combines: 1) a multi-level circulant approximation, which unifies the orthogonal computational benefits of fast Kronecker and Toeplitz approaches and is significantly faster than either approach in isolation; 2) local kernel interpolation and inducing points, allowing arbitrarily located data inputs and $O(1)$ test-time predictions; 3) exploitation of block-Toeplitz Toeplitz-block (BTTB) structure, which enables fast inference and learning when multidimensional Kronecker structure is not present; and 4) projections of the input space to flexibly model correlated inputs and high-dimensional data.
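
One ingredient of this pipeline, fast Toeplitz algebra, can be sketched directly: a stationary kernel on a regularly spaced 1D grid gives a symmetric Toeplitz covariance, and its matrix-vector products cost $O(n \log n)$ via circulant embedding and the FFT (the kernel and grid below are illustrative):

```python
import numpy as np

def toeplitz_matvec(first_col, x):
    """Multiply a symmetric Toeplitz matrix (given by its first column) by x
    in O(n log n) by embedding it in a circulant matrix and using the FFT."""
    n = len(first_col)
    circ = np.concatenate([first_col, first_col[-1:0:-1]])  # first column of the (2n-1)-dim circulant
    x_pad = np.concatenate([x, np.zeros(n - 1)])
    y = np.fft.ifft(np.fft.fft(circ) * np.fft.fft(x_pad))
    return y[:n].real

# Stationary kernel on a regular grid -> symmetric Toeplitz covariance.
grid = np.linspace(0, 1, 1000)
first_col = np.exp(-0.5 * (grid - grid[0]) ** 2 / 0.1 ** 2)
v = np.random.randn(1000)
fast = toeplitz_matvec(first_col, v)   # matches the dense product K @ v up to numerical error
```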

Scalable Log Determinants for Gaussian Process Kernel Learning

kd383/GPML_SLD NeurIPS 2017

For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute a log determinant of an $n \times n$ positive definite matrix, and its derivatives - leading to prohibitive $\mathcal{O}(n^3)$ computations.
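
The stochastic trace-estimation idea that such methods build on can be sketched as follows: the hyperparameter gradient of the log determinant is $\text{tr}(K^{-1}\,\partial K/\partial\theta)$, which can be estimated with Hutchinson probe vectors and iterative solves, avoiding any $\mathcal{O}(n^3)$ factorization (a generic illustration of the underlying trick, not the paper's estimator; the kernel and probe count are arbitrary):

```python
import numpy as np
from scipy.sparse.linalg import cg

def logdet_grad_estimate(K, dK, num_probes=30, seed=None):
    """Estimate d/dtheta log|K| = tr(K^{-1} dK/dtheta) with Hutchinson probe
    vectors and conjugate-gradient solves (no Cholesky factorization)."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    est = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        Kinv_z, _ = cg(K, z)                  # iterative solve of K x = z
        est += Kinv_z @ (dK @ z)              # z^T K^{-1} (dK/dtheta) z
    return est / num_probes

# Illustrative RBF kernel and its derivative w.r.t. the log-lengthscale.
x = np.linspace(0, 5, 300)
d2 = (x[:, None] - x[None, :]) ** 2
ell = 0.5
K = np.exp(-0.5 * d2 / ell ** 2) + 1e-3 * np.eye(len(x))
dK = np.exp(-0.5 * d2 / ell ** 2) * (d2 / ell ** 2)
print(logdet_grad_estimate(K, dK))
```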

Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo

cambridge-mlg/sghmc_dgp NeurIPS 2018

The current state-of-the-art inference method, Variational Inference (VI), employs a Gaussian approximation to the posterior distribution.
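
For reference, the SGHMC update named in the title can be sketched on a toy target (the step size, friction, and simulated gradient noise are illustrative choices, and this is a one-dimensional toy rather than the paper's deep-GP sampler):

```python
import numpy as np

def grad_U(theta, rng):
    # Toy negative-log-posterior gradient for a standard normal target,
    # with added noise standing in for minibatch stochasticity.
    return theta + 0.1 * rng.standard_normal()

def sghmc(num_steps=5000, step=1e-2, friction=0.1, seed=None):
    """Stochastic Gradient HMC (Chen et al., 2014): a momentum update with a
    friction term and injected noise that compensates for gradient noise."""
    rng = np.random.default_rng(seed)
    theta, v, samples = 0.0, 0.0, []
    for _ in range(num_steps):
        theta = theta + v
        noise = np.sqrt(2 * friction * step) * rng.standard_normal()
        v = v - step * grad_U(theta, rng) - friction * v + noise
        samples.append(theta)
    return np.array(samples)

draws = sghmc()   # draws should roughly match the N(0, 1) toy target
```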