Gaussian Processes

352 papers with code • 0 benchmarks • 4 datasets

Gaussian processes are a powerful framework for several machine learning tasks such as regression, classification, and inference. Given a finite set of input–output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process: the training outputs are a finite number of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
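The posterior mean and variance described above can be sketched in a few lines of NumPy. The RBF kernel, lengthscale, noise level, and toy sine data below are illustrative choices, not taken from the source:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(a, b) = variance * exp(-(a-b)^2 / (2 l^2))."""
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorization gives a numerically stable solve of K @ alpha = y.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var

x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
mean, var = gp_posterior(x_train, y_train, np.array([0.5]))
```

Because the training outputs and the test-point value are jointly Gaussian, conditioning on the data yields the closed-form `mean` and `var` above; near the training inputs the posterior mean tracks the underlying function and the variance shrinks.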

Greatest papers with code

Convolutional Gaussian Processes

pyro-ppl/pyro NeurIPS 2017

We present a practical way of introducing convolutional structure into Gaussian processes, making them more suited to high-dimensional inputs like images.

Gaussian Processes

Doubly Stochastic Variational Inference for Deep Gaussian Processes

pyro-ppl/pyro NeurIPS 2017

Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice.

Gaussian Processes General Classification +1

Adversarial Robustness Toolbox v1.0.0

IBM/adversarial-robustness-toolbox 3 Jul 2018

Defending Machine Learning models involves certifying and verifying model robustness and model hardening with approaches such as pre-processing inputs, augmenting training data with adversarial samples, and leveraging runtime detection methods to flag any inputs that might have been modified by an adversary.

Adversarial Robustness Gaussian Processes +1

Constant-Time Predictive Distributions for Gaussian Processes

cornellius-gp/gpytorch ICML 2018

One of the most compelling features of Gaussian process (GP) regression is its ability to provide well-calibrated posterior distributions.

Gaussian Processes
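"Well-calibrated" here means that the posterior's credible intervals have the right empirical coverage. A minimal sanity check of that notion (a generic simulation, not the paper's method, with made-up predictive distributions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Hypothetical Gaussian predictive distributions N(mu_i, sigma_i^2), one per test point.
mu = rng.normal(size=n)
sigma = rng.uniform(0.5, 2.0, size=n)
# Outcomes actually drawn from those distributions, i.e. perfect calibration.
y = rng.normal(mu, sigma)
# Fraction of outcomes inside the central 95% credible interval (mu +/- 1.96 sigma).
coverage = np.mean(np.abs(y - mu) <= 1.96 * sigma)
```

For a calibrated model `coverage` comes out near 0.95; systematic over- or under-confidence would push it away from the nominal level.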

Exact Gaussian Processes on a Million Data Points

cornellius-gp/gpytorch NeurIPS 2019

Gaussian processes (GPs) are flexible non-parametric models, with a capacity that grows with the available data.

Gaussian Processes

GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration

cornellius-gp/gpytorch NeurIPS 2018

Despite advances in scalable models, the inference tools used for Gaussian processes (GPs) have yet to fully capitalize on developments in computing hardware.

Gaussian Processes

Product Kernel Interpolation for Scalable Gaussian Processes

cornellius-gp/gpytorch 24 Feb 2018

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs).

Gaussian Processes
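The MVM idea above can be illustrated with the conjugate gradients method, which solves the GP training system (K + noise·I)α = y while touching the matrix only through products with vectors. The kernel and toy data are illustrative assumptions, not from the paper:

```python
import numpy as np

def conjugate_gradients(matvec, b, tol=1e-8, max_iter=1000):
    """Solve A @ x = b for symmetric positive-definite A,
    accessing A only through the matvec callable."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Toy GP linear system: (K + noise * I) @ alpha = y with an RBF kernel matrix.
rng = np.random.default_rng(1)
x_train = rng.uniform(-3, 3, size=50)
y_train = np.sin(x_train)
K = np.exp(-0.5 * (x_train[:, None] - x_train[None, :]) ** 2) + 0.1 * np.eye(50)
# Matches a direct solve without ever factorizing K explicitly.
alpha = conjugate_gradients(lambda v: K @ v, y_train)
```

Because only `matvec` is needed, the same solver works when K is never materialized, e.g. when kernel products are computed in batches on a GPU or via structured approximations.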

Gaussian Processes for Big Data

cornellius-gp/gpytorch 26 Sep 2013

We introduce stochastic variational inference for Gaussian process models.

Gaussian Processes Latent Variable Models +1

Scale Mixtures of Neural Network Gaussian Processes

google/neural-tangents ICLR 2022

We show that simply introducing a scale prior on the last-layer parameters can turn infinitely-wide neural networks of any architecture into a richer class of stochastic processes.

Gaussian Processes