Search Results for author: Vidhi Lalchand

Found 13 papers, 5 papers with code

Dimensionality Reduction as Probabilistic Inference

no code implementations15 Apr 2023 Aditya Ravuri, Francisco Vargas, Vidhi Lalchand, Neil D. Lawrence

Dimensionality reduction (DR) algorithms compress high-dimensional data into a lower dimensional representation while preserving important features of the data.

Dimensionality Reduction Gaussian Processes +1

Sparse Gaussian Process Hyperparameters: Optimize or Integrate?

no code implementations4 Nov 2022 Vidhi Lalchand, Wessel P. Bruinsma, David R. Burt, Carl E. Rasmussen

In this work we propose an algorithm for sparse Gaussian process regression which leverages MCMC to sample from the hyperparameter posterior within the variational inducing point framework of Titsias (2009).

Model Selection

Kernel Learning for Explainable Climate Science

1 code implementation11 Sep 2022 Vidhi Lalchand, Kenza Tazi, Talay M. Cheema, Richard E. Turner, Scott Hosking

We account for the spatial variation in precipitation with a non-stationary Gibbs kernel parameterised with an input dependent lengthscale.
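The non-stationary Gibbs kernel mentioned in the abstract replaces the fixed lengthscale of a squared-exponential kernel with an input-dependent one, $\ell(x)$. A minimal 1-D NumPy sketch (the lengthscale function `l_fn` here is a made-up illustration, not the paper's parameterisation):

```python
import numpy as np

def gibbs_kernel(x1, x2, lengthscale_fn):
    """Non-stationary Gibbs kernel for 1-D inputs, with an
    input-dependent lengthscale l(x) given by lengthscale_fn."""
    l1 = lengthscale_fn(x1)[:, None]          # shape (n, 1)
    l2 = lengthscale_fn(x2)[None, :]          # shape (1, m)
    denom = l1**2 + l2**2
    prefactor = np.sqrt(2.0 * l1 * l2 / denom)
    sqdist = (x1[:, None] - x2[None, :])**2
    return prefactor * np.exp(-sqdist / denom)

# Hypothetical lengthscale function, for illustration only:
# correlations lengthen as |x| grows.
l_fn = lambda x: 0.5 + 0.1 * np.abs(x)

x = np.linspace(-3.0, 3.0, 50)
K = gibbs_kernel(x, x, l_fn)
```

When $\ell(x)$ is constant the prefactor reduces to 1 and the usual stationary RBF kernel is recovered; the prefactor is what keeps the kernel positive semi-definite as the lengthscale varies.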

Gaussian Processes

Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference

no code implementations25 Feb 2022 Vidhi Lalchand, Aditya Ravuri, Neil D. Lawrence

We show how this framework is compatible with different latent variable formulations and perform experiments to compare a suite of models.

Benchmarking Dimensionality Reduction +2

Kernel Identification Through Transformers

1 code implementation NeurIPS 2021 Fergus Simpson, Ian Davies, Vidhi Lalchand, Alessandro Vullo, Nicolas Durrande, Carl Rasmussen

Kernel selection plays a central role in determining the performance of Gaussian Process (GP) models, as the chosen kernel determines both the inductive biases and prior support of functions under the GP prior.

regression

Gaussian Process Latent Variable Flows for Massively Missing Data

no code implementations AABI Symposium 2021 Vidhi Lalchand, Aditya Ravuri, Neil D Lawrence

The Bayesian incarnation of the GPLVM uses a variational framework, where the posterior over all unknown quantities is approximated by a well-behaved variational family, a factorised Gaussian.

Dimensionality Reduction Gaussian Processes +2

Marginalised Gaussian Processes with Nested Sampling

1 code implementation NeurIPS 2021 Fergus Simpson, Vidhi Lalchand, Carl Edward Rasmussen

Learning occurs through the optimisation of kernel hyperparameters using the marginal likelihood as the objective.

Gaussian Processes

A meta-algorithm for classification using random recursive tree ensembles: A high energy physics application

no code implementations19 Jan 2020 Vidhi Lalchand

The algorithm proposed in this thesis targets a challenging classification problem in high energy physics - that of improving the statistical significance of the Higgs discovery.

General Classification Meta-Learning

Extracting more from boosted decision trees: A high energy physics case study

no code implementations16 Jan 2020 Vidhi Lalchand

While the decay of the Higgs to a pair of tau leptons was established in 2018 (CMS collaboration et al., 2017) at 4.9$\sigma$ significance based on the 2016 data-taking period, the 2014 public data set continues to serve as a benchmark for testing the performance of supervised classification schemes.

Meta-Learning

Approximate Inference for Fully Bayesian Gaussian Process Regression

1 code implementation AABI Symposium 2019 Vidhi Lalchand, Carl Edward Rasmussen

An alternative learning procedure is to infer the posterior over hyperparameters in a hierarchical specification of GPs we call \textit{Fully Bayesian Gaussian Process Regression} (GPR).
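Rather than optimising kernel hyperparameters by maximising the marginal likelihood, the fully Bayesian treatment samples from their posterior. A minimal sketch of that idea, using random-walk Metropolis over log-hyperparameters of an RBF kernel with a flat prior in log space (illustrative choices only, not the paper's sampler or prior):

```python
import numpy as np

def log_marginal(x, y, log_ls, log_sf, log_sn):
    """GP log marginal likelihood with an RBF kernel on 1-D inputs."""
    ls, sf, sn = np.exp([log_ls, log_sf, log_sn])
    K = sf**2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ls**2)
    K += sn**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)

# Random-walk Metropolis over (log lengthscale, log signal, log noise).
theta = np.zeros(3)
lp = log_marginal(x, y, *theta)
samples = []
for _ in range(2000):
    prop = theta + 0.1 * rng.standard_normal(3)
    lp_prop = log_marginal(x, y, *prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples)
```

Predictions are then averaged over the hyperparameter samples instead of being conditioned on a single point estimate.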

GPR regression +1

A Fast and Greedy Subset-of-Data (SoD) Scheme for Sparsification in Gaussian processes

no code implementations17 Nov 2018 Vidhi Lalchand, A. C. Faul

In this paper we present a framework for GP training with sequential selection of training data points using an intuitive selection metric.
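One plausible instance of such a sequential scheme is to greedily add the candidate point with the largest GP posterior variance given the points selected so far; the paper's exact "intuitive selection metric" may differ, so the criterion below is an assumption for illustration:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls)**2)

def greedy_sod(x, k, ls=1.0, jitter=1e-8):
    """Greedily grow a subset of k training points, at each step adding
    the candidate with the largest GP posterior variance conditioned on
    the points already selected (assumed criterion, see lead-in)."""
    selected = [0]                       # seed with the first point
    while len(selected) < k:
        xs = x[selected]
        Kss = rbf(xs, xs, ls) + jitter * np.eye(len(selected))
        Ks = rbf(x, xs, ls)
        # posterior variance at every candidate, unit-variance prior
        var = 1.0 - np.sum(Ks @ np.linalg.inv(Kss) * Ks, axis=1)
        var[selected] = -np.inf          # never re-select a chosen point
        selected.append(int(np.argmax(var)))
    return selected

x = np.linspace(0.0, 10.0, 200)
subset = greedy_sod(x, k=5, ls=1.0)
```

Because posterior variance is highest far from already-selected points, the greedy rule naturally spreads the subset across the input space.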

Gaussian Processes
