Search Results for author: Henry Moss

Found 4 papers, 1 paper with code

Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

no code implementations • 2 Nov 2022 • Paul E. Chang, Prakhar Verma, ST John, Victor Picheny, Henry Moss, Arno Solin

Gaussian processes (GPs) are the main surrogate functions used for sequential modelling tasks such as Bayesian Optimization and Active Learning.

Active Learning • Bayesian Optimization • +1
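As an illustrative aside (a minimal sketch of the generic GP-surrogate loop this paper builds on, not the dual-GP fantasizing method it proposes): fit a GP to the observed points, score candidates with an acquisition function (a simple upper confidence bound here), and query the best candidate. The toy objective and all variable names are assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy black-box function to maximise (assumed for illustration).
    return -np.sin(3 * x) - x ** 2 + 0.7 * x

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(5, 1))          # initial design
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(10):                               # sequential BO loop
    gp.fit(X, y)                                  # GP surrogate of the objective
    candidates = np.linspace(-1.0, 2.0, 200).reshape(-1, 1)
    mean, std = gp.predict(candidates, return_std=True)
    ucb = mean + 2.0 * std                        # simple UCB acquisition
    x_next = candidates[np.argmax(ucb)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best point found:", X[np.argmax(y)].item(), "value:", y.max())
```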

Scalable Thompson Sampling using Sparse Gaussian Process Models

no code implementations • NeurIPS 2021 • Sattar Vakili, Henry Moss, Artem Artemev, Vincent Dutordoir, Victor Picheny

We provide theoretical guarantees and show that the drastic reduction in computational complexity of scalable TS can be enjoyed without loss in regret performance compared with standard TS.

Thompson Sampling
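For context, a hedged sketch of the standard (exact-GP) Thompson Sampling baseline the paper improves on, not the scalable sparse-GP variant proposed there: draw a single sample from the GP posterior over a candidate grid and act greedily on it. The toy objective and names are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):
    # Toy black-box objective (assumed for illustration).
    return np.sin(2 * x) + 0.1 * x

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 5.0, size=(4, 1))
y = objective(X).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)

for step in range(15):
    gp.fit(X, y)
    candidates = np.linspace(0.0, 5.0, 300).reshape(-1, 1)
    # Thompson Sampling: one posterior sample, then query its maximiser.
    sample = gp.sample_y(candidates, n_samples=1, random_state=step).ravel()
    x_next = candidates[np.argmax(sample)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best observed value:", y.max())
```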

Bayesian Quantile and Expectile Optimisation

no code implementations • 12 Jan 2020 • Victor Picheny, Henry Moss, Léonard Torossian, Nicolas Durrande

In this paper, we propose new variational models for Bayesian quantile and expectile regression that are well-suited for heteroscedastic noise settings.

Bayesian Optimisation • Gaussian Processes • +1
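For context on the objectives involved (a minimal sketch, not the variational GP models proposed in the paper): the tau-quantile minimises the asymmetric pinball loss and the tau-expectile minimises an asymmetrically weighted squared loss, which is what makes both robust summaries under heteroscedastic noise. The helper names and toy data below are assumptions for illustration.

```python
import numpy as np

def pinball_loss(y, pred, tau):
    """Pinball (quantile) loss: minimised in expectation by the tau-quantile."""
    diff = y - pred
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

def expectile_loss(y, pred, tau):
    """Asymmetric squared loss: minimised in expectation by the tau-expectile."""
    diff = y - pred
    weight = np.where(diff >= 0, tau, 1 - tau)
    return np.mean(weight * diff ** 2)

# Heteroscedastic toy data: noise level grows with x (assumed for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1000)
y = np.sin(2 * np.pi * x) + (0.1 + 0.5 * x) * rng.standard_normal(1000)

# Evaluate both losses for a constant prediction at the empirical 0.9-quantile.
q90 = np.quantile(y, 0.9)
print(pinball_loss(y, q90, 0.9), expectile_loss(y, q90, 0.9))
```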

Using J-K-fold Cross Validation To Reduce Variance When Tuning NLP Models

1 code implementation • COLING 2018 • Henry Moss, David Leslie, Paul Rayson

K-fold cross validation (CV) is a popular method for estimating the true performance of machine learning models, allowing model selection and parameter tuning.

Document Classification • General Classification • +4
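As a hedged sketch of the general idea (J repetitions of K-fold CV, averaged to reduce the variance of the performance estimate), using scikit-learn's RepeatedKFold rather than the exact J-K-fold procedure and experiments from the paper; the model and dataset are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedKFold, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# J repeats of K-fold CV: averaging across repeats lowers the variance
# of the estimated score compared with a single K-fold run.
J, K = 5, 10
cv = RepeatedKFold(n_splits=K, n_repeats=J, random_state=0)
scores = cross_val_score(model, X, y, cv=cv)

print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```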
