Bayesian Optimisation

84 papers with code • 0 benchmarks • 0 datasets

Expensive black-box functions arise in many disciplines, including tuning the parameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function and then update that prior with a set of “true” observations obtained by (expensively) evaluating the target function, yielding a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances exploitation of regions known to perform well with exploration of regions where little is known about the function’s response.
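The loop described above can be sketched with a small Gaussian-process surrogate and an upper-confidence-bound acquisition. This is a minimal NumPy sketch, not drawn from any of the papers below: the objective, kernel length scale, and exploration weight are illustrative assumptions.

```python
import numpy as np

def target(x):
    # Cheap stand-in for an expensive black-box objective (hypothetical).
    return -np.sin(3 * x) - x**2 + 0.7 * x

def rbf_kernel(A, B, length_scale=0.4):
    # Squared-exponential covariance between 1-D input arrays A and B.
    d = A.reshape(-1, 1) - B.reshape(1, -1)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X_obs, y_obs, X_cand, noise=1e-6):
    # Exact GP posterior mean and standard deviation at candidate points.
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    K_s = rbf_kernel(X_obs, X_cand)
    mu = K_s.T @ np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, K_s)
    var = 1.0 - np.sum(K_s * v, axis=0)  # prior variance of the RBF kernel is 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

rng = np.random.default_rng(0)
X_obs = rng.uniform(-2, 2, size=3)       # a few initial "true" observations
y_obs = target(X_obs)
X_cand = np.linspace(-2, 2, 400)

for _ in range(15):
    mu, sigma = gp_posterior(X_obs, y_obs, X_cand)
    ucb = mu + 2.0 * sigma               # acquisition: upper confidence bound
    x_next = X_cand[np.argmax(ucb)]      # exploit high mean, explore high variance
    X_obs = np.append(X_obs, x_next)
    y_obs = np.append(y_obs, target(x_next))

x_best = X_obs[np.argmax(y_obs)]
```

Each round refits the posterior to all observations so far and evaluates the target only where the acquisition is maximised, which is what makes the approach sample-efficient for expensive functions.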

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions

Libraries

Use these libraries to find Bayesian Optimisation models and implementations

Most implemented papers

End-to-End Meta-Bayesian Optimisation with Transformer Neural Processes

huawei-noah/HEBO NeurIPS 2023

We enable this end-to-end framework with reinforcement learning (RL) to tackle the lack of labelled acquisition data.

Cheetah: Bridging the Gap Between Machine Learning and Particle Accelerator Physics with High-Speed, Differentiable Simulations

desy-ml/cheetah 11 Jan 2024

Machine learning has emerged as a powerful solution to the modern challenges in accelerator physics.

Theoretical Analysis of Bayesian Optimisation with Unknown Gaussian Process Hyper-Parameters

weepingwillowben/hyperparam-opt 30 Jun 2014

Bayesian optimisation has gained great popularity as a tool for optimising the parameters of machine learning algorithms and models.

Batch Bayesian Optimization via Local Penalization

SheffieldML/GPyOpt 29 May 2015

The approach assumes that the function of interest, $f$, is a Lipschitz continuous function.
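The batch-selection idea can be sketched in a few lines: pick batch points greedily from a candidate grid, down-weighting the acquisition around points already chosen so the batch spreads out. The penaliser here (a soft exclusion ball with a fixed radius) is a toy stand-in — the paper derives the penalisation from an estimate of the Lipschitz constant of $f$ — and `select_batch` is a hypothetical helper name.

```python
import numpy as np

def select_batch(X_cand, acq, batch_size=3, radius=0.5):
    # Greedy batch selection with local penalization (simplified sketch).
    acq = acq.astype(float).copy()
    batch = []
    for _ in range(batch_size):
        x_j = X_cand[np.argmax(acq)]
        batch.append(x_j)
        # Penalise candidates near the pending point x_j so the next
        # pick lands elsewhere; the real penaliser is Lipschitz-based.
        acq *= 1.0 - np.exp(-0.5 * ((X_cand - x_j) / radius) ** 2)
    return np.array(batch)

X_cand = np.linspace(0, 10, 101)
acq = np.exp(-0.5 * (X_cand - 3.0) ** 2)   # toy acquisition peaked at x = 3
batch = select_batch(X_cand, acq)
```

The first pick sits at the acquisition peak; subsequent picks are pushed away from pending evaluations, so the batch can be dispatched to parallel workers without all of them probing the same point.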

Asynchronous Parallel Bayesian Optimisation via Thompson Sampling

kirthevasank/gp-parallel-ts 25 May 2017

We design and analyse variations of the classical Thompson sampling (TS) procedure for Bayesian optimisation (BO) in settings where function evaluations are expensive, but can be performed in parallel.
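The classical Thompson-sampling step for BO is easy to sketch: draw one plausible function from the GP posterior and evaluate the target at that draw's maximiser. In the parallel setting the paper studies, each worker would take an independent draw like this, so no two workers need to coordinate. The objective, kernel, and seed below are illustrative assumptions, not from the paper.

```python
import numpy as np

def target(x):
    # Cheap stand-in for an expensive black-box objective (hypothetical).
    return np.exp(-(x - 1.0) ** 2) + 0.2 * np.cos(5 * x)

def rbf(A, B, ell=0.5):
    d = A.reshape(-1, 1) - B.reshape(1, -1)
    return np.exp(-0.5 * (d / ell) ** 2)

def posterior(X_obs, y_obs, X_cand, noise=1e-6):
    # GP posterior mean and full covariance over the candidate grid.
    K = rbf(X_obs, X_obs) + noise * np.eye(len(X_obs))
    K_s = rbf(X_obs, X_cand)
    sol = np.linalg.solve(K, K_s)
    mu = sol.T @ y_obs
    cov = rbf(X_cand, X_cand) - K_s.T @ sol
    return mu, 0.5 * (cov + cov.T)       # symmetrise against rounding error

rng = np.random.default_rng(1)
X_cand = np.linspace(-2, 3, 200)
X_obs = rng.uniform(-2, 3, 3)
y_obs = target(X_obs)

for _ in range(10):
    mu, cov = posterior(X_obs, y_obs, X_cand)
    # Thompson sampling: one posterior draw, then evaluate at its argmax.
    f_sample = rng.multivariate_normal(
        mu, cov + 1e-8 * np.eye(len(X_cand)), check_valid="ignore"
    )
    x_next = X_cand[np.argmax(f_sample)]
    X_obs = np.append(X_obs, x_next)
    y_obs = np.append(y_obs, target(x_next))
```

Because each step needs only a sample from the current posterior, asynchronous workers can each draw and evaluate independently, which is what makes TS attractive when evaluations are expensive but parallelisable.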

Generalising Random Forest Parameter Optimisation to Include Stability and Cost

liuchbryan/generalised_forest_tuning 29 Jun 2017

We argue that error reduction is only one of several metrics that must be considered when optimizing random forest parameters for commercial applications.

Fast Information-theoretic Bayesian Optimisation

rubinxin/FITBO ICML 2018

Information-theoretic Bayesian optimisation techniques have demonstrated state-of-the-art performance in tackling important global optimisation problems.

GPflowOpt: A Bayesian Optimization Library using TensorFlow

GPflow/GPflowOpt 10 Nov 2017

A novel Python framework for Bayesian optimization known as GPflowOpt is introduced.

Neural Architecture Search with Bayesian Optimisation and Optimal Transport

kirthevasank/nasbot NeurIPS 2018

A common use case for BO in machine learning is model selection, where it is not possible to analytically model the generalisation performance of a statistical model, and we resort to noisy and expensive training and validation procedures to choose the best model.

Hyperparameter Learning via Distributional Transfer

hcllaw/distBO NeurIPS 2019

Bayesian optimisation is a popular technique for hyperparameter learning but typically requires initial exploration even in cases where similar prior tasks have been solved.