Bayesian Optimisation

56 papers with code • 0 benchmarks • 0 datasets

Optimising expensive black-box functions is a common problem in many disciplines, including tuning the parameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea is to place a prior distribution over the target function and then update that prior with a set of "true" observations, obtained by expensively evaluating the function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances exploiting regions known to perform well against exploring regions where little is known about the function's response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
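
To make the loop concrete, here is a minimal sketch in Python, using a Gaussian process surrogate (via scikit-learn) and expected improvement as the acquisition function. The one-dimensional toy objective and the grid-based acquisition maximisation are illustrative assumptions, not taken from any particular paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X, gp, y_best):
    # EI trades off exploitation (high posterior mean) and
    # exploration (high posterior uncertainty).
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

def objective(x):
    # Stand-in for an expensive black-box function.
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1))      # initial "true" observations
y = objective(X).ravel()
grid = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)

for _ in range(10):
    # Update the prior with the observations gathered so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    # The posterior, through the acquisition, picks the next evaluation.
    x_next = grid[np.argmax(expected_improvement(grid, gp, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next)[0])

print("best x:", X[np.argmax(y)].item(), "best y:", y.max())
```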

Libraries

Use these libraries to find Bayesian Optimisation models and implementations

Most implemented papers

Max-value Entropy Search for Efficient Bayesian Optimization

zi-w/Max-value-Entropy-Search ICML 2017

We propose a new criterion, Max-value Entropy Search (MES), which uses information about the maximum function value rather than the location of the optimum.
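
A sketch of the MES acquisition under a GP surrogate: plausible maxima y* are drawn from posterior function samples (the paper also derives a cheaper Gumbel-based approximation), and the acquisition averages the resulting entropy-reduction terms. The candidate grid, kernel, and sample counts are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def mes_acquisition(X_cand, gp, n_max_samples=10, random_state=0):
    """Max-value Entropy Search: expected reduction in the entropy of the
    maximum value y*, averaged over posterior samples of y*."""
    # Sample plausible maxima y* from GP posterior function draws.
    draws = gp.sample_y(X_cand, n_samples=n_max_samples, random_state=random_state)
    y_star = draws.max(axis=0)                    # one sampled maximum per draw
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    gamma = (y_star[None, :] - mu[:, None]) / sigma[:, None]
    cdf = np.maximum(norm.cdf(gamma), 1e-12)
    # MES: alpha(x) = E_{y*}[ gamma*pdf(gamma) / (2*cdf(gamma)) - log cdf(gamma) ]
    return (gamma * norm.pdf(gamma) / (2 * cdf) - np.log(cdf)).mean(axis=1)
```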

An Empirical Study of Assumptions in Bayesian Optimisation

huawei-noah/noah-research 7 Dec 2020

Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers.

Bayesian optimisation for fast approximate inference in state-space models with intractable likelihoods

compops/gpo-smc-abc 23 Jun 2015

We consider the problem of approximate Bayesian parameter inference in non-linear state-space models with intractable likelihoods.

Bayesian Optimisation over Multiple Continuous and Categorical Inputs

rubinxin/CoCaBO_code ICML 2020

Efficient optimisation of black-box problems that comprise both continuous and categorical inputs is important, yet poses significant challenges.
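
CoCaBO's central idea is to split the space: a multi-armed bandit picks the categorical values while a GP models the continuous ones. The sketch below mimics that split with an epsilon-greedy bandit and a GP-UCB step; the paper itself uses an EXP3 bandit and a dedicated mixed kernel, so treat this only as an illustration of the decomposition.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def suggest_mixed(X_cont, y, cats, n_categories, grid, eps=0.2, rng=None):
    """Pick a categorical value with a simple epsilon-greedy bandit, then
    maximise a GP-UCB acquisition over the continuous dimensions, fitting
    the GP only on observations that share the chosen category."""
    rng = np.random.default_rng(rng)
    if rng.random() < eps:
        c = int(rng.integers(n_categories))       # explore a random category
    else:
        means = [y[cats == k].mean() if np.any(cats == k) else np.inf
                 for k in range(n_categories)]    # unseen categories first
        c = int(np.argmax(means))                 # exploit the best so far
    mask = cats == c
    if mask.sum() < 2:                            # too little data: random point
        return c, grid[rng.integers(len(grid))]
    gp = GaussianProcessRegressor(normalize_y=True).fit(X_cont[mask], y[mask])
    mu, sd = gp.predict(grid, return_std=True)
    return c, grid[np.argmax(mu + 2.0 * sd)]      # UCB over continuous inputs
```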

On the Expressiveness of Approximate Inference in Bayesian Neural Networks

cambridge-mlg/expressiveness-approx-bnns NeurIPS 2020

While Bayesian neural networks (BNNs) hold the promise of being flexible, well-calibrated statistical models, inference often requires approximations whose consequences are poorly understood.

Neural Architecture Generator Optimization

huawei-noah/vega NeurIPS 2020

Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention.

High-Dimensional Bayesian Optimisation with Variational Autoencoders and Deep Metric Learning

huawei-noah/noah-research 7 Jun 2021

We introduce a method combining variational autoencoders (VAEs) and deep metric learning to perform Bayesian optimisation (BO) over high-dimensional and structured input spaces.
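
A generic latent-space BO step of the kind the paper builds on: encode observed inputs with a pre-trained VAE, fit a GP on the latent codes, maximise an acquisition over latent candidates, and decode the winner. The `encode`/`decode` callables are hypothetical placeholders for a trained VAE, and the paper's contribution, shaping the latent space with deep metric learning, is not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def latent_bo_step(encode, decode, X_structured, y, rng=None):
    """One BO step in a VAE latent space. `encode`/`decode` are assumed
    to come from a pre-trained VAE (hypothetical placeholders here)."""
    rng = np.random.default_rng(rng)
    Z = encode(X_structured)                       # map inputs to latent codes
    gp = GaussianProcessRegressor(normalize_y=True).fit(Z, y)
    Z_cand = rng.normal(size=(1024, Z.shape[1]))   # candidates from the VAE prior
    mu, sd = gp.predict(Z_cand, return_std=True)
    z_next = Z_cand[np.argmax(mu + 2.0 * sd)]      # UCB acquisition in latent space
    return decode(z_next[None, :])                 # decode back to input space
```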

Theoretical Analysis of Bayesian Optimisation with Unknown Gaussian Process Hyper-Parameters

weepingwillowben/hyperparam-opt 30 Jun 2014

Bayesian optimisation has gained great popularity as a tool for optimising the parameters of machine learning algorithms and models.

Batch Bayesian Optimization via Local Penalization

SheffieldML/GPyOpt 29 May 2015

The approach assumes that the function of interest, $f$, is Lipschitz continuous.
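
The Lipschitz assumption is what makes the batch construction work: around each point $x_j$ already chosen for evaluation, a ball whose radius scales with $(M - \mu(x_j))/L$ cannot contain the maximiser, so the acquisition is multiplied by a soft penalty there. A minimal sketch, assuming maximisation, a finite candidate grid, and caller-supplied estimates of the Lipschitz constant L and the incumbent value M:

```python
import numpy as np
from scipy.special import erfc

def local_penalizer(X, x_j, mu_j, sigma_j, L, M):
    """Soft exclusion zone around a pending point x_j: the penalty tends
    to 0 near x_j and to 1 far away."""
    r = np.linalg.norm(X - x_j, axis=1)
    z = (L * r - M + mu_j) / (np.sqrt(2) * max(sigma_j, 1e-9))
    return 0.5 * erfc(-z)

def penalized_batch(acq, grid, gp, L, M, batch_size=3):
    """Greedily build a batch: each new point maximises the acquisition
    multiplied by penalizers around the points already chosen."""
    values, batch = acq(grid).copy(), []
    for _ in range(batch_size):
        x_j = grid[np.argmax(values)]
        batch.append(x_j)
        mu_j, sd_j = gp.predict(x_j[None, :], return_std=True)
        values *= local_penalizer(grid, x_j, mu_j[0], sd_j[0], L, M)
    return np.array(batch)
```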

Asynchronous Parallel Bayesian Optimisation via Thompson Sampling

kirthevasank/gp-parallel-ts 25 May 2017

We design and analyse variations of the classical Thompson sampling (TS) procedure for Bayesian optimisation (BO) in settings where function evaluations are expensive, but can be performed in parallel.
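
Thompson sampling parallelises almost for free: each worker draws its own function from the GP posterior, conditioned on whatever data it has seen, and evaluates that draw's maximiser, so no coordination between workers is needed. A minimal sketch of one such proposal, with a finite candidate set as an illustrative assumption:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def thompson_sample_next(X_obs, y_obs, candidates, random_state=0):
    """Draw one function from the GP posterior and propose its argmax.
    Each parallel worker can call this independently on the observations
    currently available to it, which is what makes TS easy to run
    asynchronously."""
    gp = GaussianProcessRegressor(normalize_y=True).fit(X_obs, y_obs)
    f_draw = gp.sample_y(candidates, n_samples=1, random_state=random_state).ravel()
    return candidates[np.argmax(f_draw)]
```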