Bayesian Optimisation
69 papers with code • 0 benchmarks • 0 datasets
Expensive black-box functions arise in many disciplines, including tuning the parameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and sample-efficient technique for the global optimisation of such functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function, then update that prior with a set of “true” observations obtained by expensively evaluating the function, producing a posterior predictive distribution. The posterior informs where to make the next observation of the target function through an acquisition function, which balances exploitation of regions known to perform well with exploration of regions where there is little information about the function’s response.
Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
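The prior-update-acquire loop described above can be sketched in plain Python. This is a minimal, stdlib-only illustration, not a production implementation: it assumes a 1-D toy objective, a Gaussian-process surrogate with an RBF kernel, and the Expected Improvement acquisition function; all names (`f`, `gp_posterior`, `expected_improvement`) are hypothetical.

```python
import math

def f(x):
    """Hypothetical expensive black-box objective; true maximum at x = 2."""
    return -(x - 2.0) ** 2

def rbf(a, b, length=0.5):
    """Squared-exponential (RBF) kernel defining the GP prior."""
    return math.exp(-0.5 * (a - b) ** 2 / length ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fct = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fct * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, x_new):
    """Posterior predictive mean and std dev at x_new, given observations."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (1e-9 if i == j else 0.0)  # jitter on the diagonal
          for j in range(n)] for i in range(n)]
    k_star = [rbf(x, x_new) for x in xs]
    alpha = solve(K, ys)
    mu = sum(k_star[i] * alpha[i] for i in range(n))
    v = solve(K, k_star)
    var = rbf(x_new, x_new) - sum(k_star[i] * v[i] for i in range(n))
    return mu, math.sqrt(max(var, 0.0))

def expected_improvement(mu, sigma, best):
    """EI acquisition: balances exploitation (high mu) and exploration (high sigma)."""
    if sigma < 1e-12:
        return 0.0
    z = (mu - best) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))
    return (mu - best) * cdf + sigma * pdf

# Seed the prior with three "true" observations, then iterate the BO loop.
xs = [0.0, 2.5, 5.0]
ys = [f(x) for x in xs]
grid = [i * 0.05 for i in range(101)]  # candidate points in [0, 5]
for _ in range(10):
    best = max(ys)
    # Acquisition step: pick the candidate maximising Expected Improvement.
    x_next = max(grid, key=lambda x: expected_improvement(*gp_posterior(xs, ys, x), best))
    xs.append(x_next)
    ys.append(f(x_next))  # one more expensive evaluation updates the posterior

best_x = xs[ys.index(max(ys))]
print(round(best_x, 2))
```

Each iteration spends one expensive evaluation where EI is highest, so the loop concentrates samples near the optimum while still probing uncertain regions; libraries such as BoTorch implement the same structure with far more capable surrogates and acquisition functions.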
Benchmarks
These leaderboards are used to track progress in Bayesian Optimisation
Libraries
Use these libraries to find Bayesian Optimisation models and implementations
Most implemented papers
Max-value Entropy Search for Efficient Bayesian Optimization
We propose a new criterion, Max-value Entropy Search (MES), that instead uses the information about the maximum function value.
HEBO: Pushing The Limits of Sample-Efficient Hyperparameter Optimisation
Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers.
Developing Optimal Causal Cyber-Defence Agents via Cyber Security Simulation
In this paper we explore cyber security defence, through the unification of a novel cyber security simulator with models for (causal) decision-making through optimisation.
Bayesian optimisation for fast approximate inference in state-space models with intractable likelihoods
We consider the problem of approximate Bayesian parameter inference in non-linear state-space models with intractable likelihoods.
Bayesian Optimisation over Multiple Continuous and Categorical Inputs
Efficient optimisation of black-box problems that comprise both continuous and categorical inputs is important, yet poses significant challenges.
On the Expressiveness of Approximate Inference in Bayesian Neural Networks
While Bayesian neural networks (BNNs) hold the promise of being flexible, well-calibrated statistical models, inference often requires approximations whose consequences are poorly understood.
BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization
Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design.
Neural Architecture Generator Optimization
Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention.
High-Dimensional Bayesian Optimisation with Variational Autoencoders and Deep Metric Learning
We introduce a method combining variational autoencoders (VAEs) and deep metric learning to perform Bayesian optimisation (BO) over high-dimensional and structured input spaces.
Are Random Decompositions all we need in High Dimensional Bayesian Optimisation?
Learning decompositions of expensive-to-evaluate black-box functions promises to scale Bayesian optimisation (BO) to high-dimensional problems.