no code implementations • 30 Sep 2022 • Juan Ungredda, Michael Pearce, Juergen Branke
Bayesian optimization is a powerful collection of methods for optimizing expensive stochastic black-box functions.
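As a generic illustration of the idea (not the specific method of this paper), a minimal Bayesian optimization loop fits a Gaussian-process surrogate to the evaluations collected so far and chooses the next point with an upper-confidence-bound acquisition. The kernel, length scale, and constants below are all illustrative choices:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """GP posterior mean and variance at query points (zero prior mean)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    # Prior variance is 1 (unit-amplitude kernel); subtract the explained part.
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 1e-12)

def bayes_opt(f, n_init=3, n_iter=10, seed=0):
    """Minimal BO loop on [0, 1] with a UCB acquisition (maximization)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, n_init)
    y = np.array([f(xi) for xi in x])
    grid = np.linspace(0.0, 1.0, 200)
    for _ in range(n_iter):
        mean, var = gp_posterior(x, y, grid)
        ucb = mean + 2.0 * np.sqrt(var)   # favor high mean or high uncertainty
        x_next = grid[np.argmax(ucb)]
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return x[np.argmax(y)], y.max()
```

For example, maximizing `f(x) = -(x - 0.7)**2` with this sketch concentrates samples near 0.7 after a handful of iterations.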
1 code implementation • 7 Jan 2022 • Michael Pearce, Elena A. Erosheva
We propose the Mallows-Binomial model to close this gap, which combines a Mallows' $\phi$ ranking model with Binomial score models through shared parameters that quantify object quality, a consensus ranking, and the level of consensus between judges.
1 code implementation • AABI Symposium 2021 • Metod Jazbec, Michael Pearce, Vincent Fortuin
Variational autoencoders often assume isotropic Gaussian priors and mean-field posteriors, hence do not exploit structure in scenarios where we may expect similarity or consistency across latent variables.
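The isotropic-prior, mean-field-posterior assumption mentioned above is what yields the familiar closed-form KL regularizer of a standard VAE. A minimal sketch of that term (generic background, not this paper's contribution):

```python
import numpy as np

def kl_to_isotropic_prior(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims.

    Closed-form regularizer used when a VAE pairs a mean-field Gaussian
    posterior with an isotropic standard-Gaussian prior; it treats every
    latent dimension (and every data point) independently.
    """
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)
```

When the posterior equals the prior (`mu = 0`, `log_var = 0`) the KL is exactly zero, which is why this term alone cannot encode similarity across latent variables.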
1 code implementation • 26 Oct 2020 • Metod Jazbec, Matthew Ashman, Vincent Fortuin, Michael Pearce, Stephan Mandt, Gunnar Rätsch
Conventional variational autoencoders fail to model correlations between data points due to their use of factorized priors.
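The contrast with a factorized prior can be sketched by drawing latent codes two ways: independently per data point, versus from a GP prior whose kernel correlates nearby points. The RBF kernel and length scale here are illustrative, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)           # e.g. time stamps of the data points
# RBF covariance: nearby time stamps get strongly correlated latents.
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 0.1) ** 2)

# Factorized prior: latents for different data points are independent.
z_indep = rng.standard_normal(50)

# GP prior: draw jointly via the Cholesky factor of the covariance,
# so the sample path varies smoothly across data points.
L = np.linalg.cholesky(K + 1e-6 * np.eye(50))
z_gp = L @ rng.standard_normal(50)
```

The GP draw `z_gp` is smooth across neighboring indices, while `z_indep` is white noise; this is the structure a factorized prior cannot express.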
1 code implementation • 20 Oct 2020 • Matthew Ashman, Jonathan So, Will Tebbutt, Vincent Fortuin, Michael Pearce, Richard E. Turner
Large, multi-dimensional spatio-temporal datasets are omnipresent in modern science and engineering.
no code implementations • 31 May 2020 • Juan Ungredda, Michael Pearce, Juergen Branke
In simulation optimisation in particular, when searching for an optimal solution, uncertainty in the inputs significantly affects the quality of the solution found.
no code implementations • NeurIPS 2021 • Michael Pearce, Janis Klaise, Matthew Groves
Bayesian optimization is a class of data-efficient, model-based algorithms typically focused on global optimization.
no code implementations • 21 Oct 2019 • Michael Pearce, Matthias Poloczek, Juergen Branke
Bayesian optimization is a powerful tool for expensive stochastic black-box optimization problems such as simulation-based optimization or machine learning hyperparameter tuning.
2 code implementations • NeurIPS 2019 • David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek
This motivates the design of a local probabilistic approach for global optimization of large-scale high-dimensional problems.
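A local probabilistic approach of this kind maintains a trust region that expands after consecutive successes and shrinks after consecutive failures. The sketch below shows such a resizing rule in the spirit of this paper's method; the tolerances and bounds are illustrative defaults, not necessarily the paper's exact settings:

```python
def update_trust_region(length, success, n_success, n_fail,
                        succ_tol=3, fail_tol=5,
                        length_min=0.5 ** 7, length_max=1.6):
    """One bookkeeping step for a trust-region side length.

    Doubles the region after `succ_tol` consecutive improving evaluations
    and halves it after `fail_tol` consecutive non-improving ones,
    clipping growth at `length_max`. Returns the updated state.
    """
    if success:
        n_success, n_fail = n_success + 1, 0
    else:
        n_success, n_fail = 0, n_fail + 1
    if n_success >= succ_tol:
        length, n_success = min(2.0 * length, length_max), 0
    elif n_fail >= fail_tol:
        length, n_fail = length / 2.0, 0
    return length, n_success, n_fail
```

In a full method, the search would restart with a fresh region once `length` falls below `length_min`, keeping each surrogate model local and cheap even in high dimensions.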