1 code implementation • 18 Apr 2024 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Saad Hamid, Harald Oberhauser, Michael A. Osborne
Parallelisation in Bayesian optimisation is a common strategy but faces several challenges: the need for flexibility in acquisition functions and kernel choices, the ability to handle discrete and continuous variables simultaneously, model misspecification, and, lastly, fast massive parallelisation.
no code implementations • 2 Feb 2024 • Juliusz Ziomek, Masaki Adachi, Michael A. Osborne
Previously proposed algorithms with the no-regret property could handle only the special case of unknown lengthscales and reproducing kernel Hilbert space norm, and applied only to the frequentist setting.
1 code implementation • 26 Oct 2023 • Masaki Adachi, Brady Planden, David A. Howey, Michael A. Osborne, Sebastian Orbell, Natalia Ares, Krikamol Muandet, Siu Lun Chau
Like many optimizers, Bayesian optimization often falls short of gaining user trust due to opacity.
1 code implementation • 9 Jun 2023 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Xingchen Wan, Vu Nguyen, Harald Oberhauser, Michael A. Osborne
Active learning parallelization is widely used, but typically relies on fixing the batch size throughout experimentation.
1 code implementation • 7 Feb 2023 • Joachim Schaeffer, Paul Gasper, Esteban Garcia-Tamayo, Raymond Gasper, Masaki Adachi, Juan Pablo Gaviria-Cardona, Simon Montoya-Bedoya, Anoushka Bhutani, Andrew Schiek, Rhys Goodall, Rolf Findeisen, Richard D. Braatz, Simon Engelke
Automatic identification of an equivalent circuit model (ECM) would substantially accelerate the analysis of large sets of electrochemical impedance spectroscopy (EIS) data.
1 code implementation • 27 Jan 2023 • Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne
Batch Bayesian optimisation and Bayesian quadrature have been shown to be sample-efficient methods of performing optimisation and quadrature where expensive-to-evaluate objective functions can be queried in parallel.
1 code implementation • 28 Oct 2022 • Masaki Adachi, Yannick Kuhn, Birger Horstmann, Arnulf Latz, Michael A. Osborne, David A. Howey
We show that popular model selection criteria, such as root-mean-square error and Bayesian information criterion, can fail to select a parsimonious model in the case of a multimodal posterior.
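For reference, a minimal sketch of the Bayesian information criterion mentioned above, under the common assumption of a Gaussian likelihood (so that the log-likelihood reduces to a residual-sum-of-squares term); the function names and data are illustrative, not taken from the paper's code:

```python
import numpy as np

def bic(y, y_pred, k):
    """Bayesian information criterion under a Gaussian likelihood:
    BIC = k * ln(n) + n * ln(RSS / n),
    where n is the number of observations, k the number of model
    parameters, and RSS the residual sum of squares."""
    y, y_pred = np.asarray(y, float), np.asarray(y_pred, float)
    n = len(y)
    rss = np.sum((y - y_pred) ** 2)
    return k * np.log(n) + n * np.log(rss / n)

# Illustrative comparison: with identical fit quality, the model
# with more parameters is penalised more heavily.
y = np.array([1.0, 2.0, 3.0, 4.0])
pred = np.array([1.1, 1.9, 3.2, 3.8])
bic_small = bic(y, pred, k=2)
bic_large = bic(y, pred, k=5)
```

The penalty term `k * ln(n)` is what makes BIC favour parsimony over raw fit; the paper's point is that this criterion can still mislead when the posterior over parameters is multimodal.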
2 code implementations • 9 Jun 2022 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne
Empirically, we find that our approach significantly outperforms both state-of-the-art BQ techniques and Nested Sampling in sampling efficiency on various real-world datasets, including lithium-ion battery analytics.
1 code implementation • NeurIPS Workshop AI4Science 2021 • Masaki Adachi
A material exploration model based on high-dimensional discrete Bayesian optimization is introduced.