1 code implementation • ICML 2018 • Changyong Oh, Efstratios Gavves, Max Welling
A major challenge in Bayesian Optimization is the boundary issue (Swersky, 2017) where an algorithm spends too many evaluations near the boundary of its search space.
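The boundary issue is easy to see numerically: in a high-dimensional hypercube, almost all of the volume lies near the boundary, so uniform exploration concentrates evaluations there. A minimal sketch (illustrative only, not from the paper; the dimension `d` and margin `eps` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
d, eps, n = 20, 0.05, 100_000

# Sample uniformly from the unit hypercube [0, 1]^d.
x = rng.random((n, d))

# A point is "near the boundary" if any coordinate is within eps of 0 or 1.
near = np.any((x < eps) | (x > 1 - eps), axis=1)
empirical = near.mean()

# Analytically, P(near boundary) = 1 - (1 - 2*eps)^d, which tends to 1 as d grows.
analytic = 1 - (1 - 2 * eps) ** d
print(f"empirical={empirical:.3f}, analytic={analytic:.3f}")
```

For `d = 20` and `eps = 0.05`, roughly 88% of uniform samples already fall near the boundary, which illustrates why unconstrained search spaces waste evaluations there.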
1 code implementation • NeurIPS 2019 • Changyong Oh, Jakub M. Tomczak, Efstratios Gavves, Max Welling
On this combinatorial graph, we propose an ARD diffusion kernel with which the GP is able to model high-order interactions between variables, leading to better performance.
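A diffusion kernel on a graph is defined through the eigendecomposition of the graph Laplacian, K = U exp(-beta * Lambda) U^T; the ARD variant gives each variable's subgraph its own beta. The following sketch computes a plain diffusion kernel on a toy graph (illustrative only; the path graph and beta value are arbitrary assumptions, not from the paper):

```python
import numpy as np

def diffusion_kernel(adjacency, beta):
    """Graph diffusion kernel K = U exp(-beta * Lambda) U^T,
    where L = D - A is the graph Laplacian with L = U Lambda U^T."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)
    return eigvecs @ np.diag(np.exp(-beta * eigvals)) @ eigvecs.T

# Path graph on 3 vertices as a toy combinatorial graph.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
K = diffusion_kernel(A, beta=0.5)
```

Since exp(-beta * lambda) is positive for every Laplacian eigenvalue, K is symmetric positive definite and can serve directly as a GP covariance over graph vertices.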
2 code implementations • 7 Feb 2019 • Changyong Oh, Kamil Adamczewski, Mijung Park
We propose a new variational family for Bayesian neural networks.
no code implementations • 25 Feb 2021 • Changyong Oh, Efstratios Gavves, Max Welling
In experiments, we demonstrate the improved sample efficiency of GP BO using FM kernels (BO-FM): on both synthetic problems and hyperparameter optimization problems, BO-FM consistently outperforms competitors.
1 code implementation • 26 Feb 2021 • Changyong Oh, Roberto Bondesan, Efstratios Gavves, Max Welling
In this work we propose a batch Bayesian optimization method for combinatorial problems on permutations, which is well suited for expensive-to-evaluate objectives.
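GP-based BO over permutations requires a positive-definite kernel on the permutation group. One standard choice from the literature (shown here for illustration; not necessarily the kernel used in this paper) is the Mallows kernel, exp(-lam * d_tau), where d_tau is the Kendall tau distance:

```python
import itertools
import numpy as np

def kendall_tau_distance(p, q):
    """Number of item pairs ordered differently by permutations p and q."""
    return sum(
        1
        for i, j in itertools.combinations(range(len(p)), 2)
        if (p[i] - p[j]) * (q[i] - q[j]) < 0
    )

def mallows_gram(perms, lam=0.5):
    """Gram matrix of the Mallows kernel k(p, q) = exp(-lam * d_tau(p, q))."""
    n = len(perms)
    K = np.empty((n, n))
    for a in range(n):
        for b in range(n):
            K[a, b] = np.exp(-lam * kendall_tau_distance(perms[a], perms[b]))
    return K

# All 6 permutations of 3 items; K is a valid GP covariance over them.
perms = list(itertools.permutations(range(3)))
K = mallows_gram(perms)
```

With such a Gram matrix, standard GP regression and batch acquisition machinery apply unchanged to permutation-valued inputs.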
no code implementations • 18 Jul 2022 • Changyong Oh, Roberto Bondesan, Dana Kianfar, Rehan Ahmed, Rishubh Khurana, Payal Agarwal, Romain Lepert, Mysore Sriram, Max Welling
Macro placement is the problem of placing memory blocks on a chip canvas.