1 code implementation • 3 Mar 2023 • Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson
We use Bayesian Optimization (BO) and propose a novel surrogate modeling approach for efficiently handling a large number of binary and categorical parameters.
1 code implementation • NeurIPS 2023 • Ryan-Rhys Griffiths, Leo Klarner, Henry B. Moss, Aditya Ravuri, Sang Truong, Samuel Stanton, Gary Tom, Bojana Rankovic, Yuanqi Du, Arian Jamasb, Aryan Deshwal, Julius Schwartz, Austin Tripp, Gregory Kell, Simon Frieder, Anthony Bourached, Alex Chan, Jacob Moss, Chengzhi Guo, Johannes Durholt, Saudamini Chaurasia, Felix Strieth-Kalthoff, Alpha A. Lee, Bingqing Cheng, Alán Aspuru-Guzik, Philippe Schwaller, Jian Tang
By defining such kernels in GAUCHE, we seek to open the door to powerful tools for uncertainty quantification and Bayesian optimisation in chemistry.
1 code implementation • 12 Apr 2022 • Syrine Belakaria, Aryan Deshwal, Nitthilan Kannappan Jayakodi, Janardhan Rao Doppa
We consider the problem of multi-objective (MO) blackbox optimization using expensive function evaluations, where the goal is to approximate the true Pareto set of solutions while minimizing the number of function evaluations.
1 code implementation • 2 Dec 2021 • Aryan Deshwal, Syrine Belakaria, Janardhan Rao Doppa, Dae Hyun Kim
First, BOPS-T employs a Gaussian process (GP) surrogate model with Kendall kernels and a tractable acquisition function optimization approach based on Thompson sampling to select the sequence of permutations for evaluation.
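A Kendall kernel measures similarity between two permutations as their Kendall tau correlation (concordant minus discordant pairs, normalized), which is known to be a valid positive-definite kernel on permutations. A minimal sketch, not the paper's full BOPS-T implementation:

```python
from itertools import combinations

def kendall_kernel(pi, sigma):
    """Kendall kernel between two permutations: the fraction of
    concordant minus discordant index pairs (Kendall tau correlation),
    a positive-definite kernel on the space of permutations."""
    n = len(pi)
    total = n * (n - 1) / 2
    s = 0
    for i, j in combinations(range(n), 2):
        # +1 if pi and sigma order the pair (i, j) the same way, else -1
        s += 1 if (pi[i] - pi[j]) * (sigma[i] - sigma[j]) > 0 else -1
    return s / total

print(kendall_kernel([0, 1, 2, 3], [0, 1, 2, 3]))  # 1.0 (identical)
print(kendall_kernel([0, 1, 2, 3], [3, 2, 1, 0]))  # -1.0 (reversed)
```

Plugging such a kernel into a GP gives a surrogate over permutation spaces, from which Thompson sampling can draw candidate permutations to evaluate.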
1 code implementation • NeurIPS 2021 • Aryan Deshwal, Janardhan Rao Doppa
The key idea is to define a novel structure-coupled kernel that explicitly integrates the structural information from decoded structures with the learned latent space representation for better surrogate modeling.
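One common way to couple two sources of similarity is to multiply a latent-space kernel by a structural kernel, since the product of two positive-definite kernels is positive definite. The sketch below is illustrative only, with a toy token-overlap stand-in for a real structure kernel; the paper's actual structure-coupled kernel may combine these quantities differently:

```python
import math

def latent_rbf(z1, z2, ls=1.0):
    """RBF kernel on learned latent-space vectors."""
    d2 = sum((a - b) ** 2 for a, b in zip(z1, z2))
    return math.exp(-d2 / (2 * ls**2))

def structure_overlap(s1, s2):
    """Toy structural kernel: normalized token overlap between two
    decoded structures (stand-in for a real string/graph kernel)."""
    a, b = set(s1), set(s2)
    return len(a & b) / max(len(a | b), 1)

def structure_coupled(z1, s1, z2, s2):
    """Couple latent similarity with decoded-structure similarity;
    the product of two PSD kernels remains PSD."""
    return latent_rbf(z1, z2) * structure_overlap(s1, s2)

k = structure_coupled([0.1, 0.2], "CCO", [0.1, 0.2], "CCO")  # 1.0
```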
4 code implementations • 13 Oct 2021 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
We consider the problem of black-box multi-objective optimization (MOO) using expensive function evaluations (also referred to as experiments), where the goal is to approximate the true Pareto set of solutions by minimizing the total resource cost of experiments.
1 code implementation • 8 Jun 2021 • Aryan Deshwal, Syrine Belakaria, Janardhan Rao Doppa
We develop a principled approach for constructing diffusion kernels over hybrid spaces by utilizing the additive kernel formulation, which allows additive interactions of all orders in a tractable manner.
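The additive kernel of all interaction orders can be computed from per-dimension base kernel values via elementary symmetric polynomials, using the Newton-Girard recursion, in time quadratic in the number of dimensions. A minimal sketch under that formulation (base kernels and weights here are illustrative choices, not the paper's exact configuration):

```python
import math

def rbf(x, y, ls=1.0):
    """Base kernel for a continuous dimension."""
    return math.exp(-((x - y) ** 2) / (2 * ls**2))

def diffusion_binary(a, b, beta=0.5):
    """Diffusion kernel on a binary (two-category) variable."""
    e = math.exp(-2 * beta)
    return (1 + e) / 2 if a == b else (1 - e) / 2

def additive_kernel(base_vals, weights=None):
    """Sum of elementary symmetric polynomials of the per-dimension
    base kernel values, via the Newton-Girard recursion: captures
    additive interactions of all orders in O(d^2) time."""
    d = len(base_vals)
    p = [sum(k**i for k in base_vals) for i in range(d + 1)]  # power sums
    e = [1.0] + [0.0] * d  # elementary symmetric polynomials
    for k in range(1, d + 1):
        e[k] = sum((-1) ** (i - 1) * e[k - i] * p[i] for i in range(1, k + 1)) / k
    w = weights or [1.0] * d
    return sum(w[k - 1] * e[k] for k in range(1, d + 1))

# hybrid point: one continuous and two binary dimensions
x, y = (0.3, 0, 1), (0.5, 0, 0)
vals = [rbf(x[0], y[0]), diffusion_binary(x[1], y[1]), diffusion_binary(x[2], y[2])]
k = additive_kernel(vals)
```

For identical inputs all base values are 1, so the unweighted kernel equals the sum of binomial coefficients C(d, 1) + ... + C(d, d) (7 for d = 3).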
1 code implementation • 14 Dec 2020 • Aryan Deshwal, Syrine Belakaria, Janardhan Rao Doppa
In this paper, we propose an efficient approach referred to as Mercer Features for Combinatorial Bayesian Optimization (MerCBO).
no code implementations • 14 Dec 2020 • Aryan Deshwal, Syrine Belakaria, Janardhan Rao Doppa, Alan Fern
We consider the problem of optimizing expensive black-box functions over discrete spaces (e.g., sets, sequences, graphs).
no code implementations • 2 Nov 2020 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
The overall goal is to approximate the true Pareto set of solutions by minimizing the resources consumed for function evaluations.
no code implementations • 12 Sep 2020 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
The key idea is to select the sequence of inputs and function approximations for multiple objectives that maximize the information gain per unit cost for the optimal Pareto front.
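A cost-normalized information-gain criterion of this kind can be sketched as follows, where the notation (acquisition \(\alpha\), fidelity/approximation subset \(\mathcal{Y}\), costs \(c_j\)) is illustrative and not taken from the paper:

```latex
\alpha(x, \mathcal{Y}) \;=\;
\frac{I\!\big(\{x, \mathbf{y}_{\mathcal{Y}}\};\, \mathcal{F}^{*} \mid D\big)}
     {\sum_{j \in \mathcal{Y}} c_j(x)}
```

Here \(\mathcal{F}^{*}\) denotes the optimal Pareto front, \(D\) the data observed so far, \(\mathbf{y}_{\mathcal{Y}}\) the evaluations of the selected function approximations at input \(x\), and \(c_j(x)\) their evaluation costs; the candidate maximizing \(\alpha\) is evaluated next.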
1 code implementation • 1 Sep 2020 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
We consider the problem of constrained multi-objective blackbox optimization using expensive function evaluations, where the goal is to approximate the true Pareto set of solutions satisfying a set of constraints while minimizing the number of function evaluations.
1 code implementation • 18 Aug 2020 • Aryan Deshwal, Syrine Belakaria, Janardhan Rao Doppa
Based on recent advances in submodular relaxation (Ito and Fujimaki, 2016) for solving Binary Quadratic Programs, we study an approach referred to as Parametrized Submodular Relaxation (PSR) towards the goal of improving the scalability and accuracy of solving AFO problems for the BOCS model.
1 code implementation • 16 Aug 2020 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
We consider the problem of constrained multi-objective (MO) blackbox optimization using expensive function evaluations, where the goal is to approximate the true Pareto set of solutions satisfying a set of constraints while minimizing the number of function evaluations.
1 code implementation • NeurIPS 2019 • Syrine Belakaria, Aryan Deshwal, Janardhan Rao Doppa
We consider the problem of multi-objective (MO) blackbox optimization using expensive function evaluations, where the goal is to approximate the true Pareto set of solutions by minimizing the number of function evaluations.