no code implementations • 3 Nov 2023 • Natalie Maus, Zhiyuan Jerry Lin, Maximilian Balandat, Eytan Bakshy
To effectively tackle these challenges, we introduce Joint Composite Latent Space Bayesian Optimization (JoCo), a novel framework that jointly trains neural network encoders and probabilistic models to adaptively compress high-dimensional input and output spaces into manageable latent representations.
no code implementations • NeurIPS 2023 • Sebastian Ament, Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian optimization and has found countless successful applications, but its performance is often exceeded by that of more recent methods.
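Expected Improvement has a simple closed form under a Gaussian posterior, which is worth seeing concretely. The sketch below is a generic textbook EI (for maximization), not code from this paper, and uses only the standard library:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Closed-form Expected Improvement for maximization.

    mu, sigma: posterior mean and standard deviation of the surrogate
               model at a candidate point.
    f_best:    best objective value observed so far.
    """
    if sigma <= 0.0:
        # No posterior uncertainty: improvement is deterministic.
        return max(mu - f_best, 0.0)
    z = (mu - f_best) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal PDF
    return (mu - f_best) * cdf + sigma * pdf
```

The two terms trade off exploitation (`(mu - f_best) * cdf`) against exploration (`sigma * pdf`); the numerical pathologies of this formula in low-uncertainty regimes are part of what the paper above addresses.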
no code implementations • 30 Mar 2023 • Mia Garrard, Hanson Wang, Ben Letham, Shaun Singh, Abbas Kazerouni, Sarah Tan, Zehui Wang, Yin Huang, Yichun Hu, Chad Zhou, Norm Zhou, Eytan Bakshy
Many organizations measure treatment effects via an experimentation platform to evaluate the causal effect of product variations prior to full-scale deployment.
1 code implementation • 28 Mar 2023 • Raul Astudillo, Zhiyuan Jerry Lin, Eytan Bakshy, Peter I. Frazier
Preferential Bayesian optimization (PBO) is a framework for optimizing a decision maker's latent utility function using preference feedback.
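In PBO the data are pairwise comparisons rather than function values, and the latent utility is typically fit through a preference likelihood. As a hedged illustration (a generic Bradley-Terry model, not the specific likelihood used in this paper):

```python
import math

def bradley_terry_nll(utilities, comparisons):
    """Negative log-likelihood of pairwise preference data under a
    Bradley-Terry model: P(i preferred over j) = sigmoid(u_i - u_j).

    utilities:   list of latent utility values, one per item.
    comparisons: list of (winner_index, loser_index) pairs.
    """
    nll = 0.0
    for winner, loser in comparisons:
        diff = utilities[winner] - utilities[loser]
        nll += math.log1p(math.exp(-diff))  # -log sigmoid(diff)
    return nll
```

Minimizing this quantity over the utilities (or, in the Bayesian setting, using it as the likelihood in the posterior) recovers a ranking consistent with the observed preferences.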
1 code implementation • 3 Mar 2023 • Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson
We use Bayesian Optimization (BO) and propose a novel surrogate modeling approach for efficiently handling a large number of binary and categorical parameters.
2 code implementations • 18 Oct 2022 • Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy
We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as that which maximizes the AF, and therefore, PR enjoys the same regret bounds as the original BO policy using the underlying AF.
1 code implementation • 21 Mar 2022 • Zhiyuan Jerry Lin, Raul Astudillo, Peter I. Frazier, Eytan Bakshy
We consider Bayesian optimization of expensive-to-evaluate experiments that generate vector-valued outcomes over which a decision-maker (DM) has preferences.
1 code implementation • 18 Mar 2022 • Benjamin Letham, Phillip Guan, Chase Tymms, Eytan Bakshy, Michael Shvartsman
We demonstrate a clear benefit to using this new class of acquisition functions on benchmark problems, and on a challenging real-world task of estimating a high-dimensional contrast sensitivity function.
1 code implementation • 3 Mar 2022 • Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy
Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions.
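The generic BO loop alternates between fitting a surrogate to the observations and maximizing an acquisition function to pick the next query. The sketch below is a deliberately crude stand-in (a 1-nearest-neighbour "surrogate" with distance-based uncertainty and a UCB-style rule, not a Gaussian process) just to show the loop's shape:

```python
import random

def bayesian_opt_loop(objective, candidates, n_init=3, n_iter=10, beta=2.0):
    """Sketch of the generic BO loop over a finite candidate set.

    A real implementation would fit a Gaussian process here; we use a
    nearest-neighbour mean plus distance-scaled uncertainty as a toy
    surrogate to keep the sketch dependency-free.
    """
    random.seed(0)
    observed = {}
    for x in random.sample(candidates, n_init):   # initial design
        observed[x] = objective(x)
    for _ in range(n_iter):
        def acquisition(x):
            nearest = min(observed, key=lambda o: abs(o - x))
            mean = observed[nearest]
            uncertainty = abs(nearest - x)        # grows away from data
            return mean + beta * uncertainty      # UCB-style score
        x_next = max((c for c in candidates if c not in observed),
                     key=acquisition, default=None)
        if x_next is None:                        # candidate set exhausted
            break
        observed[x_next] = objective(x_next)      # evaluate and update
    return max(observed, key=observed.get)        # best point found
```

The point of the sketch is the structure — initialize, fit, maximize acquisition, evaluate, repeat — which is shared by all the BO methods listed on this page.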
1 code implementation • 15 Feb 2022 • Samuel Daulton, Sait Cakmak, Maximilian Balandat, Michael A. Osborne, Enlu Zhou, Eytan Bakshy
In many manufacturing processes, the design parameters are subject to random input noise, resulting in a product that is often less performant than expected.
1 code implementation • NeurIPS 2021 • Raul Astudillo, Daniel R. Jiang, Maximilian Balandat, Eytan Bakshy, Peter I. Frazier
To overcome the shortcomings of existing approaches, we propose the budgeted multi-step expected improvement, a non-myopic acquisition function that generalizes classical expected improvement to the setting of heterogeneous and unknown evaluation costs.
no code implementations • 5 Nov 2021 • Han Wu, Sarah Tan, Weiwei Li, Mia Garrard, Adam Obeng, Drew Dimmery, Shaun Singh, Hanson Wang, Daniel Jiang, Eytan Bakshy
Black-box heterogeneous treatment effect (HTE) models are increasingly being used to create personalized policies that assign individuals to their optimal treatments.
no code implementations • 14 Oct 2021 • Igor L. Markov, Hanson Wang, Nitya Kasturi, Shaun Singh, Sze Wai Yuen, Mia Garrard, Sarah Tran, Yin Huang, Zehui Wang, Igor Glotov, Tanvi Gupta, Boshuang Huang, Peng Chen, Xiaowen Xie, Michael Belkin, Sal Uryasev, Sam Howie, Eytan Bakshy, Norm Zhou
Modern software systems and products increasingly rely on machine learning models to make data-driven decisions based on interactions with users, infrastructure and other systems.
no code implementations • 22 Sep 2021 • Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Many real world scientific and industrial applications require optimizing multiple competing black-box objectives.
2 code implementations • NeurIPS 2021 • Wesley J. Maddox, Maximilian Balandat, Andrew Gordon Wilson, Eytan Bakshy
However, the Gaussian Process (GP) models typically used as probabilistic surrogates for multi-task Bayesian Optimization scale poorly with the number of outcomes, greatly limiting applicability.
1 code implementation • NeurIPS 2021 • Samuel Daulton, Maximilian Balandat, Eytan Bakshy
We argue that, even in the noiseless setting, generating multiple candidates in parallel is an incarnation of EHVI with uncertainty in the Pareto frontier and therefore can be addressed using the same underlying technique.
1 code implementation • NeurIPS 2020 • Qing Feng, Ben Letham, Hongzi Mao, Eytan Bakshy
Contextual policies are used in many settings to customize system parameters and actions to the specifics of a particular setting.
no code implementations • 29 Nov 2020 • Hongseok Namkoong, Samuel Daulton, Eytan Bakshy
We propose a novel imitation-learning-based algorithm that distills a TS policy into an explicit policy representation by performing posterior inference and optimization offline.
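For context, a single Thompson sampling (TS) step for Bernoulli arms is just a draw from each arm's Beta posterior followed by an argmax — it is this implicit, sampling-based policy that the paper distills into an explicit one. A minimal sketch of plain TS (not the distillation algorithm itself):

```python
import random

def thompson_sampling(success, failure):
    """One Thompson sampling step for Bernoulli arms.

    success, failure: per-arm counts of observed successes and failures.
    Samples a win-rate from each arm's Beta(s+1, f+1) posterior and
    plays the arm with the largest sample.
    """
    samples = [random.betavariate(s + 1, f + 1)
               for s, f in zip(success, failure)]
    return max(range(len(samples)), key=samples.__getitem__)
```

Because the action is a random function of the posterior, there is no closed-form policy to inspect or deploy directly, which motivates distilling it offline as the paper describes.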
no code implementations • 28 Aug 2020 • Hongzi Mao, Shannon Chen, Drew Dimmery, Shaun Singh, Drew Blaisdell, Yuandong Tian, Mohammad Alizadeh, Eytan Bakshy
Client-side video players employ adaptive bitrate (ABR) algorithms to optimize user quality of experience (QoE).
1 code implementation • NeurIPS 2020 • Samuel Daulton, Maximilian Balandat, Eytan Bakshy
In many real-world scenarios, decision makers seek to optimize multiple competing objectives in a sample-efficient fashion.
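Multi-objective methods like the one above score candidate sets by the hypervolume their Pareto front dominates relative to a reference point. In two dimensions this reduces to a simple sweep; the sketch below is a generic 2-D hypervolume computation (for maximization), not the paper's differentiable estimator:

```python
def hypervolume_2d(points, ref):
    """Hypervolume dominated by a set of 2-D objective vectors
    (maximization), measured relative to reference point `ref`.

    points: iterable of (f1, f2) tuples; ref: (r1, r2) below/left of all points.
    """
    points = list(points)
    # Keep only non-dominated points (the Pareto front).
    front = [p for p in points
             if not any(q[0] >= p[0] and q[1] >= p[1] and q != p
                        for q in points)]
    front.sort(key=lambda p: p[0], reverse=True)  # descending in objective 1
    hv, prev_y = 0.0, ref[1]
    for x, y in front:
        hv += (x - ref[0]) * (y - prev_y)  # add the new horizontal slab
        prev_y = y
    return hv
```

Expected hypervolume improvement (EHVI) acquisition functions score a candidate by the expected increase in this quantity under the surrogate's posterior.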
1 code implementation • NeurIPS 2020 • Benjamin Letham, Roberto Calandra, Akshara Rai, Eytan Bakshy
We show empirically that properly addressing these issues significantly improves the efficacy of linear embeddings for BO on a range of problems, including learning a gait policy for robot locomotion.
no code implementations • 2 Nov 2019 • Samuel Daulton, Shaun Singh, Vashist Avadhanula, Drew Dimmery, Eytan Bakshy
Real-world applications frequently have constraints with respect to a currently deployed policy.
2 code implementations • NeurIPS 2020 • Maximilian Balandat, Brian Karrer, Daniel R. Jiang, Samuel Daulton, Benjamin Letham, Andrew Gordon Wilson, Eytan Bakshy
Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design.
no code implementations • 1 Apr 2019 • Benjamin Letham, Eytan Bakshy
To alleviate these constraints, we augment online experiments with an offline simulator and apply multi-task Bayesian optimization to tune live machine learning systems.
2 code implementations • 6 Feb 2018 • Matthias Feurer, Benjamin Letham, Frank Hutter, Eytan Bakshy
When hyperparameter optimization of a machine learning algorithm is repeated for multiple datasets it is possible to transfer knowledge to an optimization run on a new dataset.
no code implementations • 21 Jun 2017 • Benjamin Letham, Brian Karrer, Guilherme Ottoni, Eytan Bakshy
Randomized experiments are the gold standard for evaluating the effects of changes to real-world systems.
1 code implementation • 14 Jun 2017 • Dean Eckles, Eytan Bakshy
Naive observational estimators overstate peer effects by 320% and commonly used variables (e.g., demographics) offer little bias reduction, but adjusting for a measure of prior behaviors closely related to the focal behavior reduces bias by 91%.
no code implementations • 19 Jun 2012 • Eytan Bakshy, Dean Eckles, Rong Yan, Itamar Rosenn
This approach can increase ad efficacy for two main reasons: peers' affiliations reflect unobserved consumer characteristics, which are correlated along the social network; and the inclusion of social cues (i.e., peers' association with a brand) alongside ads affects responses via social influence processes.
Social and Information Networks • Physics and Society • Applications • J.4; H.1.2