Search Results for author: Eytan Bakshy

Found 28 papers, 16 papers with code

Joint Composite Latent Space Bayesian Optimization

no code implementations 3 Nov 2023 Natalie Maus, Zhiyuan Jerry Lin, Maximilian Balandat, Eytan Bakshy

To effectively tackle these challenges, we introduce Joint Composite Latent Space Bayesian Optimization (JoCo), a novel framework that jointly trains neural network encoders and probabilistic models to adaptively compress high-dimensional input and output spaces into manageable latent representations.

Bayesian Optimization

Unexpected Improvements to Expected Improvement for Bayesian Optimization

no code implementations NeurIPS 2023 Sebastian Ament, Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy

Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian optimization and has found countless successful applications, but its performance is often exceeded by that of more recent methods.

Bayesian Optimization
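For context, EI has a simple closed form under a Gaussian posterior; its tendency to become numerically flat far from the incumbent is one motivation for the variants this paper studies. A minimal plain-Python sketch (maximization convention; `mu` and `sigma` are the posterior mean and standard deviation at a candidate point), not the paper's proposed method:

```python
import math

def expected_improvement(mu, sigma, best_f):
    """Closed-form Expected Improvement for a Gaussian posterior (maximization)."""
    if sigma <= 0:
        # Degenerate posterior: improvement is deterministic.
        return max(mu - best_f, 0.0)
    z = (mu - best_f) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # standard normal density
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))            # standard normal CDF
    return sigma * (z * cdf + pdf)
```

At `mu == best_f` with `sigma == 1`, this evaluates to about 0.3989, the standard normal density at zero.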

Practical Policy Optimization with Personalized Experimentation

no code implementations 30 Mar 2023 Mia Garrard, Hanson Wang, Ben Letham, Shaun Singh, Abbas Kazerouni, Sarah Tan, Zehui Wang, Yin Huang, Yichun Hu, Chad Zhou, Norm Zhou, Eytan Bakshy

Many organizations measure treatment effects via an experimentation platform to evaluate the causal effect of product variations prior to full-scale deployment.

qEUBO: A Decision-Theoretic Acquisition Function for Preferential Bayesian Optimization

1 code implementation 28 Mar 2023 Raul Astudillo, Zhiyuan Jerry Lin, Eytan Bakshy, Peter I. Frazier

Preferential Bayesian optimization (PBO) is a framework for optimizing a decision maker's latent utility function using preference feedback.

Bayesian Optimization

Bayesian Optimization over High-Dimensional Combinatorial Spaces via Dictionary-based Embeddings

1 code implementation 3 Mar 2023 Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson

We use Bayesian Optimization (BO) and propose a novel surrogate modeling approach for efficiently handling a large number of binary and categorical parameters.

Bayesian Optimization

Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

2 code implementations 18 Oct 2022 Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy

We prove that under suitable reparameterizations, the BO policy that maximizes the probabilistic objective is the same as that which maximizes the AF, and therefore, PR enjoys the same regret bounds as the original BO policy using the underlying AF.

Bayesian Optimization
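For intuition about the consistency claim above: with a single binary parameter, the probabilistic objective can be written out exactly, and since it is linear in the Bernoulli probability, its maximizer sits at p = 0 or p = 1 and agrees with the discrete maximizer of the acquisition function. A toy sketch of that one-parameter case (grid search over p is an illustrative stand-in, not the paper's Monte-Carlo gradient estimator):

```python
def prob_objective(af, p):
    """Expected acquisition value when the binary parameter x ~ Bernoulli(p)."""
    return p * af(1) + (1 - p) * af(0)

def maximize_pr(af, grid=101):
    """Maximize the probabilistic objective over p on a grid, then recover a
    discrete x. Linearity in p puts the optimum at an endpoint, so the
    recovered x equals argmax over {0, 1} of af(x)."""
    best_p = max((i / (grid - 1) for i in range(grid)),
                 key=lambda p: prob_objective(af, p))
    return 1 if best_p >= 0.5 else 0
```

The real method handles many discrete and mixed parameters jointly and optimizes the continuous relaxation with gradients; this fragment only illustrates why the relaxation preserves the discrete optimum.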

Preference Exploration for Efficient Bayesian Optimization with Multiple Outcomes

1 code implementation 21 Mar 2022 Zhiyuan Jerry Lin, Raul Astudillo, Peter I. Frazier, Eytan Bakshy

We consider Bayesian optimization of expensive-to-evaluate experiments that generate vector-valued outcomes over which a decision-maker (DM) has preferences.

Bayesian Optimization

Look-Ahead Acquisition Functions for Bernoulli Level Set Estimation

1 code implementation 18 Mar 2022 Benjamin Letham, Phillip Guan, Chase Tymms, Eytan Bakshy, Michael Shvartsman

We demonstrate a clear benefit to using this new class of acquisition functions on benchmark problems, and on a challenging real-world task of estimating a high-dimensional contrast sensitivity function.

Sparse Bayesian Optimization

1 code implementation 3 Mar 2022 Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy

Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions.

Bayesian Optimization, Recommendation Systems

Robust Multi-Objective Bayesian Optimization Under Input Noise

1 code implementation 15 Feb 2022 Samuel Daulton, Sait Cakmak, Maximilian Balandat, Michael A. Osborne, Enlu Zhou, Eytan Bakshy

In many manufacturing processes, the design parameters are subject to random input noise, resulting in a product that is often less performant than expected.

Bayesian Optimization

Multi-Step Budgeted Bayesian Optimization with Unknown Evaluation Costs

1 code implementation NeurIPS 2021 Raul Astudillo, Daniel R. Jiang, Maximilian Balandat, Eytan Bakshy, Peter I. Frazier

To overcome the shortcomings of existing approaches, we propose the budgeted multi-step expected improvement, a non-myopic acquisition function that generalizes classical expected improvement to the setting of heterogeneous and unknown evaluation costs.

Bayesian Optimization

Interpretable Personalized Experimentation

no code implementations 5 Nov 2021 Han Wu, Sarah Tan, Weiwei Li, Mia Garrard, Adam Obeng, Drew Dimmery, Shaun Singh, Hanson Wang, Daniel Jiang, Eytan Bakshy

Black-box heterogeneous treatment effect (HTE) models are increasingly being used to create personalized policies that assign individuals to their optimal treatments.

Looper: An end-to-end ML platform for product decisions

no code implementations 14 Oct 2021 Igor L. Markov, Hanson Wang, Nitya Kasturi, Shaun Singh, Sze Wai Yuen, Mia Garrard, Sarah Tran, Yin Huang, Zehui Wang, Igor Glotov, Tanvi Gupta, Boshuang Huang, Peng Chen, Xiaowen Xie, Michael Belkin, Sal Uryasev, Sam Howie, Eytan Bakshy, Norm Zhou

Modern software systems and products increasingly rely on machine learning models to make data-driven decisions based on interactions with users, infrastructure and other systems.

Decision Making

Bayesian Optimization with High-Dimensional Outputs

2 code implementations NeurIPS 2021 Wesley J. Maddox, Maximilian Balandat, Andrew Gordon Wilson, Eytan Bakshy

However, the Gaussian Process (GP) models typically used as probabilistic surrogates for multi-task Bayesian Optimization scale poorly with the number of outcomes, greatly limiting applicability.

Bayesian Optimization

Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement

1 code implementation NeurIPS 2021 Samuel Daulton, Maximilian Balandat, Eytan Bakshy

We argue that, even in the noiseless setting, generating multiple candidates in parallel is an incarnation of EHVI with uncertainty in the Pareto frontier and therefore can be addressed using the same underlying technique.

Bayesian Optimization
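In two dimensions, the hypervolume quantities behind EHVI reduce to a simple rectangle sweep over the sorted front. A self-contained sketch of exact 2-D hypervolume and the improvement from adding one candidate (maximization; `ref` is the reference point; this is not the paper's batch, noise-aware computation):

```python
def hypervolume_2d(front, ref):
    """Area dominated by a 2-D front (maximization) relative to a reference point."""
    # Keep points that dominate the reference, sweep in decreasing x.
    pts = sorted((p for p in front if p[0] > ref[0] and p[1] > ref[1]), reverse=True)
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y > prev_y:  # non-dominated during the sweep
            hv += (x - ref[0]) * (y - prev_y)
            prev_y = y
    return hv

def hvi(candidate, front, ref):
    """Hypervolume improvement from adding a single candidate point."""
    return hypervolume_2d(front + [candidate], ref) - hypervolume_2d(front, ref)
```

For the front {(1, 2), (2, 1)} with reference (0, 0), the dominated area is 3.0, and adding the point (2, 2) improves it by 1.0.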

High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization

1 code implementation NeurIPS 2020 Qing Feng, Ben Letham, Hongzi Mao, Eytan Bakshy

Contextual policies are used in many settings to customize system parameters and actions to the specifics of a particular setting.

Bayesian Optimization

Distilled Thompson Sampling: Practical and Efficient Thompson Sampling via Imitation Learning

no code implementations 29 Nov 2020 Hongseok Namkoong, Samuel Daulton, Eytan Bakshy

We propose a novel imitation-learning-based algorithm that distills a TS policy into an explicit policy representation by performing posterior inference and optimization offline.

Action Generation, Imitation Learning +1
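The distillation idea can be illustrated with a Beta-Bernoulli bandit: run Thompson sampling's posterior sampling offline to estimate the arm-selection probabilities, then deploy that explicit probability table as the policy. A toy sketch only; the paper distills into a learned policy via imitation learning, and the lookup-table policy here is an assumption for illustration:

```python
import random

def distill_ts(successes, failures, n_samples=20000, seed=0):
    """Estimate, offline, how often Thompson sampling would pick each arm
    under Beta(successes + 1, failures + 1) posteriors, yielding an explicit
    stochastic policy (one selection probability per arm)."""
    rng = random.Random(seed)
    k = len(successes)
    counts = [0] * k
    for _ in range(n_samples):
        draws = [rng.betavariate(successes[i] + 1, failures[i] + 1) for i in range(k)]
        counts[max(range(k), key=draws.__getitem__)] += 1
    return [c / n_samples for c in counts]
```

With data strongly favoring one arm, e.g. `distill_ts([50, 5], [5, 50])`, the distilled policy concentrates almost all probability on the first arm, and serving it requires no posterior sampling at decision time.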

Re-Examining Linear Embeddings for High-Dimensional Bayesian Optimization

1 code implementation NeurIPS 2020 Benjamin Letham, Roberto Calandra, Akshara Rai, Eytan Bakshy

We show empirically that properly addressing these issues significantly improves the efficacy of linear embeddings for BO on a range of problems, including learning a gait policy for robot locomotion.

Bayesian Optimization +1
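A linear embedding turns a high-dimensional box-constrained problem into a search over a low-dimensional z with x = clip(Az); how that clipping and the choice of bounds interact with the embedding is among the issues this paper re-examines. A minimal sketch with random search standing in for the inner optimizer (the Gaussian matrix A, the `bound` on z, and random search itself are illustrative assumptions, not the paper's method):

```python
import random

def random_embedding_search(f, dim_high, dim_low, n_iters=200, bound=3.0, seed=0):
    """Minimize f over [-1, 1]^dim_high by random search in a random linear
    embedding: x = clip(A @ z) with Gaussian A and z in [-bound, bound]^dim_low."""
    rng = random.Random(seed)
    A = [[rng.gauss(0, 1) for _ in range(dim_low)] for _ in range(dim_high)]
    best_z, best_val = None, float("inf")
    for _ in range(n_iters):
        z = [rng.uniform(-bound, bound) for _ in range(dim_low)]
        # Map the low-dim point up and clip into the feasible box.
        x = [max(-1.0, min(1.0, sum(a * zi for a, zi in zip(row, z)))) for row in A]
        val = f(x)
        if val < best_val:
            best_z, best_val = z, val
    return best_z, best_val
```

The payoff is that when the objective has low effective dimensionality, e.g. depends only on a couple of the coordinates, the search happens in `dim_low` variables instead of `dim_high`.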

BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization

2 code implementations NeurIPS 2020 Maximilian Balandat, Brian Karrer, Daniel R. Jiang, Samuel Daulton, Benjamin Letham, Andrew Gordon Wilson, Eytan Bakshy

Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design.

Experimental Design

Bayesian Optimization for Policy Search via Online-Offline Experimentation

no code implementations 1 Apr 2019 Benjamin Letham, Eytan Bakshy

To alleviate these constraints, we augment online experiments with an offline simulator and apply multi-task Bayesian optimization to tune live machine learning systems.

Bayesian Optimization, BIG-bench Machine Learning

Practical Transfer Learning for Bayesian Optimization

2 code implementations 6 Feb 2018 Matthias Feurer, Benjamin Letham, Frank Hutter, Eytan Bakshy

When hyperparameter optimization of a machine learning algorithm is repeated for multiple datasets, it is possible to transfer knowledge to an optimization run on a new dataset.

Bayesian Optimization, Gaussian Processes +3
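One way to transfer such knowledge is to weight base-task surrogates by how well they rank the new dataset's observations, then combine their predictions with those weights. A simplified, deterministic ranking-loss weighting (the actual ensembles in this line of work use GP posteriors and sampled rankings; this deterministic variant is only a sketch of the weighting idea):

```python
from itertools import combinations

def ranking_weights(base_preds, target_vals):
    """Weight each base surrogate by the number of target-task observation
    pairs it orders correctly (more agreement -> higher weight)."""
    pairs = list(combinations(range(len(target_vals)), 2))
    weights = []
    for preds in base_preds:
        correct = sum(
            (preds[i] < preds[j]) == (target_vals[i] < target_vals[j])
            for i, j in pairs
        )
        weights.append(correct)
    total = sum(weights)
    if total == 0:  # no model ranks anything correctly: fall back to uniform
        return [1.0 / len(base_preds)] * len(base_preds)
    return [w / total for w in weights]
```

A base model whose predictions order the new dataset's observed values perfectly receives all the weight here, while one that ranks every pair backwards receives none.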

Constrained Bayesian Optimization with Noisy Experiments

no code implementations 21 Jun 2017 Benjamin Letham, Brian Karrer, Guilherme Ottoni, Eytan Bakshy

Randomized experiments are the gold standard for evaluating the effects of changes to real-world systems.

Bayesian Optimization

Bias and high-dimensional adjustment in observational studies of peer effects

1 code implementation 14 Jun 2017 Dean Eckles, Eytan Bakshy

Naive observational estimators overstate peer effects by 320% and commonly used variables (e.g., demographics) offer little bias reduction, but adjusting for a measure of prior behaviors closely related to the focal behavior reduces bias by 91%.

Causal Inference

Social Influence in Social Advertising: Evidence from Field Experiments

no code implementations 19 Jun 2012 Eytan Bakshy, Dean Eckles, Rong Yan, Itamar Rosenn

This approach can increase ad efficacy for two main reasons: peers' affiliations reflect unobserved consumer characteristics, which are correlated along the social network; and the inclusion of social cues (i.e., peers' association with a brand) alongside ads affects responses via social influence processes.

Social and Information Networks, Physics and Society, Applications, J.4; H.1.2
