Search Results for author: Chansoo Lee

Found 12 papers, 5 papers with code

Open Source Vizier: Distributed Infrastructure and API for Reliable and Flexible Blackbox Optimization

1 code implementation • 27 Jul 2022 • Xingyou Song, Sagi Perel, Chansoo Lee, Greg Kochanski, Daniel Golovin

Vizier is the de facto blackbox and hyperparameter optimization service across Google, having optimized some of Google's largest products and research efforts.

Hyperparameter Optimization • Transfer Learning

Towards Learning Universal Hyperparameter Optimizers with Transformers

1 code implementation • 26 May 2022 • Yutian Chen, Xingyou Song, Chansoo Lee, Zi Wang, Qiuyi Zhang, David Dohan, Kazuya Kawakami, Greg Kochanski, Arnaud Doucet, Marc'Aurelio Ranzato, Sagi Perel, Nando de Freitas

Meta-learning hyperparameter optimization (HPO) algorithms from prior experiments is a promising approach to improve optimization efficiency over objective functions from a similar distribution.

Hyperparameter Optimization • Meta-Learning

OmniPred: Language Models as Universal Regressors

1 code implementation • 22 Feb 2024 • Xingyou Song, Oscar Li, Chansoo Lee, Bangding Yang, Daiyi Peng, Sagi Perel, Yutian Chen

Across the broad landscape of experimental design, regression has been a powerful tool for accurately predicting the outcome metrics of a system or model given a set of parameters, but it has traditionally been restricted to methods applicable only to a specific task.

Experimental Design • Regression

Pre-trained Gaussian processes for Bayesian optimization

4 code implementations • 16 Sep 2021 • Zi Wang, George E. Dahl, Kevin Swersky, Chansoo Lee, Zachary Nado, Justin Gilmer, Jasper Snoek, Zoubin Ghahramani

Contrary to a common expectation that BO is suited to optimizing black-box functions, it actually requires domain knowledge about those functions to deploy BO successfully.

Bayesian Optimization • Gaussian Processes

Pre-training helps Bayesian optimization too

1 code implementation • 7 Jul 2022 • Zi Wang, George E. Dahl, Kevin Swersky, Chansoo Lee, Zelda Mariet, Zachary Nado, Justin Gilmer, Jasper Snoek, Zoubin Ghahramani

Contrary to a common belief that BO is suited to optimizing black-box functions, it actually requires domain knowledge on characteristics of those functions to deploy BO successfully.

Bayesian Optimization

Online Learning via the Differential Privacy Lens

no code implementations • NeurIPS 2019 • Jacob Abernethy, Young Hun Jung, Chansoo Lee, Audra McMillan, Ambuj Tewari

In this paper, we use differential privacy as a lens to examine online learning in both full and partial information settings.

Multi-Armed Bandits

Hardness of Online Sleeping Combinatorial Optimization Problems

no code implementations • NeurIPS 2016 • Satyen Kale, Chansoo Lee, Dávid Pál

We show that several online combinatorial optimization problems that admit efficient no-regret algorithms become computationally hard in the sleeping setting where a subset of actions becomes unavailable in each round.

Combinatorial Optimization • PAC learning

Spectral Smoothing via Random Matrix Perturbations

no code implementations • 10 Jul 2015 • Jacob Abernethy, Chansoo Lee, Ambuj Tewari

Smoothing the maximum eigenvalue function is important for applications in semidefinite optimization and online learning.
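A standard random-perturbation construction (given here as background; the paper's exact perturbation scheme may differ) smooths the maximum eigenvalue by averaging it over a random symmetric matrix $G$:

$\tilde{\lambda}_{\max}(X) = \mathbb{E}_{G}\left[\lambda_{\max}(X + \sigma G)\right]$

where $\sigma > 0$ trades off the smoothness of $\tilde{\lambda}_{\max}$ against its approximation error relative to $\lambda_{\max}$.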

Fighting Bandits with a New Kind of Smoothness

no code implementations • NeurIPS 2015 • Jacob Abernethy, Chansoo Lee, Ambuj Tewari

We define a novel family of algorithms for the adversarial multi-armed bandit problem, and provide a simple analysis technique based on convex smoothing.
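The paper's specific algorithm family isn't given in this snippet; as background for the same adversarial multi-armed bandit setting, here is a minimal sketch of the classic EXP3 algorithm (a standard baseline, not the paper's method — function name and parameters are illustrative):

```python
import math
import random

def exp3(n_arms, rewards, eta):
    """Classic EXP3 for adversarial multi-armed bandits (background
    sketch, not the paper's algorithm). rewards[t][i] is arm i's
    reward at round t, in [0, 1]; only the pulled arm is observed."""
    weights = [1.0] * n_arms
    total_reward = 0.0
    for round_rewards in rewards:
        s = sum(weights)
        probs = [w / s for w in weights]
        arm = random.choices(range(n_arms), weights=probs)[0]
        r = round_rewards[arm]
        total_reward += r
        # Importance-weighted estimate keeps the update unbiased
        # despite observing only the pulled arm.
        estimate = r / probs[arm]
        weights[arm] *= math.exp(eta * estimate)
    return total_reward
```

On a two-arm instance where one arm always pays 1 and the other 0, the learner's pull probability concentrates on the better arm within a few dozen rounds.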

Online Linear Optimization via Smoothing

no code implementations • 23 May 2014 • Jacob Abernethy, Chansoo Lee, Abhinav Sinha, Ambuj Tewari

We present a new optimization-theoretic approach to analyzing Follow-the-Leader style algorithms, particularly in the setting where perturbations are used as a tool for regularization.
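The perturbation-as-regularization idea can be illustrated with a minimal Follow-the-Perturbed-Leader sketch over a finite set of experts (an assumed, simplified setting; the function name and parameters are illustrative, not from the paper):

```python
import random

def ftpl_experts(loss_rounds, scale, seed=0):
    """Follow the Perturbed Leader: each round, play the expert whose
    cumulative loss minus a random perturbation is smallest. The
    perturbation acts as implicit regularization, stabilizing the
    otherwise brittle Follow-the-Leader rule."""
    rng = random.Random(seed)
    n = len(loss_rounds[0])
    # A single perturbation drawn up front ("be the perturbed leader")
    # keeps both the analysis and the code simple.
    perturb = [rng.uniform(0, scale) for _ in range(n)]
    cum = [0.0] * n
    total_loss = 0.0
    for losses in loss_rounds:
        leader = min(range(n), key=lambda i: cum[i] - perturb[i])
        total_loss += losses[leader]
        for i in range(n):
            cum[i] += losses[i]
    return total_loss
```

With two experts, one always incurring loss 0 and the other loss 1, the perturbed leader locks onto the better expert after at most a round or two, so the total loss stays bounded.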

Gradientless Descent: High-Dimensional Zeroth-Order Optimization

no code implementations • ICLR 2020 • Daniel Golovin, John Karro, Greg Kochanski, Chansoo Lee, Xingyou Song, Qiuyi Zhang

Zeroth-order optimization is the process of minimizing an objective $f(x)$, given oracle access to evaluations at adaptively chosen inputs $x$.

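As a rough illustration of the zeroth-order setting — a simplified sketch in the spirit of gradientless descent, not the paper's exact algorithm (which uses a more careful ball-sampling and radius schedule) — one can descend using only function evaluations:

```python
import math
import random

def gradientless_descent(f, x0, max_radius, steps, seed=0):
    """Simplified zeroth-order descent sketch: at each step, sweep a
    geometric ladder of radii, sample a candidate in a random
    direction at that radius, and move only if the objective
    improves. Uses only evaluations of f, never gradients."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    radii = [max_radius / (2 ** k) for k in range(5)]  # geometric sweep
    for _ in range(steps):
        for r in radii:
            # Random direction on the unit sphere, scaled to radius r.
            d = [rng.gauss(0, 1) for _ in x]
            norm = math.sqrt(sum(v * v for v in d))
            cand = [xi + r * di / norm for xi, di in zip(x, d)]
            fc = f(cand)
            if fc < fx:  # accept only strict improvements
                x, fx = cand, fc
                break
    return x, fx
```

On a simple sphere objective this monotonically decreases the function value down to roughly the scale of the smallest radius in the sweep.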
