Search Results for author: Sagi Perel

Found 5 papers, 3 papers with code

Open Source Vizier: Distributed Infrastructure and API for Reliable and Flexible Blackbox Optimization

1 code implementation • 27 Jul 2022 • Xingyou Song, Sagi Perel, Chansoo Lee, Greg Kochanski, Daniel Golovin

Vizier is the de-facto blackbox and hyperparameter optimization service across Google, having optimized some of Google's largest products and research efforts.

Hyperparameter Optimization • Transfer Learning
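As a pointer to what the open-sourced API looks like, here is a minimal sketch of the OSS Vizier Python client loop, following the pattern documented in the google/vizier repository; the evaluate function, parameter names, and study identifiers are illustrative, and the exact API may differ across releases:

    from vizier.service import clients
    from vizier.service import pyvizier as vz

    # Hypothetical objective to maximize; stands in for a real training run.
    def evaluate(learning_rate: float, num_layers: int) -> float:
        return -((learning_rate - 0.01) ** 2) + 0.1 * num_layers

    # Define the search space and the metric to optimize.
    study_config = vz.StudyConfig(algorithm='GAUSSIAN_PROCESS_BANDIT')
    study_config.search_space.root.add_float_param(
        'learning_rate', 1e-4, 1e-1, scale_type=vz.ScaleType.LOG)
    study_config.search_space.root.add_int_param('num_layers', 1, 5)
    study_config.metric_information.append(
        vz.MetricInformation('accuracy', goal=vz.ObjectiveMetricGoal.MAXIMIZE))

    # Creating the client implicitly starts a local Vizier service.
    study = clients.Study.from_study_config(
        study_config, owner='demo', study_id='vizier_example')

    # Standard suggest/complete loop: ask for trials, report results back.
    for _ in range(20):
        for suggestion in study.suggest(count=1):
            params = suggestion.parameters
            objective = evaluate(params['learning_rate'], params['num_layers'])
            suggestion.complete(vz.Measurement(metrics={'accuracy': objective}))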

Towards Learning Universal Hyperparameter Optimizers with Transformers

1 code implementation • 26 May 2022 • Yutian Chen, Xingyou Song, Chansoo Lee, Zi Wang, Qiuyi Zhang, David Dohan, Kazuya Kawakami, Greg Kochanski, Arnaud Doucet, Marc'Aurelio Ranzato, Sagi Perel, Nando de Freitas

Meta-learning hyperparameter optimization (HPO) algorithms from prior experiments is a promising approach to improve optimization efficiency over objective functions from a similar distribution.

Hyperparameter Optimization • Meta-Learning
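This paper frames hyperparameter optimization as sequence modeling over text, training a Transformer on serialized trajectories from prior experiments. The following toy sketch only illustrates the kind of trajectory-to-text serialization involved; the key/value format is a simplification, not the paper's actual tokenization:

    # Illustrative: flatten an HPO trajectory into a single text sequence
    # that a Transformer could be trained on or prompted with.
    def serialize_trajectory(metadata: dict, trials: list[dict]) -> str:
        header = ' '.join(f'{k}:{v}' for k, v in metadata.items())
        steps = []
        for trial in trials:
            params = ' '.join(f'{k}={v}' for k, v in trial['params'].items())
            steps.append(f'{params} -> y={trial["objective"]:.4f}')
        return header + ' | ' + ' | '.join(steps)

    text = serialize_trajectory(
        metadata={'task': 'cifar10', 'metric': 'accuracy'},
        trials=[
            {'params': {'lr': 0.01, 'layers': 2}, 'objective': 0.71},
            {'params': {'lr': 0.003, 'layers': 4}, 'objective': 0.78},
        ])
    # A meta-trained Transformer would be prompted with `text` and decoded
    # to produce the next suggested parameter values.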

OmniPred: Language Models as Universal Regressors

1 code implementation • 22 Feb 2024 • Xingyou Song, Oscar Li, Chansoo Lee, Bangding Yang, Daiyi Peng, Sagi Perel, Yutian Chen

Over the broad landscape of experimental design, regression has been a powerful tool for accurately predicting the outcome metrics of a system or model given a set of parameters, but it has traditionally been restricted to methods applicable only to a specific task.

Experimental Design • Regression
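OmniPred's central idea is to cast regression as text-to-text prediction: parameter configurations are serialized to strings and a language model decodes the outcome metric as text. A hypothetical sketch of that framing, where the serialization scheme and the model.generate call are stand-ins rather than the paper's actual interface:

    # Illustrative text-to-text regression in the spirit of OmniPred:
    # inputs are textual parameter configurations, targets are textual
    # metric values. `model` stands in for any pretrained seq2seq LM.
    def params_to_text(params: dict) -> str:
        # Sort keys for a deterministic serialization.
        return ','.join(f'{k}:{v}' for k, v in sorted(params.items()))

    def predict_metric(model, params: dict) -> float:
        prompt = params_to_text(params)   # e.g. "layers:4,lr:0.003"
        decoded = model.generate(prompt)  # hypothetical seq2seq call
        return float(decoded)             # the model emits the metric as text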

A Generalized Framework for Population Based Training

no code implementations • 5 Feb 2019 • Ang Li, Ola Spyra, Sagi Perel, Valentin Dalibard, Max Jaderberg, Chenjie Gu, David Budden, Tim Harley, Pramod Gupta

Population Based Training (PBT) is a recent approach that jointly optimizes neural network weights and hyperparameters by periodically copying the weights of the best performers and mutating hyperparameters during training.
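A condensed sketch of the exploit-and-explore step the abstract describes; population bookkeeping and the actual training code are omitted, and the quantile cutoff and mutation factors below are illustrative choices, not the paper's:

    import copy
    import random

    # One PBT update: workers in the bottom quantile copy (exploit) the
    # weights and hyperparameters of a top performer, then perturb
    # (explore) the hyperparameters before training resumes.
    def pbt_step(population: list[dict]) -> list[dict]:
        ranked = sorted(population, key=lambda w: w['score'], reverse=True)
        cutoff = max(1, len(ranked) // 4)  # illustrative 25% quantiles
        top, bottom = ranked[:cutoff], ranked[-cutoff:]
        for worker in bottom:
            source = random.choice(top)
            worker['weights'] = copy.deepcopy(source['weights'])  # exploit
            worker['hparams'] = {
                k: v * random.choice([0.8, 1.2])  # explore via mutation
                for k, v in source['hparams'].items()
            }
        return population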
