Search Results for author: Idan Amir

Found 5 papers, 0 papers with code

Better Best of Both Worlds Bounds for Bandits with Switching Costs

no code implementations • 7 Jun 2022 • Idan Amir, Guy Azov, Tomer Koren, Roi Livni

We study best-of-both-worlds algorithms for bandits with switching costs, a setting recently addressed by Rouyer, Seldin, and Cesa-Bianchi (2021).
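
As a rough illustration of the setting (not the paper's algorithm), the sketch below runs a batched EXP3-style learner on a simulated adversarial bandit and charges a unit cost per arm switch; the block length, step size, and unit switching cost are illustrative assumptions.

```python
import numpy as np

def batched_exp3_with_switching(losses, block_len=10, eta=0.1, seed=0):
    """Batched EXP3 on a (T, K) loss matrix; returns (cumulative loss, #switches)."""
    rng = np.random.default_rng(seed)
    T, K = losses.shape
    weights = np.ones(K)
    total_loss, switches, prev_arm = 0.0, 0, None
    for start in range(0, T, block_len):
        probs = weights / weights.sum()
        arm = rng.choice(K, p=probs)        # one arm per block keeps switches rare
        if prev_arm is not None and arm != prev_arm:
            switches += 1
        prev_arm = arm
        block = losses[start:start + block_len, arm]
        total_loss += block.sum()
        est = np.zeros(K)                   # importance-weighted loss estimate
        est[arm] = block.mean() / probs[arm]
        weights *= np.exp(-eta * est)
    return total_loss, switches

losses = np.random.default_rng(1).uniform(size=(1000, 3))
loss, switches = batched_exp3_with_switching(losses)
print(f"loss {loss:.1f} + switching cost {switches} = {loss + switches:.1f}")
```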

Thinking Outside the Ball: Optimal Learning with Gradient Descent for Generalized Linear Stochastic Convex Optimization

no code implementations • 27 Feb 2022 • Idan Amir, Roi Livni, Nathan Srebro

We consider linear prediction with a convex Lipschitz loss, or more generally, stochastic convex optimization problems of generalized linear form, i.e., where each instantaneous loss is a scalar convex function of a linear function.
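
To make the generalized linear form concrete, here is a minimal sketch in which each loss is phi(<w, x>, y) for a 1-Lipschitz convex phi (absolute loss, chosen for illustration), minimized by plain subgradient descent; the step size and iteration count are arbitrary stand-ins, not the paper's tuning.

```python
import numpy as np

def phi(z, y):        # 1-Lipschitz convex scalar loss (absolute loss, illustrative)
    return np.abs(z - y)

def phi_subgrad(z, y):
    return np.sign(z - y)

def gd_generalized_linear(X, y, steps=200, lr=0.1):
    """Subgradient descent on the empirical risk of a generalized linear loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        z = X @ w                                     # linear predictions <w, x_i>
        w -= lr * X.T @ phi_subgrad(z, y) / len(y)    # chain rule through phi
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5)
w = gd_generalized_linear(X, y)
print("mean loss:", np.mean(phi(X @ w, y)))
```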

Never Go Full Batch (in Stochastic Convex Optimization)

no code implementations • NeurIPS 2021 • Idan Amir, Yair Carmon, Tomer Koren, Roi Livni

We study the generalization performance of full-batch optimization algorithms for stochastic convex optimization: these are first-order methods that access only the exact gradient of the empirical risk (rather than gradients with respect to individual data points), and they include a wide range of algorithms such as gradient descent, mirror descent, and their regularized and/or accelerated variants.
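
The sketch below illustrates the access model being defined, assuming squared loss for concreteness: the method may query only the exact gradient of the empirical risk, never a gradient of an individual example. It is a toy instance, not the paper's construction.

```python
import numpy as np

def empirical_risk_grad(w, X, y):
    """Exact gradient of the empirical risk -- the only oracle allowed."""
    return X.T @ (X @ w - y) / len(y)

def full_batch_gd(X, y, steps=100, lr=0.1):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * empirical_risk_grad(w, X, y)   # one exact gradient per step
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10)
w = full_batch_gd(X, y)
print("empirical risk:", np.mean((X @ w - y) ** 2) / 2)
```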

SGD Generalizes Better Than GD (And Regularization Doesn't Help)

no code implementations • 1 Feb 2021 • Idan Amir, Tomer Koren, Roi Livni

We give a new separation result between the generalization performance of stochastic gradient descent (SGD) and of full-batch gradient descent (GD) in the fundamental stochastic convex optimization model.
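
For intuition only, here are the two update rules being compared, side by side on squared loss: multi-pass full-batch GD versus one-pass SGD. The loss, step sizes, and data are illustrative; the separation in the paper is established on a carefully designed hard instance, not on data like this.

```python
import numpy as np

def gd(X, y, steps=100, lr=0.1):
    """Full-batch GD: every step uses the exact empirical-risk gradient."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def one_pass_sgd(X, y, lr=0.01):
    """One-pass SGD: each example contributes a single stochastic gradient."""
    w = np.zeros(X.shape[1])
    for x_i, y_i in zip(X, y):
        w -= lr * (x_i @ w - y_i) * x_i
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)
for name, w in [("GD", gd(X, y)), ("SGD", one_pass_sgd(X, y))]:
    print(name, "empirical risk:", np.mean((X @ w - y) ** 2) / 2)
```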

Prediction with Corrupted Expert Advice

no code implementations • NeurIPS 2020 • Idan Amir, Idan Attias, Tomer Koren, Roi Livni, Yishay Mansour

We revisit the fundamental problem of prediction with expert advice, in a setting where the environment is benign and generates losses stochastically, but the feedback observed by the learner is subject to a moderate adversarial corruption.
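
A minimal sketch of this setting, with plain Hedge (multiplicative weights) standing in as an illustrative baseline rather than the algorithm analyzed in the paper: losses are stochastic, but the observed feedback is perturbed under a total corruption budget C.

```python
import numpy as np

def hedge_regret(true_losses, corruption, eta=0.1):
    """Hedge fed corrupted feedback; regret measured against the true losses."""
    T, K = true_losses.shape
    weights = np.ones(K)
    learner_loss = 0.0
    for t in range(T):
        probs = weights / weights.sum()
        learner_loss += probs @ true_losses[t]      # performance on true losses
        observed = true_losses[t] + corruption[t]   # but feedback is corrupted
        weights *= np.exp(-eta * observed)
    return learner_loss - true_losses.sum(axis=0).min()

rng = np.random.default_rng(0)
T, K, C = 500, 4, 50.0
true_losses = rng.uniform(size=(T, K))
corruption = np.zeros((T, K))
corruption[:100, 0] = C / 100        # adversary spends budget C against expert 0
print("regret vs best expert:", hedge_regret(true_losses, corruption))
```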
