Search Results for author: Amit Attia

Found 5 papers, 0 papers with code

A Note on High-Probability Analysis of Algorithms with Exponential, Sub-Gaussian, and General Light Tails

no code implementations · 5 Mar 2024 · Amit Attia, Tomer Koren

This short note describes a simple technique for analyzing probabilistic algorithms that rely on a light-tailed (but not necessarily bounded) source of randomization.

Stochastic Optimization

How Free is Parameter-Free Stochastic Optimization?

no code implementations · 5 Feb 2024 · Amit Attia, Tomer Koren

We study the problem of parameter-free stochastic optimization, asking whether, and under what conditions, fully parameter-free methods exist: methods that achieve convergence rates competitive with optimally tuned methods without requiring significant knowledge of the true problem parameters.

Stochastic Optimization

SGD with AdaGrad Stepsizes: Full Adaptivity with High Probability to Unknown Parameters, Unbounded Gradients and Affine Variance

no code implementations · 17 Feb 2023 · Amit Attia, Tomer Koren

We study Stochastic Gradient Descent with AdaGrad stepsizes: a popular adaptive (self-tuning) method for first-order stochastic optimization.

Stochastic Optimization
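The paper above studies SGD with AdaGrad stepsizes in its scalar ("AdaGrad-norm") form, where the stepsize shrinks with the accumulated squared gradient norms. A minimal sketch of that update rule, with illustrative function names and default constants chosen here (not taken from the paper):

```python
import numpy as np

def sgd_adagrad_norm(grad_fn, x0, eta=1.0, b0=1e-8, n_steps=100):
    """SGD with AdaGrad-norm stepsizes: eta_t = eta / sqrt(b0 + sum_s ||g_s||^2).

    The method is self-tuning: the effective stepsize adapts to the
    observed gradient magnitudes without knowing smoothness or noise
    parameters in advance.
    """
    x = np.asarray(x0, dtype=float)
    accum = b0  # running sum of squared gradient norms (b0 avoids division by zero)
    for _ in range(n_steps):
        g = np.asarray(grad_fn(x), dtype=float)
        accum += float(np.dot(g, g))
        x = x - (eta / np.sqrt(accum)) * g
    return x

# Toy usage: minimize f(x) = ||x||^2 / 2, whose (noiseless) gradient is x.
x_star = sgd_adagrad_norm(lambda x: x, x0=[5.0, -3.0], n_steps=500)
```

In the stochastic setting `grad_fn` would return a noisy gradient estimate; the paper's analysis concerns high-probability guarantees for exactly this kind of self-tuned stepsize under unbounded gradients and affine variance.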

Uniform Stability for First-Order Empirical Risk Minimization

no code implementations · 17 Jul 2022 · Amit Attia, Tomer Koren

We consider the problem of designing uniformly stable first-order optimization algorithms for empirical risk minimization.

Algorithmic Instabilities of Accelerated Gradient Descent

no code implementations · NeurIPS 2021 · Amit Attia, Tomer Koren

For convex quadratic objectives, Chen et al. (2018) proved that the uniform stability of the method grows quadratically with the number of optimization steps, and conjectured that the same is true for the general convex and smooth case.
