Search Results for author: Othmane Sebbouh

Found 6 papers, 1 paper with code

Structured Transforms Across Spaces with Cost-Regularized Optimal Transport

no code implementations • 9 Nov 2023 • Othmane Sebbouh, Marco Cuturi, Gabriel Peyré

Matching a source to a target probability measure is often cast as a linear optimal transport (OT) problem, parameterized by a ground cost function that quantifies the discrepancy between points.
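As context for the abstract above, here is a minimal sketch of the linear OT problem it refers to, solved as a linear program between two small discrete measures. The data, weights, and squared-Euclidean ground cost are illustrative assumptions, not the paper's cost-regularized formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Two discrete measures: source points x, target points y, uniform weights.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2))
y = rng.normal(size=(5, 2))
a = np.full(4, 1 / 4)   # source weights
b = np.full(5, 1 / 5)   # target weights

# Ground cost (an assumption here): squared Euclidean distance.
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)

# Linear OT: min_P <C, P>  s.t.  P 1 = a,  P^T 1 = b,  P >= 0.
n, m = C.shape
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1      # row-marginal constraints
for j in range(m):
    A_eq[n + j, j::m] = 1               # column-marginal constraints
res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
              bounds=(0, None), method="highs")
P = res.x.reshape(n, m)                 # optimal coupling
```

The coupling `P` transports mass from source to target while matching both marginals exactly.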

Randomized Stochastic Gradient Descent Ascent

no code implementations • 25 Nov 2021 • Othmane Sebbouh, Marco Cuturi, Gabriel Peyré

RSGDA can be parameterized using optimal loop sizes that guarantee the best convergence rates known to hold for SGDA.
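To illustrate the stochastic gradient descent ascent (SGDA) family the abstract refers to, here is a minimal randomized descent/ascent loop on a toy saddle problem. The objective, step size, and coin-flip parameterization are illustrative assumptions and should not be read as the paper's RSGDA algorithm or its optimal loop sizes.

```python
import numpy as np

# Toy saddle problem: min_x max_y f(x, y) = x*y + x**2/2 - y**2/2,
# strongly convex in x, strongly concave in y, saddle point at (0, 0).
def grad_x(x, y): return y + x
def grad_y(x, y): return x - y

rng = np.random.default_rng(1)
x, y = 2.0, -1.5
lr, p = 0.05, 0.5   # step size and descent probability (hypothetical values)

for _ in range(5000):
    if rng.random() < p:        # descent step on x with probability p ...
        x -= lr * grad_x(x, y)
    else:                       # ... otherwise an ascent step on y
        y += lr * grad_y(x, y)
# (x, y) drifts toward the saddle point (0, 0)
```

Randomizing which player updates at each step is one way to interpolate between single-loop and double-loop descent-ascent schemes.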

Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization

no code implementations • 20 Jun 2020 • Ahmed Khaled, Othmane Sebbouh, Nicolas Loizou, Robert M. Gower, Peter Richtárik

We showcase this by obtaining a simple formula for the optimal minibatch size of two variance reduced methods (\textit{L-SVRG} and \textit{SAGA}).
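The paper's setting is composite convex and smooth optimization with minibatch stochastic gradient methods. As a hedged sketch of that setting (not the paper's analysis or its optimal-minibatch formula), here is proximal minibatch SGD on an L1-regularized least-squares problem; the data, batch size, and step size are illustrative assumptions.

```python
import numpy as np

# Composite objective: min_x (1/n) sum_i 0.5*(a_i @ x - b_i)**2 + lam*||x||_1.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d)
lam, lr, batch = 0.01, 0.01, 16   # hypothetical hyperparameters

def prox_l1(v, t):
    """Soft-thresholding: the prox operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(d)
for _ in range(2000):
    idx = rng.choice(n, size=batch, replace=False)   # sample a minibatch
    g = A[idx].T @ (A[idx] @ x - b[idx]) / batch     # minibatch gradient of the smooth part
    x = prox_l1(x - lr * g, lr * lam)                # proximal (composite) step
```

The minibatch size trades per-iteration cost against gradient variance, which is why an optimal value exists for variance-reduced methods like L-SVRG and SAGA.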


SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation

no code implementations • 18 Jun 2020 • Robert M. Gower, Othmane Sebbouh, Nicolas Loizou

Stochastic Gradient Descent (SGD) is being used routinely for optimizing non-convex functions.

Almost sure convergence rates for Stochastic Gradient Descent and Stochastic Heavy Ball

no code implementations • 14 Jun 2020 • Othmane Sebbouh, Robert M. Gower, Aaron Defazio

We show that these results still hold when using stochastic line search and stochastic Polyak stepsizes, thereby giving the first proof of convergence of these methods in the non-overparametrized regime.
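The abstract mentions stochastic Polyak stepsizes. As a hedged illustration of that stepsize rule (not the paper's proof setting), here is SGD with the step `(f_i(x) - f_i*) / ||grad f_i(x)||^2` on an interpolating least-squares problem, where each per-sample optimum `f_i*` is zero; the data and iteration count are illustrative assumptions.

```python
import numpy as np

# Interpolating least squares: f_i(x) = 0.5*(a_i @ x - b_i)**2 with b = A @ x_star,
# so every f_i can be driven to f_i* = 0 simultaneously.
rng = np.random.default_rng(0)
n, d = 100, 20
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star                      # consistent system: interpolation holds

x = np.zeros(d)
for _ in range(3000):
    i = rng.integers(n)
    r = A[i] @ x - b[i]             # residual for sample i
    g = r * A[i]                    # stochastic gradient of f_i
    fi = 0.5 * r ** 2               # f_i(x), with f_i* = 0 here
    denom = g @ g
    if denom > 0:
        x -= (fi / denom) * g       # stochastic Polyak step
```

On this problem the Polyak step needs no tuning: the stepsize adapts automatically to the local curvature of each sampled loss.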

Towards closing the gap between the theory and practice of SVRG

1 code implementation • NeurIPS 2019 • Othmane Sebbouh, Nidham Gazagnadou, Samy Jelassi, Francis Bach, Robert M. Gower

Among the very first variance-reduced stochastic methods for solving the empirical risk minimization problem was the SVRG method (Johnson & Zhang, 2013).
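For reference, the basic SVRG scheme of Johnson & Zhang (2013) can be sketched as follows on a small least-squares ERM problem; the data, step size, and inner-loop length are illustrative assumptions, and this is the textbook method rather than the variants studied in the paper above.

```python
import numpy as np

# ERM objective: f(x) = (1/n) sum_i 0.5*(a_i @ x - b_i)**2.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
lr, m = 0.02, 2 * n                 # step size and inner-loop length (hypothetical)

def grad_i(x, i):
    return (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
for _ in range(30):                 # outer loop: refresh the full gradient
    x_ref = x.copy()
    full_grad = A.T @ (A @ x_ref - b) / n
    for _ in range(m):              # inner loop: variance-reduced steps
        i = rng.integers(n)
        v = grad_i(x, i) - grad_i(x_ref, i) + full_grad
        x -= lr * v
```

The control variate `grad_i(x_ref, i) - full_grad` shrinks the variance of the stochastic gradient as `x` approaches `x_ref`, which is what gives SVRG its fast convergence with a constant step size.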
