no code implementations • 9 Nov 2023 • Othmane Sebbouh, Marco Cuturi, Gabriel Peyré
Matching a source to a target probability measure is often solved by instantiating a linear optimal transport (OT) problem, parameterized by a ground cost function that quantifies the discrepancy between points.
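As a rough illustration of the linear OT problem described above, the sketch below solves the Kantorovich formulation, minimizing <C, P> over couplings P with fixed marginals, as a linear program. The uniform weights, random point clouds, and squared-Euclidean ground cost are illustrative assumptions, not choices from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative setup: n source points, m target points, squared-Euclidean ground cost.
rng = np.random.default_rng(0)
n, m = 5, 4
x, y = rng.normal(size=(n, 2)), rng.normal(size=(m, 2))
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # ground cost matrix (n x m)
a, b = np.full(n, 1 / n), np.full(m, 1 / m)          # uniform source/target weights

# Linear OT: minimize <C, P> over nonnegative P with row sums a and column sums b.
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0                 # row-sum (source marginal) constraints
for j in range(m):
    A_eq[n + j, j::m] = 1.0                          # column-sum (target marginal) constraints
res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]), bounds=(0, None))
P = res.x.reshape(n, m)                              # optimal transport plan
print("OT cost:", (C * P).sum())
```

A generic LP solver is used here only to keep the sketch self-contained; dedicated OT solvers scale far better in practice.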
no code implementations • 25 Nov 2021 • Othmane Sebbouh, Marco Cuturi, Gabriel Peyré
RSGDA can be parameterized using optimal loop sizes that guarantee the best convergence rates known to hold for SGDA.
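A minimal sketch of a randomized descent-ascent loop in the spirit of RSGDA, assuming access to gradient oracles `grad_x` and `grad_y` for a min-max objective (stochastic in the paper's setting). The coin-flip probability `p` controls the expected number of ascent steps between descent steps, standing in for the loop size tuned in the paper's analysis; all names and defaults are illustrative.

```python
import numpy as np

def rsgda(grad_x, grad_y, x, y, p=0.5, lr_x=0.01, lr_y=0.01, iters=1000, seed=0):
    """Loopless randomized SGDA sketch: each iteration flips a coin and takes
    either a descent step in x (prob. p) or an ascent step in y (prob. 1 - p).
    The expected number of ascent steps between descent steps is (1 - p) / p."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        if rng.random() < p:
            x = x - lr_x * grad_x(x, y)   # descent step on the min variable
        else:
            y = y + lr_y * grad_y(x, y)   # ascent step on the max variable
    return x, y

# Toy usage on the bilinear saddle problem f(x, y) = x @ y:
x, y = rsgda(lambda x, y: y, lambda x, y: x, np.ones(3), np.ones(3))
```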
no code implementations • 20 Jun 2020 • Ahmed Khaled, Othmane Sebbouh, Nicolas Loizou, Robert M. Gower, Peter Richtárik
We showcase this by obtaining a simple formula for the optimal minibatch size of two variance-reduced methods (L-SVRG and SAGA).
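For context, a minimal L-SVRG sketch on an illustrative least-squares problem. The minibatch size b = 8 and the snapshot probability p = b/n are arbitrary placeholders, not the optimal values given by the paper's formula.

```python
import numpy as np

# Illustrative problem: least squares f(w) = (1/2n) ||Aw - y||^2 (not from the paper).
rng = np.random.default_rng(0)
n, d, b = 200, 10, 8                                 # n data points, dimension d, minibatch size b
A, y = rng.normal(size=(n, d)), rng.normal(size=n)
grad_i = lambda w, idx: A[idx].T @ (A[idx] @ w - y[idx]) / len(idx)   # minibatch gradient

w = np.zeros(d)
snapshot, mu = w.copy(), grad_i(w, np.arange(n))     # reference point and its full gradient
gamma, p = 0.01, b / n                               # stepsize; snapshot prob. is a placeholder
for _ in range(2000):
    idx = rng.choice(n, size=b, replace=False)
    # variance-reduced minibatch gradient: unbiased, with variance shrinking near the snapshot
    g = grad_i(w, idx) - grad_i(snapshot, idx) + mu
    w -= gamma * g
    if rng.random() < p:                             # loopless snapshot update (L-SVRG)
        snapshot, mu = w.copy(), grad_i(w, np.arange(n))
```

The loopless coin flip replaces SVRG's fixed inner loop, which is what makes a single stepsize/minibatch analysis tractable.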
no code implementations • 18 Jun 2020 • Robert M. Gower, Othmane Sebbouh, Nicolas Loizou
Stochastic Gradient Descent (SGD) is routinely used for optimizing non-convex functions.
no code implementations • 14 Jun 2020 • Othmane Sebbouh, Robert M. Gower, Aaron Defazio
We show that these results still hold when using stochastic line search and stochastic Polyak stepsizes, thereby giving the first proof of convergence of these methods in the non-overparametrized regime.
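A minimal sketch of SGD with the stochastic Polyak stepsize gamma = (f_i(w) - f_i*) / (c ||grad f_i(w)||^2), assuming each per-sample minimum f_i* is known (taken to be 0 here, as in interpolation settings); the problem instance and constants are illustrative, not from the paper.

```python
import numpy as np

# Illustrative least-squares instance for SGD with the stochastic Polyak stepsize.
rng = np.random.default_rng(0)
n, d = 100, 5
A, y = rng.normal(size=(n, d)), rng.normal(size=n)
f_i = lambda w, i: 0.5 * (A[i] @ w - y[i]) ** 2      # per-sample loss
g_i = lambda w, i: (A[i] @ w - y[i]) * A[i]          # per-sample gradient
f_star_i = 0.0                                       # assumed per-sample minimum value
c = 0.5                                              # damping constant (an assumption)

w = np.zeros(d)
for _ in range(1000):
    i = rng.integers(n)
    g = g_i(w, i)
    denom = c * (g @ g)
    if denom > 1e-12:                                # skip near-stationary samples
        gamma = (f_i(w, i) - f_star_i) / denom       # stochastic Polyak stepsize
        w -= gamma * g
```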
1 code implementation • NeurIPS 2019 • Othmane Sebbouh, Nidham Gazagnadou, Samy Jelassi, Francis Bach, Robert M. Gower
Among the very first variance-reduced stochastic methods for solving the empirical risk minimization problem was the SVRG method (Johnson & Zhang 2013).
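For reference, a minimal sketch of the original double-loop SVRG of Johnson & Zhang (2013) on an illustrative least-squares problem; the stepsize and inner-loop length are assumptions, not tuned values.

```python
import numpy as np

# Illustrative least-squares instance for the classical double-loop SVRG.
rng = np.random.default_rng(0)
n, d = 200, 10
A, y = rng.normal(size=(n, d)), rng.normal(size=n)
grad = lambda w, i: (A[i] @ w - y[i]) * A[i]         # per-sample gradient
full_grad = lambda w: A.T @ (A @ w - y) / n          # full gradient

w_ref, gamma, m = np.zeros(d), 0.01, n               # snapshot, stepsize, inner-loop length
for epoch in range(20):
    mu = full_grad(w_ref)                            # full gradient at the snapshot
    w = w_ref.copy()
    for _ in range(m):
        i = rng.integers(n)
        # variance-reduced stochastic gradient: an unbiased estimate of full_grad(w)
        w -= gamma * (grad(w, i) - grad(w_ref, i) + mu)
    w_ref = w                                        # last-iterate snapshot update
```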