no code implementations • 22 May 2023 • Evgenii Chzhen, Sholom Schechtman
The core idea is first instantiated on the problem of minimizing sums of convex and Lipschitz functions and is then extended to the smooth case via variance reduction.
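To make the finite-sum setting concrete, here is a minimal SVRG-style variance-reduction sketch for f(x) = (1/n) Σᵢ fᵢ(x); the snapshot/control-variate update below is the textbook form, shown only to illustrate the general idea, not necessarily the paper's algorithm.

```python
import numpy as np

def svrg(grad_i, x0, n, step=0.1, epochs=20, inner=None, rng=None):
    """SVRG-style variance reduction for f(x) = (1/n) * sum_i f_i(x).

    grad_i(i, x) must return the gradient of the i-th component at x.
    """
    rng = np.random.default_rng(rng)
    inner = inner or n
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot acts as a control variate.
        full_grad = np.mean([grad_i(i, snapshot) for i in range(n)], axis=0)
        for _ in range(inner):
            i = rng.integers(n)
            # Unbiased gradient estimate whose variance vanishes as the
            # iterate approaches the snapshot point.
            g = grad_i(i, x) - grad_i(i, snapshot) + full_grad
            x -= step * g
    return x

# Toy usage: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)**2.
A, b = np.random.randn(50, 3), np.random.randn(50)
x_hat = svrg(lambda i, x: (A[i] @ x - b[i]) * A[i], np.zeros(3), n=50, step=0.01)
```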
no code implementations • 16 Mar 2023 • Sholom Schechtman, Daniil Tiapkin, Michael Muehlebach, Eric Moulines
We consider the problem of minimizing a non-convex function over a smooth manifold $\mathcal{M}$.
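As a generic illustration of optimization over a smooth manifold (not the paper's specific method), the sketch below runs Riemannian gradient descent on the unit sphere: project the Euclidean gradient onto the tangent space, take a step, then retract back onto the manifold by renormalizing.

```python
import numpy as np

def sphere_gradient_descent(grad_f, x0, step=0.1, iters=500):
    """Riemannian gradient descent on the unit sphere S^{d-1}."""
    x = np.asarray(x0, dtype=float)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        g = grad_f(x)
        riem_grad = g - (g @ x) * x      # project onto the tangent space at x
        x = x - step * riem_grad
        x /= np.linalg.norm(x)           # retraction: renormalize onto the sphere
    return x

# Example: minimizing x^T A x over the sphere recovers the eigenvector
# associated with the smallest eigenvalue of A.
A = np.diag([3.0, 1.0, 0.5])
x_min = sphere_gradient_descent(lambda x: 2 * A @ x, np.ones(3))
```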
no code implementations • 7 Nov 2022 • Louis Leconte, Sholom Schechtman, Eric Moulines
First, we formulate the training of quantized neural networks (QNNs) as a smoothed sequence of interval-constrained optimization problems.
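The sketch below illustrates the interval-constraint idea in a hypothetical form: weights are kept in shrinking intervals around the quantization levels via projected gradient steps, with the interval width eps annealed toward zero. The projection and the annealing schedule here are my own illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def project_to_intervals(w, levels, eps):
    """Project weights onto the union of intervals [q - eps, q + eps]
    around the quantization levels q (hypothetical illustration)."""
    levels = np.asarray(levels)
    # Nearest quantization level for each weight.
    nearest = levels[np.argmin(np.abs(w[..., None] - levels), axis=-1)]
    return np.clip(w, nearest - eps, nearest + eps)

def train_qnn_step(w, grad, lr, levels, eps):
    """One projected-gradient step; shrinking eps over training anneals
    the relaxed problem toward exact quantization."""
    return project_to_intervals(w - lr * grad, levels, eps)

# As eps -> 0, the feasible set collapses to the quantization levels.
w = np.random.randn(5)
for t in range(100):
    eps = 0.5 / (1 + t)        # hypothetical annealing schedule
    g = w - 0.3                # gradient of the toy loss 0.5 * ||w - 0.3||^2
    w = train_qnn_step(w, g, lr=0.1, levels=[-1.0, 1.0], eps=eps)
```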
no code implementations • 6 Sep 2021 • Sholom Schechtman
It was previously shown by Davis and Drusvyatskiy that every Clarke critical point of a generic semialgebraic (and, more generally, definable in an o-minimal structure) weakly convex function lies on an active manifold and is either a local minimum or an active strict saddle.
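A standard toy example of this phenomenon (my illustration, not taken from the paper) is f(x, y) = |x| - y²: the origin is a Clarke critical point lying on the active manifold {x = 0}, along which f strictly decreases, so it is an active strict saddle rather than a local minimum. A quick numerical check:

```python
import numpy as np

# f(x, y) = |x| - y**2: the origin is an active strict saddle.
def subgrad(p):
    x, y = p
    return np.array([np.sign(x), -2.0 * y])  # a Clarke subgradient of f

rng = np.random.default_rng(0)
p = rng.normal(scale=1e-8, size=2)   # generic tiny initialization
for _ in range(1000):
    p -= 0.01 * subgrad(p)
# |y| grows geometrically: the iterates escape along the active manifold
# {x = 0}, while x stays pinned near zero by the |x| term.
print(p)
```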
no code implementations • 4 Aug 2021 • Pascal Bianchi, Walid Hachem, Sholom Schechtman
Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
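For intuition (a generic sketch under my own toy assumptions, not code from the paper), plain stochastic subgradient descent on the definable weakly convex function f(x, y) = |x| + (y² - 1)² drifts away from the strict saddle at y = 0 and settles near a local minimizer (0, ±1):

```python
import numpy as np

def stochastic_subgradient(subgrad, x0, steps=10_000, a=0.5, noise=0.01, seed=0):
    """Stochastic subgradient descent with decreasing steps gamma_t = a / sqrt(t)
    and additive noise (a generic sketch of the scheme whose avoidance of
    active strict saddles the paper analyzes)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        g = subgrad(x) + noise * rng.normal(size=x.shape)
        x = x - (a / np.sqrt(t)) * g
    return x

# f(x, y) = |x| + (y**2 - 1)**2: saddle at y = 0, local minima at y = +/-1.
f_sub = lambda p: np.array([np.sign(p[0]), 4 * p[1] * (p[1]**2 - 1)])
print(stochastic_subgradient(f_sub, np.array([0.5, 0.1])))  # ends near (0, 1)
```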