Search Results for author: Sholom Schechtman

Found 5 papers, 0 papers with code

SignSVRG: fixing SignSGD via variance reduction

no code implementations · 22 May 2023 · Evgenii Chzhen, Sholom Schechtman

The core idea is first instantiated on the problem of minimizing sums of convex and Lipschitz functions and is then extended to the smooth case via variance reduction.
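A minimal sketch of how a sign-based step can be combined with an SVRG-style control variate (the least-squares objective, step size, and epoch schedule below are assumptions for illustration, not details from the paper):

```python
import numpy as np

# Illustrative problem (assumed): minimize f(x) = (1/n) * sum_i (a_i . x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d)          # consistent system, so a minimizer exists

def grad_i(x, i):
    # Gradient of the i-th summand (a_i . x - b_i)^2.
    return 2.0 * (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    # Full gradient of f at x.
    return (2.0 / n) * A.T @ (A @ x - b)

x = np.zeros(d)
eta = 0.01
for epoch in range(30):
    x_snap = x.copy()               # snapshot point for the control variate
    mu = full_grad(x_snap)          # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # SVRG-corrected stochastic gradient: unbiased, with reduced variance
        # when x is close to the snapshot.
        g = grad_i(x, i) - grad_i(x_snap, i) + mu
        x -= eta * np.sign(g)       # sign step, as in SignSGD
```

The variance-reduced estimate `g` makes the sign of the step agree with the sign of the true gradient far more often than the raw stochastic gradient would.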

AskewSGD: An Annealed interval-constrained Optimisation method to train Quantized Neural Networks

no code implementations · 7 Nov 2022 · Louis Leconte, Sholom Schechtman, Eric Moulines

First, we formulate the training of quantized neural networks (QNNs) as a smoothed sequence of interval-constrained optimization problems.
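A toy illustration of the interval-constrained idea with annealing (this is not the paper's AskewSGD algorithm; the linear model, quantization levels {-1, +1}, and schedule are all assumptions):

```python
import numpy as np

# Toy: fit a linear model whose true weights are already quantized to {-1, +1},
# while projecting the iterate into an interval around the nearest quantization
# level; the interval half-width is annealed to zero over training.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
w_true = np.array([1.0, -1.0, 1.0, -1.0])
y = X @ w_true

w = 0.1 * rng.normal(size=4)
eta = 0.05
T = 200
for t in range(T):
    g = 2.0 * X.T @ (X @ w - y) / len(X)   # gradient of the squared loss
    w -= eta * g
    eps = 1.0 - t / T                       # interval half-width, annealed to ~0
    q = np.sign(w)                          # nearest quantization level
    q[q == 0] = 1.0
    w = np.clip(w, q - eps, q + eps)        # project into the interval around q
```

Early on the intervals are wide, so the iterate moves almost freely; as `eps` shrinks, the constraint forces each weight onto its quantization level.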

Stochastic Subgradient Descent on a Generic Definable Function Converges to a Minimizer

no code implementations · 6 Sep 2021 · Sholom Schechtman

It was previously shown by Davis and Drusvyatskiy that every Clarke critical point of a generic, semialgebraic (and, more generally, definable in an o-minimal structure) weakly convex function lies on an active manifold and is either a local minimum or an active strict saddle.
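A minimal sketch of the method the title refers to, stochastic subgradient descent on a nonsmooth function (the least-absolute-deviations objective and the 1/sqrt(t) step size below are assumptions for illustration):

```python
import numpy as np

# Illustrative problem (assumed): minimize the nonsmooth convex function
# f(x) = (1/n) * sum_i |a_i . x - b_i|, whose minimum value is 0 here.
rng = np.random.default_rng(2)
n, d = 100, 3
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star                       # x_star is a minimizer with f(x_star) = 0

x = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)
    r = A[i] @ x - b[i]
    g = np.sign(r) * A[i]            # a subgradient of x -> |a_i . x - b_i|
    x -= (0.5 / np.sqrt(t)) * g      # diminishing step size
```

With diminishing steps, the iterate drifts into a neighborhood of the minimizer even though the objective is nondifferentiable along the residual kinks.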

Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

no code implementations · 4 Aug 2021 · Pascal Bianchi, Walid Hachem, Sholom Schechtman

Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
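A toy illustration of the saddle-avoidance phenomenon (an assumed smooth example, not from the paper): SGD initialized exactly at the strict saddle (0, 0) of f(x, y) = (x^2 - 1)^2 / 4 + y^2 / 2, where the gradient noise pushes the iterate off the unstable direction toward one of the minimizers (+-1, 0).

```python
import numpy as np

# f(x, y) = (x^2 - 1)^2 / 4 + y^2 / 2 has a strict saddle at the origin
# (Hessian diag(-1, 1)) and local minima at (+1, 0) and (-1, 0).
rng = np.random.default_rng(3)
z = np.zeros(2)                      # start exactly at the saddle point
eta = 0.05
for _ in range(2000):
    x, y = z
    grad = np.array([x**3 - x, y])   # exact gradient of f
    noise = 0.1 * rng.normal(size=2) # stochastic perturbation of the gradient
    z -= eta * (grad + noise)
```

A deterministic gradient step started at the saddle would stay there forever; the noise breaks the symmetry, and the negative curvature in x then amplifies the displacement until the iterate settles near a minimizer.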

Task: Stochastic Optimization
