Search Results for author: El Mahdi Chayti

Found 4 papers, 2 papers with code

Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods

no code implementations • 23 Feb 2023 • El Mahdi Chayti, Nikita Doikov, Martin Jaggi

Our helper framework offers the algorithm designer high flexibility in constructing and analyzing stochastic Cubic Newton methods: it allows batches of arbitrary size, permits noisy and possibly biased estimates of the gradients and Hessians, and incorporates both variance reduction and lazy Hessian updates.

Auxiliary Learning
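
The core step here is cubic-regularized Newton built from minibatch estimates: minimize a second-order model of the loss plus a cubic penalty on the step. Below is a minimal sketch on a toy least-squares problem; the batch size, the regularization constant M, and solving the cubic subproblem with a generic optimizer are illustrative assumptions, and the variance-reduction and lazy-Hessian components of the paper's framework are omitted.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Toy problem: least squares, f(x) = (1/2n) * ||A x - b||^2.
    n, d = 500, 10
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    b = A @ x_true + 0.1 * rng.standard_normal(n)

    def batch_estimates(x, idx):
        # Minibatch (hence noisy) estimates of the gradient and Hessian.
        Ab, bb = A[idx], b[idx]
        g = Ab.T @ (Ab @ x - bb) / len(idx)
        H = Ab.T @ Ab / len(idx)
        return g, H

    def cubic_step(x, g, H, M):
        # Approximately minimize the cubic model
        # m(s) = <g, s> + 0.5 <H s, s> + (M / 6) * ||s||^3.
        def model(s):
            return g @ s + 0.5 * s @ H @ s + (M / 6) * np.linalg.norm(s) ** 3
        return x + minimize(model, -g, method="BFGS").x

    x, M = np.zeros(d), 10.0  # M: assumed regularization constant
    for _ in range(30):
        idx = rng.choice(n, size=64, replace=False)  # arbitrary batch size
        g, H = batch_estimates(x, idx)
        x = cubic_step(x, g, H, M)
    print("distance to x_true:", np.linalg.norm(x - x_true))

Since the framework tolerates noisy and possibly biased estimates, the plain minibatch estimator above could be swapped for any estimator meeting the paper's assumptions.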

Second-order optimization with lazy Hessians

no code implementations • 1 Dec 2022 • Nikita Doikov, El Mahdi Chayti, Martin Jaggi

This provably improves the total arithmetical complexity of second-order algorithms by a factor of $\sqrt{d}$.
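
The lazy-Hessian pattern is easy to state: reuse one (expensive) Hessian for m consecutive second-order steps while recomputing the (cheap) gradient at every step. Below is a minimal sketch on regularized logistic regression; the plain damped-Newton update and the choice m = 5 are assumptions for illustration, while the paper analyzes cubic-regularized methods, where refreshing roughly every d iterations yields the $\sqrt{d}$ saving.

    import numpy as np

    rng = np.random.default_rng(1)
    n, d = 200, 20
    A = rng.standard_normal((n, d))
    y = 2.0 * rng.integers(0, 2, n) - 1.0  # labels in {-1, +1}

    def grad(x):
        # Regularized logistic-loss gradient: cheap, O(n d) per call.
        p = 1.0 / (1.0 + np.exp(y * (A @ x)))
        return -(A.T @ (y * p)) / n + 1e-2 * x

    def hess(x):
        # Hessian: expensive, O(n d^2) per call.
        p = 1.0 / (1.0 + np.exp(y * (A @ x)))
        return (A.T * (p * (1 - p))) @ A / n + 1e-2 * np.eye(d)

    def lazy_newton(x0, m=5, T=30):
        # Refresh the Hessian only every m iterations, reuse it in between.
        x = x0.copy()
        for t in range(T):
            if t % m == 0:
                H = hess(x)  # the only expensive step, done T/m times
            x = x - np.linalg.solve(H, grad(x))  # stale H otherwise
        return x

    x = lazy_newton(np.zeros(d))
    print("gradient norm:", np.linalg.norm(grad(x)))

In practice one would also cache a factorization of H across the m steps, so the per-step linear solve gets cheaper as well.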

Optimization with Access to Auxiliary Information

1 code implementation • 1 Jun 2022 • El Mahdi Chayti, Sai Praneeth Karimireddy

We investigate the fundamental optimization question of minimizing a target function $f$ whose gradients are expensive to compute or of limited availability, given access to an auxiliary side function $h$ whose gradients are cheap to compute or more readily available.

Federated Learning • Transfer Learning
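
One natural way to exploit such a helper: descend on $h$'s cheap gradient plus a bias correction that is refreshed with an occasional expensive gradient of $f$. The sketch below uses a hypothetical shifted-quadratic toy pair and a refresh period K of my choosing; in this toy the correction is exact because $f$ and $h$ share a Hessian, while in general it only controls the bias up to the dissimilarity between the two gradients.

    import numpy as np

    rng = np.random.default_rng(2)
    d = 10
    a = rng.standard_normal(d)             # optimum of the target f
    b = a + 0.3 * rng.standard_normal(d)   # optimum of the auxiliary h

    def grad_f(x):  # expensive / rarely available gradient
        return x - a

    def grad_h(x):  # cheap / abundant gradient
        return x - b

    def aux_gd(x0, lr=0.3, K=10, T=100):
        # Step with grad_h every iteration; refresh the correction
        # c = grad_f(anchor) - grad_h(anchor) with a rare grad_f call.
        x, c = x0.copy(), np.zeros(d)
        for t in range(T):
            if t % K == 0:  # only T/K expensive gradient evaluations
                c = grad_f(x) - grad_h(x)
            x = x - lr * (grad_h(x) + c)
        return x

    x = aux_gd(np.zeros(d))
    print("distance to optimum of f:", np.linalg.norm(x - a))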

Linear Speedup in Personalized Collaborative Learning

1 code implementation • 10 Nov 2021 • El Mahdi Chayti, Sai Praneeth Karimireddy, Sebastian U. Stich, Nicolas Flammarion, Martin Jaggi

Collaborative training can improve the accuracy of a model for a user by trading off the model's bias (introduced by using data from other users who are potentially different) against its variance (due to the limited amount of data on any single user).

Federated Learning • Stochastic Optimization
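
The bias-variance tradeoff the abstract describes can be seen in a few lines: averaging stochastic gradients from several similar users cuts the variance, at the cost of a bias from their differences. The quadratic per-user losses and the fixed uniform weights below are assumptions for illustration; the paper's algorithm instead adapts how much each user relies on the others.

    import numpy as np

    rng = np.random.default_rng(3)
    d, n_users = 5, 8
    theta = rng.standard_normal(d)  # my true model; others are similar
    thetas = np.vstack([theta, theta + 0.1 * rng.standard_normal((n_users - 1, d))])

    def stoch_grad(x, u):
        # Noisy gradient of user u's loss 0.5 * ||x - thetas[u]||^2.
        return x - thetas[u] + rng.standard_normal(d)

    def collab_sgd(weights, lr=0.05, T=500):
        # SGD with a weighted average of all users' stochastic gradients:
        # collaboration lowers variance but adds bias if users differ.
        x = np.zeros(d)
        for _ in range(T):
            g = sum(w * stoch_grad(x, u) for u, w in enumerate(weights))
            x -= lr * g
        return np.linalg.norm(x - theta)  # error on MY objective

    print(f"alone:         {collab_sgd(np.eye(n_users)[0]):.3f}")
    print(f"collaborative: {collab_sgd(np.ones(n_users) / n_users):.3f}")

With users this similar, the uniform-weight run typically ends closer to my optimum than training alone, which is the linear-speedup regime the title refers to.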
