Search Results for author: Constantin Philippenko

Found 5 papers, 4 papers with code

In-depth Analysis of Low-rank Matrix Factorisation in a Federated Setting

1 code implementation • 13 Sep 2024 • Constantin Philippenko, Kevin Scaman, Laurent Massoulié

We provide a linear rate of convergence of the excess loss which depends on $\sigma_{\max} / \sigma_{r}$, where $\sigma_{r}$ is the $r^{\mathrm{th}}$ singular value of the concatenation $\mathbf{S}$ of the matrices $(\mathbf{S}^i)_{i=1}^N$.
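The ratio $\sigma_{\max} / \sigma_{r}$ in the rate above can be computed directly from the stacked data. A minimal illustrative sketch (not the paper's code; the matrix sizes, rank $r$, and random data are assumptions):

```python
import numpy as np

# Hypothetical setup: N clients, each holding an (n x d) local matrix S^i.
rng = np.random.default_rng(0)
N, n, d, r = 4, 20, 10, 3

local_matrices = [rng.standard_normal((n, d)) for _ in range(N)]
S = np.concatenate(local_matrices, axis=0)   # row-wise concatenation of the (S^i)

sigma = np.linalg.svd(S, compute_uv=False)   # singular values, in descending order
ratio = sigma[0] / sigma[r - 1]              # sigma_max / sigma_r governs the rate
print(ratio)
```

A larger ratio (an ill-conditioned top-$r$ spectrum) corresponds to a slower linear rate.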

Compressed and distributed least-squares regression: convergence rates with applications to Federated Learning

no code implementations • 2 Aug 2023 • Constantin Philippenko, Aymeric Dieuleveut

In this paper, we investigate the impact of compression on stochastic gradient algorithms for machine learning, a technique widely used in distributed and federated learning.
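To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of an unbiased random-sparsification compressor applied to the gradients of a least-squares SGD loop; all names, sizes, and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 200, 5, 0.5                          # samples, dimension, keep-probability

X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = X @ w_star                                 # noiseless least-squares targets

def compress(g, p, rng):
    """Keep each coordinate with probability p, rescale by 1/p (unbiased)."""
    mask = rng.random(g.shape) < p
    return np.where(mask, g / p, 0.0)

w = np.zeros(d)
lr = 0.01
for t in range(2000):
    i = rng.integers(n)
    grad = (X[i] @ w - y[i]) * X[i]            # stochastic least-squares gradient
    w -= lr * compress(grad, p, rng)           # compressed SGD step

print(np.linalg.norm(w - w_star))
```

The compressor is unbiased but inflates the gradient variance by a factor of roughly $1/p$, which is the kind of trade-off the convergence rates in the paper quantify.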

Federated Learning • Regression

Preserved central model for faster bidirectional compression in distributed settings

2 code implementations • NeurIPS 2021 • Constantin Philippenko, Aymeric Dieuleveut

To obtain this improvement, we design MCM, an algorithm such that the downlink compression only impacts local models, while the global model is preserved.
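The key design point above can be sketched as follows. This is a rough illustration of the stated idea, not the authors' MCM code: the server keeps an uncompressed global model and compresses only the downlink copy sent to workers; the compressor, local quadratic objectives, and sizes are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_workers, lr, p = 10, 3, 0.1, 0.5

def compress(v, p, rng):
    """Unbiased random sparsification: keep coords w.p. p, rescale by 1/p."""
    mask = rng.random(v.shape) < p
    return np.where(mask, v / p, 0.0)

targets = [rng.standard_normal(d) for _ in range(n_workers)]  # local optima
w_global = np.zeros(d)               # global model: preserved, never compressed

for t in range(500):
    w_down = compress(w_global, p, rng)           # downlink: workers receive a compressed copy
    grads = [w_down - tgt for tgt in targets]     # local gradients at the compressed model
    up = [compress(g, p, rng) for g in grads]     # uplink compression
    w_global -= lr * np.mean(up, axis=0)          # update applied to the exact global model

print(np.linalg.norm(w_global - np.mean(targets, axis=0)))
```

Only the workers' view of the model is degraded by the downlink compressor; the server-side iterate itself is never compressed.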

Model Compression

Bidirectional compression in heterogeneous settings for distributed or federated learning with partial participation: tight convergence guarantees

1 code implementation • 25 Jun 2020 • Constantin Philippenko, Aymeric Dieuleveut

We introduce Artemis, a framework to tackle the problem of learning in a distributed or federated setting with communication constraints and partial device participation.

Federated Learning
