no code implementations • ICML 2020 • Grigory Malinovsky, Dmitry Kovalev, Elnur Gasanov, Laurent Condat, Peter Richtárik
Most algorithms for solving optimization problems or finding saddle points of convex-concave functions are fixed-point algorithms.
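As a concrete illustration of this viewpoint (a minimal sketch, not taken from the paper): gradient descent on a smooth function $f$ is the fixed-point iteration of the operator $T(x) = x - \gamma \nabla f(x)$, whose fixed points are exactly the minimizers of $f$.

```python
import numpy as np

# Sketch: gradient descent as a fixed-point iteration.
# For f(x) = 0.5*||Ax - b||^2, the operator T(x) = x - gamma * A^T (A x - b)
# has the least-squares solutions as its fixed points; iterating T converges
# for a small enough step size gamma.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

gamma = 1.0 / np.linalg.norm(A.T @ A, 2)  # safe step: 1 / Lipschitz constant

def T(x):
    """One application of the fixed-point operator (a gradient step on f)."""
    return x - gamma * A.T @ (A @ x - b)

x = np.zeros(5)
for _ in range(2000):
    x = T(x)

x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # closed-form minimizer
print(np.allclose(x, x_star, atol=1e-6))
```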
no code implementations • 14 Mar 2024 • Kai Yi, Georg Meinhardt, Laurent Condat, Peter Richtárik
Federated Learning (FL) has garnered increasing attention due to its unique characteristic of allowing heterogeneous clients to process their private data locally and interact with a central server, while preserving privacy.
no code implementations • 7 Mar 2024 • Laurent Condat, Artavazd Maranjyan, Peter Richtárik
In distributed optimization and learning, and even more so in the modern framework of federated learning, communication is critical, because it is slow and costly.
no code implementations • 12 Oct 2023 • Luyao Guo, Sulaiman A. Alghunaim, Kun Yuan, Laurent Condat, Jinde Cao
We demonstrate that the leading communication complexity of ProxSkip is $\mathcal{O}\left(\frac{p\sigma^2}{n\epsilon^2}\right)$ for non-convex and convex settings, and $\mathcal{O}\left(\frac{p\sigma^2}{n\epsilon}\right)$ for the strongly convex setting, where $n$ represents the number of nodes, $p$ denotes the probability of communication, $\sigma^2$ signifies the level of stochastic noise, and $\epsilon$ denotes the desired accuracy level.
1 code implementation • 19 Jul 2023 • Guillaume Perez, Laurent Condat, Michel Barlaud
In this paper, we introduce a new projection algorithm for the $\ell_{1,\infty}$ norm ball.
1 code implementation • 22 May 2023 • Kai Yi, Laurent Condat, Peter Richtárik
Federated Learning is an evolving machine learning paradigm, in which multiple clients perform computations based on their individual private data, interspersed by communication with a remote server.
1 code implementation • 20 Feb 2023 • Laurent Condat, Ivan Agarský, Grigory Malinovsky, Peter Richtárik
In federated learning, a large number of users collaborate to learn a global model.
no code implementations • 24 Oct 2022 • Laurent Condat, Ivan Agarský, Peter Richtárik
In federated learning, a large number of users collaborate on a global learning task.
1 code implementation • 3 Sep 2022 • Daniele Picone, Mauro Dalla Mura, Laurent Condat
Novel optical imaging devices allow for hybrid acquisition modalities such as compressed acquisitions with locally different spatial and spectral resolutions captured by a single focal plane array.
1 code implementation • 9 May 2022 • Laurent Condat, Kai Yi, Peter Richtárik
Our general approach works with a new, larger class of compressors, which has two parameters, the bias and the variance, and includes unbiased and biased compressors as particular cases.
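For intuition (an illustrative sketch, not the paper's construction): a standard member of the unbiased subclass is random-$k$ sparsification, which keeps $k$ coordinates chosen uniformly at random and rescales them so the compressor has zero bias.

```python
import numpy as np

# Sketch: random-k sparsification, a classical *unbiased* compressor.
# Each coordinate survives with probability k/d; rescaling by d/k gives
# E[C(x)] = x, at the price of variance growing like (d/k - 1)*||x||^2.

def rand_k(x, k, rng):
    """Keep k random coordinates of x, rescaled by d/k for unbiasedness."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)  # rescaling makes the compressor unbiased
    return out

rng = np.random.default_rng(0)
x = np.arange(1.0, 11.0)  # a 10-dimensional test vector

# Empirical check of unbiasedness: the sample mean approaches x.
avg = np.mean([rand_k(x, 3, rng) for _ in range(100000)], axis=0)
print(np.max(np.abs(avg - x)))  # small deviation from 0
```

A biased compressor such as top-$k$ (keep the $k$ largest entries, no rescaling) fits the same two-parameter description with nonzero bias but smaller variance.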
no code implementations • 5 Aug 2021 • Laurent Condat
It is common to have to process signals or images whose values are cyclic and can be represented as points on the complex circle, like wrapped phases, angles, orientations, or color hues.
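The circle representation matters because ordinary arithmetic fails on wrapped values; a minimal sketch (not the paper's method) of the standard workaround, mapping angles to unit complex numbers before averaging:

```python
import numpy as np

# Sketch: cyclic values (wrapped phases, angles, hues) as points exp(i*theta)
# on the complex unit circle. The naive average of -179 and +179 degrees is
# 0 degrees (wrong side of the circle); the circular mean lands near 180.

def circular_mean(theta):
    """Mean direction: argument of the mean of the points on the circle."""
    return np.angle(np.mean(np.exp(1j * np.asarray(theta))))

angles = np.deg2rad([-179.0, 179.0])
print(np.rad2deg(np.mean(angles)))        # naive mean: 0 degrees (wrong)
print(np.rad2deg(circular_mean(angles)))  # circular mean: +/-180 degrees
```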
no code implementations • 6 Jun 2021 • Laurent Condat, Peter Richtárik
We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner.
no code implementations • 22 Feb 2021 • Adil Salim, Laurent Condat, Dmitry Kovalev, Peter Richtárik
Optimization problems under affine constraints appear in various areas of machine learning.
no code implementations • 7 Oct 2020 • Alyazeed Albasyoni, Mher Safaryan, Laurent Condat, Peter Richtárik
In the average-case analysis, we design a simple compression operator, Spherical Compression, which naturally achieves the lower bound.
no code implementations • 2 Oct 2020 • Laurent Condat, Grigory Malinovsky, Peter Richtárik
We analyze several generic proximal splitting algorithms well suited for large-scale convex nonsmooth optimization.
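The common building block of such algorithms is the proximity operator $\mathrm{prox}_g(y) = \arg\min_x\, g(x) + \tfrac{1}{2}\|x-y\|^2$. As a self-contained illustration (standard material, not specific to this paper), the prox of the $\ell_1$ norm has the closed-form "soft thresholding":

```python
import numpy as np

# Sketch: prox of g = lam*||.||_1 is soft thresholding, the closed form that
# makes l1-regularized nonsmooth problems tractable for splitting algorithms.

def soft_threshold(y, lam):
    """prox_{lam*||.||_1}(y): shrink each entry toward 0 by lam."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([3.0, -0.5, 1.2, -2.0])
print(soft_threshold(y, 1.0))  # entries shrunk by 1 toward zero, small ones to 0
```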
no code implementations • 3 Apr 2020 • Adil Salim, Laurent Condat, Konstantin Mishchenko, Peter Richtárik
We consider minimizing the sum of three convex functions, where the first one, $F$, is smooth, the second one is nonsmooth and proximable, and the third one is the composition of a nonsmooth proximable function with a linear operator $L$. This template problem has many applications, for instance in image processing and machine learning.
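One classical algorithm for this three-function template is the Condat–Vũ primal-dual iteration; the sketch below applies it to a toy 1-D problem (this is an illustration of the template, not necessarily the algorithm introduced in the paper). Here $F(x)=\tfrac12\|x-b\|^2$ is smooth, $g$ is the indicator of the nonnegative orthant (proximable), and $h=\lambda\|\cdot\|_1$ is composed with the finite-difference operator $L$, i.e. a total-variation penalty.

```python
import numpy as np

# Sketch of the Condat-Vu primal-dual iteration for min F(x) + g(x) + h(Lx):
#   x+ = prox_{tau*g}( x - tau*(grad F(x) + L^T y) )
#   y+ = prox_{sigma*h^*}( y + sigma*L*(2x+ - x) )
# Toy instance: nonnegative total-variation denoising of a noisy step signal.

rng = np.random.default_rng(0)
n, lam = 50, 2.0
b = np.concatenate([np.full(25, 1.0), np.full(25, 4.0)])
b += 0.3 * rng.standard_normal(n)                 # noisy piecewise-constant data

L = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)      # finite differences, ||L||^2 <= 4
tau, sigma = 0.5, 0.25                            # satisfy 1/tau - sigma*||L||^2 >= 1/2

x, y = np.zeros(n), np.zeros(n - 1)
for _ in range(5000):
    # prox of tau*g: projection onto the nonnegative orthant
    x_new = np.maximum(x - tau * ((x - b) + L.T @ y), 0.0)
    # prox of sigma*h^*: projection onto the l-infinity ball of radius lam
    y = np.clip(y + sigma * L @ (2 * x_new - x), -lam, lam)
    x = x_new

# The iterate is nonnegative and approximately piecewise constant.
print(x.min(), np.abs(np.diff(x)).max())
```

The dual prox uses that the conjugate of $\lambda\|\cdot\|_1$ is the indicator of the $\ell_\infty$ ball of radius $\lambda$, so its prox is a simple clip.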
no code implementations • 22 Apr 2015 • Jordan Frecon, Nelly Pustelnik, Patrice Abry, Laurent Condat
In the context of change-point detection addressed by total-variation minimization strategies, an efficient on-the-fly algorithm has been designed that yields exact solutions for univariate data.