no code implementations • 15 Nov 2022 • Waïss Azizian, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
For generality, we focus on local solutions of constrained, non-monotone variational inequalities, and we show that the convergence rate of a given method depends sharply on its associated Legendre exponent, a notion that measures the growth rate of the underlying Bregman function (Euclidean, entropic, or other) near a solution.
no code implementations • 8 Jun 2022 • Yu-Guan Hsieh, Yassine Laguel, Franck Iutzeler, Jérôme Malick
We consider decentralized optimization problems in which a number of agents collaborate to minimize the average of their local functions by exchanging over an underlying communication graph.
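To make the setting concrete, here is a minimal sketch of decentralized optimization by gossip averaging — a generic baseline, not the method proposed in the paper. Each agent holds a local quadratic `f_i(x) = 0.5*(x - b_i)^2` (a hypothetical choice for illustration), takes a local gradient step, and then averages with its neighbors on a ring communication graph:

```python
import numpy as np

# Generic sketch (not the paper's algorithm): decentralized gradient descent
# with gossip averaging over a ring graph. Agent i minimizes the local
# quadratic f_i(x) = 0.5 * (x - b_i)^2; the network goal is to minimize
# the average of the f_i, whose minimizer is mean(b).

def decentralized_gd(b, n_iters=200, step=0.1):
    n = len(b)
    x = np.zeros(n)                     # one scalar iterate per agent
    # doubly stochastic mixing matrix for a ring: 1/2 self, 1/4 per neighbor
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    for _ in range(n_iters):
        grads = x - b                   # gradient of each local quadratic
        x = W @ (x - step * grads)      # local step, then gossip averaging
    return x

b = np.array([1.0, 2.0, 3.0, 4.0])
x = decentralized_gd(b)
# with a constant step, agents end up near (not exactly at) the
# minimizer of the average, mean(b) = 2.5
```

With a constant step size this scheme only reaches a neighborhood of the consensus optimum; the residual disagreement shrinks with the step size, which is one reason more refined decentralized schemes are studied.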
1 code implementation • 26 Feb 2022 • Aleksandra Burashnikova, Yury Maximov, Marianne Clausel, Charlotte Laclau, Franck Iutzeler, Massih-Reza Amini
This paper is an extended version of [Burashnikova et al., 2021, arXiv:2012.06910], where we proposed a theoretically supported sequential strategy for training a large-scale Recommender System (RS) over implicit feedback, mainly in the form of clicks.
no code implementations • 5 Jul 2021 • Waïss Azizian, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In this paper, we analyze the local convergence rate of optimistic mirror descent methods in stochastic variational inequalities, a class of optimization problems with important applications to learning theory and machine learning.
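As a concrete illustration of mirror descent with an entropic Bregman function — a textbook instance, not the paper's analysis — here is optimistic multiplicative weights applied to a small two-player zero-sum game over the simplex (the game matrix and starting strategies are hypothetical):

```python
import numpy as np

# Sketch (not the paper's method): optimistic mirror descent with the
# entropic Bregman function, i.e. optimistic multiplicative weights,
# on the zero-sum game min_x max_y x^T A y over the probability simplex.
# The "optimism" extrapolates with the previous gradient: 2*g_t - g_{t-1}.

def omwu(A, x0, y0, n_iters=5000, eta=0.1):
    x = np.asarray(x0, float) / np.sum(x0)   # min player's mixed strategy
    y = np.asarray(y0, float) / np.sum(y0)   # max player's mixed strategy
    gx_prev = np.zeros(len(x))
    gy_prev = np.zeros(len(y))
    for _ in range(n_iters):
        gx, gy = A @ y, A.T @ x              # current payoff gradients
        # entropic (multiplicative) update with the extrapolated gradient
        x = x * np.exp(-eta * (2 * gx - gx_prev))
        x /= x.sum()
        y = y * np.exp(+eta * (2 * gy - gy_prev))
        y /= y.sum()
        gx_prev, gy_prev = gx, gy
    return x, y

A = np.array([[1.0, -1.0], [-1.0, 1.0]])     # matching pennies
x, y = omwu(A, [0.8, 0.2], [0.3, 0.7])
# both strategies spiral in toward the unique equilibrium (1/2, 1/2)
```

Without the optimistic extrapolation (i.e. plain multiplicative weights), the last iterates of this game cycle around the equilibrium instead of converging to it, which is exactly why the local convergence behavior of the optimistic variant is of interest.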
no code implementations • 27 May 2021 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In networks of autonomous agents (e.g., fleets of vehicles, scattered sensors), the problem of minimizing the sum of the agents' local functions has received a lot of interest.
no code implementations • 21 Dec 2020 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In this paper, we provide a general framework for studying multi-agent online learning problems in the presence of delays and asynchronicities.
no code implementations • 2 Oct 2020 • Franck Iutzeler, Jérôme Malick
Nonsmoothness is often a curse for optimization; but it is sometimes a blessing, in particular for applications in machine learning.
no code implementations • 1 Sep 2020 • Charlotte Laclau, Franck Iutzeler, Ievgen Redko
In this paper, we introduce and formalize a rank-one partitioning learning paradigm that unifies partitioning methods which summarize a data set with a single vector, from which the final clustering partition is then derived.
no code implementations • NeurIPS 2020 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning.
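For readers unfamiliar with the method, here is the classical (Euclidean, unconstrained) extragradient step on the simplest bilinear saddle point — a standard textbook example, not the paper's contribution. On this problem, plain gradient descent-ascent diverges, while extragradient converges to the saddle:

```python
# Classical extragradient on the bilinear saddle point min_x max_y x*y.
# The method takes an exploratory half-step, then re-steps from the
# original point using the gradients evaluated at the half-step.

def extragradient(x0, y0, n_iters=500, step=0.2):
    x, y = x0, y0
    for _ in range(n_iters):
        # 1) exploratory half-step from the current point
        x_half = x - step * y          # grad_x (x*y) = y
        y_half = y + step * x          # grad_y (x*y) = x
        # 2) actual step, using gradients at the half-step point
        x = x - step * y_half
        y = y + step * x_half
    return x, y

x, y = extragradient(1.0, 1.0)
# (x, y) converges to the saddle point (0, 0); plain gradient
# descent-ascent on the same problem spirals outward instead
```

The extra gradient evaluation per iteration is the price paid for this stability, which motivates the single-call variants studied in this line of work.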
no code implementations • NeurIPS 2019 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
Variational inequalities have recently attracted considerable interest in machine learning as a flexible paradigm for models that go beyond ordinary loss function minimization (such as generative adversarial networks and related deep learning systems).
no code implementations • ICML 2018 • Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick, Massih-Reza Amini
One of the main challenges in distributed learning is to deal with heterogeneous machines and unreliable communications.
no code implementations • 25 Jun 2018 • Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick
We develop and analyze an asynchronous algorithm for distributed convex optimization when the objective writes as the sum of smooth functions, local to each worker, and a non-smooth function.
no code implementations • 17 Jan 2018 • Franck Iutzeler, Jérôme Malick
In this paper, we investigate the attractive properties of the proximal gradient algorithm with inertia.
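To illustrate the kind of scheme studied here, below is a generic inertial (Nesterov-style) proximal gradient method applied to a lasso problem — a standard accelerated sketch, not the paper's specific algorithm; the problem instance and regularization weight are hypothetical:

```python
import numpy as np

# Generic sketch (not the paper's specific method): proximal gradient with
# Nesterov-style inertia on the lasso problem
#     min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
# The prox of the l1 norm is the soft-thresholding operator.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad(A, b, lam, n_iters=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L with L = ||A||_2^2
    x = x_prev = np.zeros(A.shape[1])
    for k in range(1, n_iters + 1):
        beta = (k - 1) / (k + 2)               # inertia coefficient
        z = x + beta * (x - x_prev)            # extrapolated point
        grad = A.T @ (A @ z - b)               # gradient of the smooth part
        x_prev, x = x, soft_threshold(z - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 3.0]
b = A @ x_true                                  # noiseless observations
x_hat = inertial_prox_grad(A, b, lam=0.1)
# x_hat recovers a sparse vector close to x_true
```

The inertial extrapolation `z = x + beta*(x - x_prev)` is what distinguishes this from the plain proximal gradient method, and its interplay with the non-smooth term is precisely the kind of behavior such papers analyze.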
no code implementations • 22 May 2017 • Bikash Joshi, Franck Iutzeler, Massih-Reza Amini
In many distributed learning problems, the heterogeneous loading of computing machines may harm the overall performance of synchronous strategies.
1 code implementation • NeurIPS 2017 • Bikash Joshi, Massih-Reza Amini, Ioannis Partalas, Franck Iutzeler, Yury Maximov
We address the problem of multi-class classification in the case where the number of classes is very large.