no code implementations • 19 Feb 2024 • Tam Le, Jérôme Malick
Distributionally robust optimization has emerged as an attractive way to train robust machine learning models, capturing data uncertainty and distribution shifts.
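For reference, the generic distributionally robust formulation replaces the empirical risk by a worst case over an ambiguity set of distributions (a standard template; the specific ambiguity set studied in the paper may differ):

```latex
\min_{\theta} \; \sup_{Q \in \mathcal{U}(\widehat{P}_n)} \; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta; \xi)\big]
```

where \mathcal{U}(\widehat{P}_n) is a neighborhood of the empirical distribution, e.g. a Wasserstein ball of radius \rho.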
no code implementations • 15 Nov 2022 • Waïss Azizian, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
For generality, we focus on local solutions of constrained, non-monotone variational inequalities, and we show that the convergence rate of a given method depends sharply on its associated Legendre exponent, a notion that measures the growth rate of the underlying Bregman function (Euclidean, entropic, or other) near a solution.
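As a reminder of the objects behind "Euclidean, entropic, or other": the Bregman divergence of a convex function h is

```latex
D_h(x, y) \;=\; h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle
```

with h(x) = \tfrac{1}{2}\|x\|^2 giving the squared Euclidean distance and h(x) = \sum_i x_i \log x_i giving the Kullback-Leibler divergence; the Legendre exponent quantifies how this divergence grows near a solution.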
no code implementations • 8 Jun 2022 • Yu-Guan Hsieh, Yassine Laguel, Franck Iutzeler, Jérôme Malick
We consider decentralized optimization problems in which a number of agents collaborate to minimize the average of their local functions by exchanging over an underlying communication graph.
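A minimal sketch of a standard scheme for this setting, decentralized gradient descent (a generic baseline, not necessarily the method studied here): each agent averages its neighbors' iterates, then takes a local gradient step.

```python
import numpy as np

def decentralized_gradient_step(X, W, grads, step):
    """One round of decentralized gradient descent.

    X:     (n_agents, dim) current local iterates, one row per agent
    W:     (n_agents, n_agents) doubly stochastic mixing matrix,
           W[i, j] > 0 only if agents i and j are graph neighbors
    grads: (n_agents, dim) local gradients grad f_i(x_i)
    """
    # Consensus with neighbors, then a local gradient step.
    return W @ X - step * grads
```

With W equal to the identity the agents run independent gradient descent; a connected mixing matrix couples them so that, under suitable step sizes, the iterates reach consensus on a minimizer of the average.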
1 code implementation • 17 Dec 2021 • Krishna Pillutla, Yassine Laguel, Jérôme Malick, Zaid Harchaoui
We present a federated learning framework that is designed to robustly deliver good predictive performance across individual clients with heterogeneous data.
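A key ingredient in this line of work is to make the aggregate objective sensitive to the worst-off clients rather than only the average one; a minimal sketch of a tail-average (superquantile-style) aggregation of client losses, with illustrative names:

```python
import numpy as np

def tail_average(losses, theta):
    """Mean of (roughly) the worst (1 - theta) fraction of client losses.

    losses: 1-D array of per-client losses
    theta:  tail level in [0, 1); theta = 0 recovers the plain mean
    """
    losses = np.sort(losses)[::-1]                       # worst clients first
    k = max(1, int(np.ceil((1 - theta) * len(losses))))  # size of the tail
    return losses[:k].mean()
```

Setting theta = 0 recovers the plain average of client losses used by standard federated averaging.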
no code implementations • 5 Jul 2021 • Waïss Azizian, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In this paper, we analyze the local convergence rate of optimistic mirror descent methods in stochastic variational inequalities, a class of optimization problems with important applications to learning theory and machine learning.
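For reference, the Euclidean, unconstrained instance of the optimistic scheme (often called optimistic gradient descent) reuses the previous operator evaluation to extrapolate, so each iteration needs a single fresh evaluation; a minimal deterministic sketch:

```python
def optimistic_gradient(F, x, step, n_iters):
    """Optimistic gradient iterations for finding a zero of an operator F."""
    g_prev = F(x)
    for _ in range(n_iters):
        x_half = x - step * g_prev   # extrapolate with the past evaluation
        g_prev = F(x_half)           # single fresh operator call per iteration
        x = x - step * g_prev        # update from the base point
    return x
```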
no code implementations • 27 May 2021 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In networks of autonomous agents (e.g., fleets of vehicles, scattered sensors), the problem of minimizing the sum of the agents' local functions has received significant interest.
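The communication layer beneath such methods is local averaging over the graph; a minimal sketch of the classical Metropolis rule for building mixing weights from an adjacency matrix (illustrative, not specific to this paper):

```python
import numpy as np

def metropolis_weights(adj):
    """Doubly stochastic mixing matrix from a symmetric 0/1 adjacency
    matrix, via the Metropolis-Hastings rule."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()   # remaining mass stays on the agent
    return W
```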
no code implementations • 21 Dec 2020 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
In this paper, we provide a general framework for studying multi-agent online learning problems in the presence of delays and asynchronicities.
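A minimal sketch of the basic difficulty, in the single-agent case: with a feedback delay of d rounds, online gradient descent can only update with stale gradients (illustrative names, not the paper's framework):

```python
from collections import deque

def delayed_online_gradient(grad_at, x0, step, horizon, delay):
    """Online gradient descent where each gradient arrives `delay` rounds late.

    grad_at(t, x) returns the feedback generated at round t for the
    point x that was played at that round.
    """
    x = x0
    pending = deque()                   # gradients still in flight
    for t in range(horizon):
        pending.append(grad_at(t, x))   # feedback generated now...
        if len(pending) > delay:        # ...but only usable after the delay
            x = x - step * pending.popleft()
    return x
```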
no code implementations • 2 Oct 2020 • Franck Iutzeler, Jérôme Malick
Nonsmoothness is often a curse for optimization, but it is sometimes a blessing, in particular for applications in machine learning.
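A concrete instance of the "blessing": the proximal operator of the \ell_1 norm sets small coordinates exactly to zero, so proximal algorithms can identify sparsity patterns after finitely many iterations. A minimal sketch:

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1: shrinks each coordinate toward 0
    and sets it exactly to 0 when |x_i| <= tau, revealing the support."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```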
1 code implementation • 30 Sep 2020 • Yassine Laguel, Jérôme Malick, Zaid Harchaoui
Classical supervised learning via empirical risk (or negative log-likelihood) minimization hinges upon the assumption that the testing distribution coincides with the training distribution.
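A standard way to relax this assumption, used in this line of work, is to replace the expectation of the loss by its superquantile at level p (the mean of the loss above its p-quantile), which the Rockafellar-Uryasev formula expresses variationally:

```latex
\bar{Q}_p(L) \;=\; \min_{\eta \in \mathbb{R}} \;\Big\{ \eta + \frac{1}{1-p}\, \mathbb{E}\big[(L - \eta)_+\big] \Big\}
```

Minimizing \bar{Q}_p of the loss instead of its mean guards against test distributions that reweight the hardest training examples.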
no code implementations • NeurIPS 2020 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning.
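For reference, the basic deterministic extragradient iteration: an exploratory half-step, then an update using the operator evaluated at the half-step point. A minimal unconstrained sketch:

```python
def extragradient(F, x, step, n_iters):
    """Extragradient method for a monotone operator F, e.g. the field
    (grad_x L, -grad_y L) of a saddle-point problem min_x max_y L(x, y)."""
    for _ in range(n_iters):
        x_lead = x - step * F(x)     # exploratory half-step
        x = x - step * F(x_lead)     # update with the leading-point value
    return x
```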
1 code implementation • arXiv preprint 2020 • Yassine Laguel, Krishna Pillutla, Jérôme Malick, Zaid Harchaoui
We propose a federated learning framework to handle heterogeneous client devices that do not conform to the population data distribution.
no code implementations • NeurIPS 2019 • Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos
Variational inequalities have recently attracted considerable interest in machine learning as a flexible paradigm for models that go beyond ordinary loss function minimization (such as generative adversarial networks and related deep learning systems).
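Concretely, given an operator F and a feasible set \mathcal{K}, the (Stampacchia) variational inequality problem is to find x^\ast \in \mathcal{K} such that

```latex
\langle F(x^\ast),\, x - x^\ast \rangle \;\ge\; 0 \quad \text{for all } x \in \mathcal{K}.
```

Taking F = \nabla f recovers constrained minimization of f, while F(x, y) = (\nabla_x L(x, y),\, -\nabla_y L(x, y)) captures the saddle-point problems behind GAN training.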
no code implementations • ICML 2018 • Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick, Massih-Reza Amini
In distributed learning, one of the main challenges is to deal with heterogeneous machines and unreliable communications.
no code implementations • 25 Jun 2018 • Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick
We develop and analyze an asynchronous algorithm for distributed convex optimization when the objective is the sum of smooth functions, each local to a worker, plus a nonsmooth function.
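For orientation, the synchronous template that such methods build on is the proximal-gradient iteration: a gradient step on the smooth sum followed by the proximal operator of the nonsmooth term (the paper's contribution is an asynchronous variant; the sketch below is only the generic baseline):

```python
def proximal_gradient(grads, prox_g, x, step, n_iters):
    """Proximal gradient for min_x sum_i f_i(x) + g(x).

    grads:  list of gradient functions, one per worker's smooth f_i
    prox_g: (x, step) -> argmin_u g(u) + ||u - x||^2 / (2 * step)
    """
    for _ in range(n_iters):
        full_grad = sum(g(x) for g in grads)   # aggregated from all workers
        x = prox_g(x - step * full_grad, step)  # prox step on the nonsmooth g
    return x
```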
1 code implementation • 11 Jul 2017 • Jalal Fadili, Jérôme Malick, Gabriel Peyré
This pairing is crucial to track the strata that are identifiable by solutions of parametrized optimization problems or by iterates of optimization algorithms.