no code implementations • 27 Aug 2024 • Charlotte Rodriguez, Laura Degioanni, Laetitia Kameni, Richard Vidal, Giovanni Neglia
Monitoring, understanding, and optimizing the energy consumption of Machine Learning (ML) are among the reasons why it is necessary to evaluate the energy usage of ML.
1 code implementation • 24 Apr 2023 • Francesco Cremonesi, Marc Vesin, Sergen Cansiz, Yannick Bouillard, Irene Balelli, Lucia Innocenti, Santiago Silva, Samy-Safwan Ayed, Riccardo Taiello, Laetitia Kameni, Richard Vidal, Fanny Orlhac, Christophe Nioche, Nathan Lapel, Bastien Houis, Romain Modzelewski, Olivier Humbert, Melek Önen, Marco Lorenzi
The real-world implementation of federated learning is complex and requires research and development efforts at the crossroads of domains ranging from data science to software development, networking, and security.
1 code implementation • 4 Jan 2023 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal
Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.
1 code implementation • 21 Nov 2022 • Yann Fraboni, Martin Van Waerebeke, Kevin Scaman, Richard Vidal, Laetitia Kameni, Marco Lorenzi
Machine Unlearning (MU) is an increasingly important topic in machine learning safety, aiming at removing the contribution of a given data point from a training procedure.
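The natural baseline for removing a data point's contribution is exact unlearning by retraining from scratch without that point. A minimal sketch, with ordinary least squares standing in for a generic training procedure (all names illustrative, not the paper's method):

```python
import numpy as np

def train(X, y):
    # Ordinary least squares as a stand-in for a generic training procedure.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def unlearn_by_retraining(X, y, idx):
    # Exact (but expensive) unlearning: retrain from scratch as if
    # point `idx` had never been part of the training set.
    mask = np.ones(len(y), dtype=bool)
    mask[idx] = False
    return train(X[mask], y[mask])

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

w_full = train(X, y)
w_unlearned = unlearn_by_retraining(X, y, idx=7)
```

Approximate MU methods aim to match this retrained model without paying the full retraining cost.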
no code implementations • 21 Jun 2022 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
We show that our general framework applies to existing optimization schemes including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff.
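Of the schemes listed, FedBuff is the least standard: the server buffers asynchronously arriving client deltas and applies them in batches. A minimal scalar sketch of that buffered server step (illustrative names and simplifications, not the paper's framework):

```python
def fedbuff_server(global_model, incoming_deltas, buffer_size, lr):
    # FedBuff-style buffered asynchronous aggregation: accumulate client
    # deltas as they arrive, and once the buffer holds `buffer_size` of
    # them, apply their average as a single server update.
    buffer = []
    for delta in incoming_deltas:
        buffer.append(delta)
        if len(buffer) == buffer_size:
            global_model = global_model + lr * sum(buffer) / buffer_size
            buffer = []
    return global_model
```

With `buffer_size=1` this degenerates to asynchronous FedAvg; with a buffer covering all clients per round it recovers synchronous aggregation.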
2 code implementations • 17 Nov 2021 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal
Federated learning allows clients to collaboratively learn statistical models while keeping their data local.
4 code implementations • NeurIPS 2021 • Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal
The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.
1 code implementation • 26 Jul 2021 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
In this work, we provide a general theoretical framework to quantify the impact of a client sampling scheme and of client heterogeneity on federated optimization.
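One common sampling scheme in this setting draws clients i.i.d. from a multinomial distribution and reweights their updates by inverse sampling probability, which keeps the aggregate unbiased for the uniform average. A minimal sketch with scalar updates (names illustrative, not the paper's exact formulation):

```python
import numpy as np

def md_sampling_aggregate(updates, probs, m, rng):
    # Multinomial (MD) sampling: draw m client indices i.i.d. with
    # probability p_i, then reweight each sampled update by 1/(n * p_i)
    # so the estimator is unbiased for the uniform average of all clients.
    n = len(updates)
    chosen = rng.choice(n, size=m, replace=True, p=probs)
    return sum(updates[i] / (n * probs[i]) for i in chosen) / m
```

Clients with skewed sampling probabilities leave the expectation unchanged but inflate the variance of the aggregate, which is exactly the kind of effect such a framework quantifies.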
1 code implementation • 12 May 2021 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
This work addresses the problem of optimizing communications between server and clients in federated learning (FL).
1 code implementation • NeurIPS 2020 • Othmane Marfoq, Chuan Xu, Giovanni Neglia, Richard Vidal
Federated learning usually employs a client-server architecture where an orchestrator iteratively aggregates model updates from remote clients and pushes a refined model back to them.
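That orchestrator loop can be sketched in a few lines, FedAvg-style, with `local_step` as a placeholder for each client's local training (a minimal sketch with scalar models, not the paper's algorithm):

```python
import numpy as np

def server_round(global_model, client_data, local_step):
    # One orchestrator round: broadcast the model, collect each client's
    # locally trained model, and aggregate weighted by dataset size.
    updates, weights = [], []
    for data in client_data:
        updates.append(local_step(global_model, data))  # done remotely
        weights.append(len(data))
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    # FedAvg-style aggregation: data-size-weighted average of client models.
    return sum(w * u for w, u in zip(weights, updates))
```

In the synchronous variant above the server waits for every client; asynchronous schemes relax exactly this blocking step.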
1 code implementation • 21 Jun 2020 • Yann Fraboni, Richard Vidal, Marco Lorenzi
Free-rider attacks against federated learning consist in disguising participation in the federated learning process with the goal of obtaining the final aggregated model without actually contributing any data.
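A simple form of such an attack is a client that performs no local training and returns the received global model with small added noise, so its "update" is harder to distinguish from an honest contribution. A minimal sketch (illustrative names, not the paper's attack model):

```python
import numpy as np

def free_rider_update(global_model, rng, noise_scale=1e-3):
    # A free-riding client does no local training: it echoes the received
    # global model back, perturbed with small Gaussian noise to mimic the
    # variability of an honestly trained update.
    return global_model + noise_scale * rng.normal(size=global_model.shape)
```

Because the echoed model barely moves the aggregate, the free-rider still receives the final model while contributing nothing.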