Search Results for author: Richard Vidal

Found 10 papers, 8 papers with code

Evaluating the Energy Consumption of Machine Learning: Systematic Literature Review and Experiments

no code implementations 27 Aug 2024 Charlotte Rodriguez, Laura Degioanni, Laetitia Kameni, Richard Vidal, Giovanni Neglia

Monitoring, understanding, and optimizing the energy consumption of Machine Learning (ML) are among the reasons why it is necessary to evaluate the energy usage of ML.

Fed-BioMed: Open, Transparent and Trusted Federated Learning for Real-world Healthcare Applications

1 code implementation 24 Apr 2023 Francesco Cremonesi, Marc Vesin, Sergen Cansiz, Yannick Bouillard, Irene Balelli, Lucia Innocenti, Santiago Silva, Samy-Safwan Ayed, Riccardo Taiello, Laetitia Kameni, Richard Vidal, Fanny Orlhac, Christophe Nioche, Nathan Lapel, Bastien Houis, Romain Modzelewski, Olivier Humbert, Melek Önen, Marco Lorenzi

The real-world implementation of federated learning is complex and requires research and development actions at the crossroads of different domains, ranging from data science to software programming, networking, and security.

Federated Learning

Federated Learning for Data Streams

1 code implementation 4 Jan 2023 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.

Federated Learning

SIFU: Sequential Informed Federated Unlearning for Efficient and Provable Client Unlearning in Federated Optimization

1 code implementation 21 Nov 2022 Yann Fraboni, Martin Van Waerebeke, Kevin Scaman, Richard Vidal, Laetitia Kameni, Marco Lorenzi

Machine Unlearning (MU) is an increasingly important topic in machine learning safety, aiming at removing the contribution of a given data point from a training procedure.

Machine Unlearning
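For reference, exact unlearning of a client can always be obtained by retraining from scratch without that client's data; efficient methods such as SIFU aim to match this reference at much lower cost. The ridge-regression toy below is only a hedged sketch of that baseline framing, not of SIFU itself, and every name in it is illustrative.

```python
import numpy as np

def train(client_data, lam=1e-2):
    """Ridge regression on the concatenation of the given clients' data
    (a closed-form stand-in for the federated training procedure)."""
    X = np.vstack([Xk for Xk, _ in client_data])
    y = np.concatenate([yk for _, yk in client_data])
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(40, 3)), rng.normal(size=40)) for _ in range(5)]

w_full = train(clients)        # model trained with every client's data
w_exact = train(clients[1:])   # exact unlearning of client 0: retrain without it
# Efficient unlearning methods aim to recover (an approximation of) w_exact
# without paying the cost of retraining from scratch.
```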

A General Theory for Federated Optimization with Asynchronous and Heterogeneous Clients Updates

no code implementations 21 Jun 2022 Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi

We show that our general framework applies to existing optimization schemes including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff.

Federated Learning
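As a point of reference for the schemes this framework covers, here is a minimal sketch of one synchronous FedAvg round: clients start from the current global model, train locally, and the server averages the resulting models weighted by local dataset size. The toy least-squares objective and helper names are illustrative assumptions, not code from the paper.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Plain local gradient descent on a least-squares loss
    (toy stand-in for a client's local training)."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def fedavg_round(w_global, clients):
    """One synchronous FedAvg round: every client starts from the global
    model, trains locally, and the server averages the resulting models
    weighted by local dataset size."""
    n_total = sum(len(y) for _, y in clients)
    local_models = [local_sgd(w_global, X, y) for X, y in clients]
    return sum(len(y) / n_total * w for (_, y), w in zip(clients, local_models))

# Toy usage: three clients with heterogeneous linear-regression data.
rng = np.random.default_rng(0)
clients = []
for k in range(3):
    X = rng.normal(size=(50, 4))
    w_star = rng.normal(size=4) + k      # heterogeneity across clients
    clients.append((X, X @ w_star + 0.1 * rng.normal(size=50)))

w_global = np.zeros(4)
for _ in range(20):
    w_global = fedavg_round(w_global, clients)
```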

Personalized Federated Learning through Local Memorization

2 code implementations 17 Nov 2021 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning allows clients to collaboratively learn statistical models while keeping their data local.

Binary Classification, Fairness, +3

Federated Multi-Task Learning under a Mixture of Distributions

4 code implementations NeurIPS 2021 Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal

The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.

Fairness, Multi-Task Learning, +1

A General Theory for Client Sampling in Federated Learning

1 code implementation 26 Jul 2021 Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi

In this work, we provide a general theoretical framework to quantify the impact of a client sampling scheme and of client heterogeneity on federated optimization.

Federated Learning
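To make the notion of a client sampling scheme concrete, the sketch below implements two commonly considered schemes, uniform sampling without replacement and multinomial (MD) sampling proportional to dataset size, with aggregation weights chosen so that each client's expected weight equals its data fraction. It is a hedged illustration of the problem setting under those assumptions, not the paper's implementation.

```python
import numpy as np

n_clients, m = 10, 3                  # total clients, clients sampled per round
rng = np.random.default_rng(0)
p = rng.random(n_clients)
p /= p.sum()                          # p_i: client i's share of the data

def sample_uniform(rng):
    """Uniform sampling without replacement; each selected client is
    reweighted by p_i * n / m so its expected aggregation weight is p_i."""
    idx = rng.choice(n_clients, size=m, replace=False)
    return idx, p[idx] * n_clients / m

def sample_md(rng):
    """Multinomial (MD) sampling: clients drawn with probability p_i,
    each draw contributing weight 1/m (expected weight of client i: p_i)."""
    idx = rng.choice(n_clients, size=m, replace=True, p=p)
    return idx, np.full(m, 1.0 / m)

# Empirical check of unbiasedness: average aggregation weight per client.
acc = np.zeros(n_clients)
for _ in range(20000):
    idx, w = sample_uniform(rng)
    np.add.at(acc, idx, w)
print(np.round(acc / 20000, 3))       # close to p
```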

Throughput-Optimal Topology Design for Cross-Silo Federated Learning

1 code implementation NeurIPS 2020 Othmane Marfoq, Chuan Xu, Giovanni Neglia, Richard Vidal

Federated learning usually employs a client-server architecture in which an orchestrator iteratively aggregates model updates from remote clients and pushes a refined model back to them.

Federated Learning
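For intuition on why the overlay topology matters in cross-silo federated learning, here is a minimal sketch of decentralized neighbour averaging, where each silo mixes its model with its neighbours' according to a mixing matrix defined by the graph. The ring topology and uniform mixing weights are illustrative assumptions, not the throughput-optimal design proposed in the paper.

```python
import numpy as np

def ring_mixing_matrix(n):
    """Doubly-stochastic mixing matrix for a ring overlay: every silo
    averages its model with itself and its two neighbours."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3
    return W

def gossip_round(models, W):
    """One decentralized averaging step: row i of `models` becomes a
    weighted average of silo i's neighbours (models has shape n x d)."""
    return W @ models

n_silos, dim = 8, 5
rng = np.random.default_rng(0)
models = rng.normal(size=(n_silos, dim))
W = ring_mixing_matrix(n_silos)
for _ in range(50):
    models = gossip_round(models, W)
# All rows converge towards the average of the initial models; the speed
# and per-round communication cost depend on the chosen topology.
```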

Free-rider Attacks on Model Aggregation in Federated Learning

1 code implementation 21 Jun 2020 Yann Fraboni, Richard Vidal, Marco Lorenzi

Free-rider attacks against federated learning consist in faking participation in the federated learning process with the goal of obtaining the final aggregated model without actually contributing any data.

Federated Learning
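The attack described above can be illustrated with a toy sketch: an honest client trains on its local data, while a free-rider simply echoes the received global model, possibly with small additive noise so its "update" resembles a genuine one. Function names and the noise model are illustrative assumptions, not the paper's code.

```python
import numpy as np

def honest_update(w_global, X, y, lr=0.1, epochs=5):
    """An honest client actually trains on its local data."""
    w = w_global.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def free_rider_update(w_global, noise_std=0.01, rng=None):
    """A free-rider holds no useful data: it sends back the global model
    it just received, optionally adding small noise so the fake update
    mimics the variance of genuine local training."""
    rng = rng if rng is not None else np.random.default_rng()
    return w_global + noise_std * rng.normal(size=w_global.shape)
```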
