Search Results for author: Giovanni Neglia

Found 16 papers, 7 papers with code

A Cautionary Tale: On the Role of Reference Data in Empirical Privacy Defenses

no code implementations18 Oct 2023 Caelin G. Kaplan, Chuan Xu, Othmane Marfoq, Giovanni Neglia, Anderson Santana de Oliveira

Within the realm of privacy-preserving machine learning, empirical privacy defenses have been proposed as a solution to achieve satisfactory levels of training data privacy without a significant drop in model utility.

Privacy Preserving

No-Regret Caching with Noisy Request Estimates

no code implementations5 Sep 2023 Younes Ben Mazziane, Francescomaria Faticanti, Giovanni Neglia, Sara Alouf

Online learning algorithms have been successfully used to design caching policies with regret guarantees.
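To make the regret notion concrete, here is a toy sketch (not from the paper, and omitting its noisy-request-estimate aspect): an LFU-style online cache of size `k` is compared against the best static cache chosen in hindsight, and regret is the difference in hit counts.

```python
import random
from collections import Counter

def regret_of_lfu(requests, cache_size):
    """Hit-count regret of an LFU-style online cache against the best
    static cache chosen in hindsight (a toy illustration only)."""
    counts = Counter()
    online_hits = 0
    for item in requests:
        # Cache the cache_size most-requested items seen so far.
        cache = {f for f, _ in counts.most_common(cache_size)}
        if item in cache:
            online_hits += 1
        counts[item] += 1
    # Best static cache in hindsight: the top-k items over the whole trace.
    static_hits = sum(c for _, c in counts.most_common(cache_size))
    return static_hits - online_hits

random.seed(0)
trace = [random.randint(0, 19) for _ in range(5000)]
print(regret_of_lfu(trace, cache_size=5))  # small relative to the horizon of 5000
```

A policy is "no-regret" when this gap grows sublinearly in the length of the request trace; the paper studies what happens when the request statistics driving such policies are only noisy estimates.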

FedDec: Peer-to-peer Aided Federated Learning

no code implementations11 Jun 2023 Marina Costantini, Giovanni Neglia, Thrasyvoulos Spyropoulos

We analyze the convergence of FedDec under the assumptions of non-iid data distribution, partial device participation, and smooth and strongly convex costs, and show that inter-agent communication alleviates the negative impact of infrequent communication rounds with the server by reducing the dependence on the number of local updates $H$ from $O(H^2)$ to $O(H)$.
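The interleaving of local updates, peer-to-peer mixing, and server rounds can be sketched schematically. This is an illustration, not the authors' algorithm: the scalar quadratic losses, the gossip matrix `W`, and the aggregation schedule are assumptions made for the example.

```python
import numpy as np

def feddec_sketch(A, b, W, H=5, rounds=20, lr=0.1, seed=0):
    """Schematic FedDec-style loop (an illustration, not the paper's code):
    each agent runs local SGD on f_i(x) = 0.5*a_i*(x - b_i)^2, mixes its
    iterate with neighbors via the gossip matrix W after every local step,
    and the server averages all models every H steps."""
    rng = np.random.default_rng(seed)
    n = len(A)
    x = rng.normal(size=n)           # one scalar model per agent
    for _ in range(rounds):
        for _ in range(H):
            grads = A * (x - b)      # local gradients
            x = W @ (x - lr * grads) # local step + peer-to-peer mixing
        x = np.full(n, x.mean())     # server aggregation round
    return float(x.mean())

# Two agents with full averaging; the optimum of the average loss is
# x* = sum(a_i * b_i) / sum(a_i) = (1*0 + 3*4) / 4 = 3.
A = np.array([1.0, 3.0]); b = np.array([0.0, 4.0])
W = np.array([[0.5, 0.5], [0.5, 0.5]])
print(feddec_sketch(A, b, W))  # ≈ 3.0
```

In this toy setup the inter-agent mixing keeps the iterates close to consensus between server rounds, which is the mechanism behind the improved $O(H)$ dependence reported in the paper.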

Federated Learning

Improved Stability and Generalization Guarantees of the Decentralized SGD Algorithm

no code implementations5 Jun 2023 Batiste Le Bars, Aurélien Bellet, Marc Tommasi, Kevin Scaman, Giovanni Neglia

On the contrary, we show, for convex, strongly convex and non-convex functions, that D-SGD can always recover generalization bounds analogous to those of classical SGD, suggesting that the choice of graph does not matter.

Generalization Bounds

Federated Learning under Heterogeneous and Correlated Client Availability

1 code implementation11 Jan 2023 Angelo Rodio, Francescomaria Faticanti, Othmane Marfoq, Giovanni Neglia, Emilio Leonardi

To this end, CA-Fed dynamically adapts the weight given to each client and may ignore clients with low availability and large correlation.
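The weighting idea can be illustrated with a hypothetical aggregation rule in the spirit of that description; the exact CA-Fed rule is in the paper, and the thresholds and availability-proportional weights below are assumptions made for the sketch.

```python
import numpy as np

def ca_fed_style_aggregate(updates, availability, correlation,
                           avail_min=0.2, corr_max=0.8):
    """Hypothetical CA-Fed-style aggregation (illustration only):
    weight each client's update by its availability, and ignore clients
    whose availability is too low or whose temporal correlation is too high."""
    updates = np.asarray(updates, dtype=float)
    w = np.asarray(availability, dtype=float).copy()
    excluded = (w < avail_min) | (np.asarray(correlation) > corr_max)
    w[excluded] = 0.0                # ignore low-availability / high-correlation clients
    if w.sum() == 0:
        raise ValueError("no eligible clients")
    w /= w.sum()
    return w @ updates               # weighted average of client updates

updates = [[1.0, 1.0], [3.0, 3.0], [10.0, 10.0]]
print(ca_fed_style_aggregate(updates,
                             availability=[0.9, 0.3, 0.1],  # client 2 rarely available
                             correlation=[0.1, 0.2, 0.5]))  # → [1.5 1.5]
```

Here client 2 is dropped for low availability, and the remaining weights 0.9 and 0.3 are renormalized to 0.75 and 0.25 before averaging.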

Federated Learning

Federated Learning for Data Streams

1 code implementation4 Jan 2023 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.

Federated Learning

Local Model Reconstruction Attacks in Federated Learning and their Uses

no code implementations28 Oct 2022 Ilias Driouich, Chuan Xu, Giovanni Neglia, Frederic Giroire, Eoin Thomas

Additionally, we propose a novel model-based attribute inference attack in federated learning leveraging the local model reconstruction attack.

Attribute Earnings Classification +4

Personalized Federated Learning through Local Memorization

2 code implementations17 Nov 2021 Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning allows clients to collaboratively learn statistical models while keeping their data local.

Binary Classification Fairness +3

Efficient passive membership inference attack in federated learning

1 code implementation31 Oct 2021 Oualid Zari, Chuan Xu, Giovanni Neglia

In cross-device federated learning (FL) setting, clients such as mobiles cooperate with the server to train a global machine learning model, while maintaining their data locally.

Federated Learning Inference Attack +1

Federated Multi-Task Learning under a Mixture of Distributions

4 code implementations NeurIPS 2021 Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal

The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.

Fairness Multi-Task Learning +1

Content Placement in Networks of Similarity Caches

no code implementations9 Feb 2021 Michele Garetto, Emilio Leonardi, Giovanni Neglia

Similarity caching systems have recently attracted the attention of the scientific community, as they can be profitably used in many application contexts, like multimedia retrieval, advertising, object recognition, recommender systems and online content-match applications.

Object Object Recognition +2

Throughput-Optimal Topology Design for Cross-Silo Federated Learning

1 code implementation NeurIPS 2020 Othmane Marfoq, Chuan Xu, Giovanni Neglia, Richard Vidal

Federated learning usually employs a client-server architecture where an orchestrator iteratively aggregates model updates from remote clients and pushes a refined model back to them.

Federated Learning

Dynamic backup workers for parallel machine learning

no code implementations30 Apr 2020 Chuan Xu, Giovanni Neglia, Nicola Sebastianelli

This paradigm consists of $n$ workers, which iteratively compute updates of the model parameters, and a stateful PS, which waits and aggregates all updates to generate a new estimate of model parameters and sends it back to the workers for a new iteration.
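The synchronous parameter-server loop described above can be sketched in a few lines; this is a single-process simulation of the protocol (not a distributed implementation, and without the paper's dynamic backup workers), using quadratic worker losses as an assumed example.

```python
import numpy as np

def sync_parameter_server(grad_fns, x0, lr=0.1, iters=100):
    """Synchronous PS loop as described: the PS waits for all n workers'
    updates, aggregates them (here: averages), generates a new estimate
    of the model parameters, and sends it back for the next iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        updates = [g(x) for g in grad_fns]      # every worker computes on the same x
        x = x - lr * np.mean(updates, axis=0)   # PS aggregates and updates
    return x

# Two workers with losses 0.5*(x - c_i)^2; the average loss is minimized
# at the mean of the targets, (1 + 5) / 2 = 3.
workers = [lambda x: x - 1.0, lambda x: x - 5.0]
print(sync_parameter_server(workers, x0=0.0))  # ≈ 3.0
```

Because the PS waits for every worker, the iteration time is set by the slowest one; backup workers (letting the PS proceed after the first few updates) trade a little gradient quality for much lower synchronization delay.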

BIG-bench Machine Learning

Decentralized gradient methods: does topology matter?

no code implementations28 Feb 2020 Giovanni Neglia, Chuan Xu, Don Towsley, Gianmarco Calbi

Consensus-based distributed optimization methods have recently been advocated as alternatives to parameter server and ring all-reduce paradigms for large scale training of machine learning models.
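A consensus-based step replaces the central aggregator with neighbor averaging through a mixing matrix determined by the topology. The sketch below is a minimal illustration (scalar quadratic local losses and a doubly stochastic ring matrix `W` are assumptions); with a constant step size the nodes only reach a neighborhood of the average optimum.

```python
import numpy as np

def dsgd_step(x, grads, W, lr):
    """One consensus-based (D-SGD style) step: each node takes a local
    gradient step, then averages with its neighbors through the
    doubly stochastic mixing matrix W."""
    return W @ (x - lr * grads)

# Ring of 4 nodes with local losses 0.5*(x_i - c_i)^2; the average
# loss is minimized at mean(c) = 2.5.
c = np.array([1.0, 2.0, 3.0, 4.0])
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
x = np.zeros(4)
for _ in range(500):
    x = dsgd_step(x, x - c, W, lr=0.1)  # local gradient of each node's loss
print(round(float(x.mean()), 3))  # 2.5: nodes agree on the average optimum
```

How fast the nodes reach (near-)consensus is governed by the spectral gap of `W`, which is where the topology question in the title enters.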

Distributed Optimization
