Two-Bit Aggregation for Communication Efficient and Differentially Private Federated Learning

6 Oct 2021  ·  Mohammad Aghapour, Aidin Ferdowsi, Walid Saad

In federated learning (FL), a machine learning model is trained across multiple nodes in a decentralized manner, with each node's data kept local and never shared with other nodes. However, FL still requires the nodes to send information about their model parameters to a central server for aggregation. This information may reveal details about each node's local data, raising privacy concerns. Furthermore, the repeated uplink transmissions from the nodes to the server can cause communication overhead and network congestion. To address these two challenges, this paper proposes a novel two-bit aggregation algorithm with guaranteed differential privacy and reduced uplink communication overhead. Extensive experiments demonstrate that the proposed aggregation algorithm matches the performance of state-of-the-art approaches on datasets such as MNIST, Fashion MNIST, CIFAR-10, and CIFAR-100, while ensuring differential privacy and improving communication efficiency.
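The abstract does not spell out the quantization or privacy mechanism, so the following is only a minimal sketch of how a two-bit, differentially private aggregation scheme of this general kind could work: each node clips its model update, quantizes every coordinate to one of four levels (two bits), and applies randomized response over the levels so the encoding satisfies local differential privacy. All function names, the level placement, and the parameters (clip, eps) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def two_bit_encode(update, clip=1.0, eps=1.0, rng=None):
    """Encode a local update into two bits per coordinate (hypothetical sketch).

    Coordinates are clipped to [-clip, clip], snapped to the nearest of
    four levels, and the level index is perturbed with 4-ary randomized
    response so the output satisfies eps-local differential privacy.
    """
    rng = rng or np.random.default_rng()
    levels = np.array([-clip, -clip / 3, clip / 3, clip])  # 4 levels = 2 bits
    x = np.clip(np.asarray(update, dtype=float), -clip, clip)
    # Deterministic quantization: index of the nearest level per coordinate.
    idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
    # Randomized response: keep the true index with probability
    # p = e^eps / (e^eps + 3); otherwise move to one of the 3 other levels.
    p = np.exp(eps) / (np.exp(eps) + 3)
    flip = rng.random(idx.shape) > p
    idx[flip] = (idx[flip] + rng.integers(1, 4, size=flip.sum())) % 4
    return idx.astype(np.uint8)  # two bits of information per coordinate

def aggregate(encoded_updates, clip=1.0):
    """Server side: decode each node's indices back to levels and average."""
    levels = np.array([-clip, -clip / 3, clip / 3, clip])
    return np.mean([levels[idx] for idx in encoded_updates], axis=0)

# Example round with 5 nodes and a 10-dimensional update vector.
rng = np.random.default_rng(0)
local_updates = [rng.normal(size=10) for _ in range(5)]
encoded = [two_bit_encode(u, eps=2.0, rng=rng) for u in local_updates]
global_update = aggregate(encoded)
```

Under these assumptions, each uplink message shrinks from 32 bits to 2 bits per parameter, and the randomized-response step gives a per-round local privacy guarantee; how the actual paper trades off the privacy budget against quantization error would need to be checked against the full text.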

Datasets


MNIST · Fashion MNIST · CIFAR-10 · CIFAR-100
