D2p-fed: Differentially Private Federated Learning with Efficient Communication

1 Jan 2021  ·  Lun Wang, Ruoxi Jia, Dawn Song ·

In this paper, we propose the discrete Gaussian based differentially private federated learning (D2p-fed), a unified scheme to achieve both differential privacy (DP) and communication efficiency in federated learning (FL). In particular, compared with the only prior work addressing both aspects, D2p-fed provides a stronger privacy guarantee, better composability, and smaller communication cost. The key idea is to apply discrete Gaussian noise to the private data transmission. We provide a complete analysis of the privacy guarantee, communication cost, and convergence rate of D2p-fed. We evaluated D2p-fed on INFIMNIST and CIFAR10. The results show that D2p-fed outperforms the state of the art by 6.7% to 9.75% in terms of model accuracy while saving one third of the communication cost. The code for evaluation is available in the supplementary material.
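To make the key idea concrete, the sketch below shows a client-side update pipeline in the spirit described above: clip the model update, quantize it to integers, and perturb it with discrete Gaussian noise before transmission. This is an illustrative simplification, not the paper's implementation; the function names, the truncated-support sampler, and all parameter choices are assumptions for exposition.

```python
import numpy as np

def sample_discrete_gaussian(sigma, size, tail=None):
    """Sample from the discrete Gaussian on the integers:
    P(x) proportional to exp(-x^2 / (2 * sigma^2)), x in Z.
    Illustrative sampler using a truncated support; the mass beyond
    ~10 sigma is negligible, so truncation barely affects the distribution."""
    if tail is None:
        tail = int(np.ceil(10 * sigma))
    support = np.arange(-tail, tail + 1)
    probs = np.exp(-support.astype(float) ** 2 / (2 * sigma ** 2))
    probs /= probs.sum()
    return np.random.choice(support, size=size, p=probs)

def privatize_update(update, clip_norm, quant_scale, sigma):
    """Clip a client's model update in L2 norm, quantize to integers,
    and add discrete Gaussian noise. The noised integer vector is what
    the client would transmit, so the message stays integer-valued
    (helpful for both DP accounting and compact encoding)."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    quantized = np.round(clipped * quant_scale).astype(np.int64)
    noise = sample_discrete_gaussian(sigma, quantized.shape)
    return quantized + noise
```

On the server side, the noised integer vectors from all clients would be summed and rescaled by `1 / (quant_scale * num_clients)` to recover an approximate average update; because each message is integer-valued, the aggregate noise remains a (discrete) sum of discrete Gaussians, which is what enables the tighter composition analysis the abstract refers to.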
