Efficient Federated Learning over Multiple Access Channel with Differential Privacy Constraints

15 May 2020 · Amir Sonee, Stefano Rini

In this paper, the problem of federated learning (FL) through digital communication between clients and a parameter server (PS) over a multiple access channel (MAC), subject to differential privacy (DP) constraints, is studied. More precisely, we consider the setting in which clients in a centralized network are prompted to train a machine learning model using their local datasets. The information exchange between the clients and the PS takes place over a MAC and must also preserve the DP of the local datasets. Accordingly, the objective of the clients is to minimize the training loss subject to (i) rate constraints for reliable communication over the MAC and (ii) DP constraints on the local datasets. For this optimization scenario, we propose a novel consensus scheme in which digital distributed stochastic gradient descent (D-DSGD) is performed by each client. To preserve DP, digital artificial noise is also added by the clients to the locally quantized gradients. The performance of the scheme is evaluated in terms of the convergence rate and the DP level for a given MAC capacity, and is optimized over the choice of the quantization levels and the artificial noise parameters. Numerical evaluations are presented to validate the performance of the proposed scheme.
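To make the client-side operation concrete, the following is a minimal sketch of one D-DSGD client step: the local gradient is quantized to a finite set of levels and then perturbed with discrete (digital) noise before transmission to the PS. The quantizer, the binomial-based noise, and all parameter names are illustrative assumptions for exposition, not the paper's exact construction.

```python
import numpy as np

def stochastic_quantize(grad, num_levels, clip):
    """Uniform stochastic quantizer: clip the gradient, then round each
    coordinate randomly to one of `num_levels` evenly spaced values so the
    result is unbiased with respect to the clipped gradient."""
    g = np.clip(grad, -clip, clip)
    scaled = (g + clip) / (2 * clip) * (num_levels - 1)   # map to [0, L-1]
    lower = np.floor(scaled)
    prob_up = scaled - lower
    levels = lower + (np.random.rand(*scaled.shape) < prob_up)
    return levels / (num_levels - 1) * 2 * clip - clip    # map back

def private_client_update(grad, num_levels=16, clip=1.0, noise_scale=0.1):
    """One client step (illustrative): quantize the local gradient, then add
    zero-mean discrete noise (a binomial mechanism used here as a stand-in
    for the paper's digital artificial noise) before sending over the MAC."""
    q = stochastic_quantize(grad, num_levels, clip)
    step = 2 * clip / (num_levels - 1)                     # quantization step
    n_trials = max(1, int(4 * (noise_scale / step) ** 2))  # match target std
    noise = (np.random.binomial(n_trials, 0.5, size=grad.shape)
             - n_trials / 2) * step                        # zero-mean, discrete
    return q + noise

# Example: a client perturbs its local gradient before transmission to the PS.
local_grad = np.random.randn(10)
message = private_client_update(local_grad)
```

In this sketch, the number of quantization levels trades off the rate needed over the MAC against quantization error, while the noise scale trades off the DP level against convergence speed, mirroring the optimization described in the abstract.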
