A Stochastic Gradient Langevin Dynamics Algorithm For Noise Intrinsic Federated Learning

1 Jan 2021 · Yan Shen, Jian Du, Chunwei Ma, Mingchen Gao, Benyu Zhang

Non-i.i.d. data distributions and differential privacy (DP) protection are two open problems in federated learning (FL). We address both problems by proposing the first noise-intrinsic FL training algorithm. In the proposed algorithm, we incorporate a stochastic gradient Langevin dynamics (SGLD) oracle into each local node's parameter-update phase. The SGLD oracle lowers the generalization error of local parameter learning and provides DP protection for the local nodes. We analyze our algorithm theoretically by formulating a min-max objective function and connecting its upper bound to the global loss function in FL. The convergence of our algorithm on non-convex functions is also established via the contraction and coupling rate of two random processes defined by stochastic differential equations (SDEs). A DP analysis of the proposed training algorithm and further experimental results are forthcoming.
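The abstract does not spell out the SGLD update rule, but the standard form perturbs a stochastic gradient step with Gaussian noise whose variance is tied to the step size, which is the source of both the implicit regularization and the privacy noise mentioned above. The sketch below is a minimal illustration of such a local update on an FL client, not the paper's implementation; the function name `local_sgld_step`, the inverse-temperature parameter `beta`, and the toy quadratic loss are all assumptions introduced here for illustration.

```python
import numpy as np

def local_sgld_step(theta, grad_fn, step_size, beta=1.0, rng=None):
    """One SGLD update: theta <- theta - eta * grad + N(0, 2*eta/beta * I).

    theta     : current parameter vector (np.ndarray)
    grad_fn   : callable returning a stochastic gradient of the local loss
    step_size : learning rate eta
    beta      : inverse temperature (hypothetical knob; larger beta -> less noise)
    """
    rng = rng or np.random.default_rng()
    noise = rng.normal(0.0, np.sqrt(2.0 * step_size / beta), size=theta.shape)
    return theta - step_size * grad_fn(theta) + noise

# Toy usage: a local node runs several SGLD steps on a quadratic loss
# 0.5 * ||theta - mu||^2 before sending parameters back for server averaging.
mu = np.array([1.0, -2.0])
grad = lambda th: th - mu
theta = np.zeros(2)
for _ in range(100):
    theta = local_sgld_step(theta, grad, step_size=0.01)
```

The injected noise serves double duty: it makes the local iterates samples from an (approximate) posterior rather than a point estimate, which is what the contraction/coupling analysis of the associated SDEs studies, and it is the randomness that a DP accounting argument would draw on.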
