Iterative Sketching and its Application to Federated Learning

29 Sep 2021 · Zhao Song, Zheng Yu, Lichen Zhang

The Johnson-Lindenstrauss lemma is one of the most valuable tools in machine learning, since it enables dimensionality reduction for a wide range of learning problems. In this paper, we exploit the power of the Fast-JL transform, also known as the sketching technique, and apply it to the federated learning setting. Federated learning is an emerging learning scheme that allows multiple clients to train models collaboratively without exchanging data. Though most federated learning frameworks only require clients and the server to send gradient information over the network, they still face the challenges of communication efficiency and data privacy. We show that by iteratively applying independent sketches combined with additive noise, one can achieve both goals simultaneously. In our framework, each client passes only a sketched gradient to the server, and de-sketches the averaged gradient received from the server to synchronize. This framework enjoys several benefits: (1) better privacy, since we only exchange randomly sketched gradients with low-dimensional noise, which is more robust against emerging gradient attacks; (2) lower communication cost per round, since our framework communicates only low-dimensional sketched gradients, which is particularly valuable over small-bandwidth channels; (3) no extra overall communication cost, as we prove that the introduced randomness does not increase the overall communication at all.
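To make the per-round protocol concrete, below is a minimal sketch of one communication round in Python/NumPy. For brevity it substitutes a Gaussian sketch matrix for the paper's Fast-JL transform; the dimensions, noise scale, and helper names (make_sketch, client_sketch, client_desketch) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def make_sketch(dim, sketch_dim, rng):
    # Gaussian sketch with E[S^T S] = I, standing in here for the
    # paper's Fast-JL transform (an illustrative substitution).
    return rng.normal(0.0, 1.0 / np.sqrt(sketch_dim), size=(sketch_dim, dim))

def client_sketch(grad, S, noise_std, rng):
    # Each client sends only a low-dimensional sketched gradient
    # plus additive noise, never the raw d-dimensional gradient.
    return S @ grad + rng.normal(0.0, noise_std, size=S.shape[0])

def client_desketch(avg_sketched, S):
    # De-sketch: S^T maps the averaged sketch back to d dimensions.
    # Unbiased since E[S^T S] = I; a single round is noisy, but fresh
    # independent sketches each round average the error out over training.
    return S.T @ avg_sketched

# One communication round with a fresh, shared sketch (e.g. from a common seed).
rng = np.random.default_rng(0)
d, b, n_clients = 1024, 64, 8            # model dim, sketch dim, clients (illustrative)
grads = [rng.normal(size=d) for _ in range(n_clients)]

S = make_sketch(d, b, rng)
msgs = [client_sketch(g, S, noise_std=0.01, rng=rng) for g in grads]
avg = np.mean(msgs, axis=0)              # server averages b-dimensional messages only
update = client_desketch(avg, S)         # each client recovers a d-dimensional estimate

true_avg = np.mean(grads, axis=0)
print(np.linalg.norm(update - true_avg) / np.linalg.norm(true_avg))
```

Note the communication per round is b floats rather than d, which is where the bandwidth saving in the abstract comes from.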
