Scalable and Differentially Private Distributed Aggregation in the Shuffled Model

19 Jun 2019 · Badih Ghazi, Rasmus Pagh, Ameya Velingker

Federated learning promises to make machine learning feasible on distributed, private datasets by implementing gradient descent using secure aggregation methods. The idea is to compute a global weight update without revealing the contributions of individual users...
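To illustrate the general idea of aggregation in the shuffled model (not the paper's specific protocol, and omitting its differential privacy noise), here is a toy Python sketch: each user additively secret-shares its value into several messages, a trusted shuffler randomly permutes all messages from all users, and the server sums the shuffled messages. The sum is preserved, but the shuffler hides which messages came from which user.

```python
import random

MODULUS = 1 << 32  # arithmetic is done modulo a fixed modulus


def split_into_shares(x, k):
    """Additive secret sharing: return k random values summing to x mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(k - 1)]
    shares.append((x - sum(shares)) % MODULUS)
    return shares


def shuffled_aggregate(user_values, k=3):
    """Each user sends k shares; the shuffler permutes them; the server sums."""
    messages = []
    for v in user_values:
        messages.extend(split_into_shares(v % MODULUS, k))
    random.shuffle(messages)  # the shuffler: breaks the link between user and message
    return sum(messages) % MODULUS  # server recovers only the aggregate


print(shuffled_aggregate([5, 17, 42]))  # prints 64, the sum of the inputs
```

Each individual share is uniformly random, so no single message reveals anything about its sender's value; only the total survives aggregation. The actual protocol in the paper additionally makes the output differentially private, which this sketch does not attempt.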




