1 code implementation • 9 Apr 2024 • Filip Granqvist, Congzheng Song, Áine Cahill, Rogier Van Dalen, Martin Pelikan, Yi Sheng Chan, Xiaojun Feng, Natarajan Krishnaswami, Vojta Jina, Mona Chitnis
Federated learning (FL) is an emerging machine learning (ML) training paradigm in which clients own their data and collaborate to train a global model, without revealing any data to the server or to other participants.
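The core FL aggregation step described here can be sketched with federated averaging (FedAvg): each client trains locally on its own data and sends back only a model update, which the server averages weighted by dataset size. This is a minimal illustrative toy (a 1-D linear model with names like `local_update` and `fedavg` invented here), not the implementation from the paper above.

```python
# Toy FedAvg sketch: clients never share raw data, only locally
# trained weights; the server averages them by dataset size.

def local_update(w, data, lr=0.1):
    """One pass of SGD on a 1-D linear model y = w * x (toy example)."""
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of the squared error (w*x - y)^2
        w -= lr * grad
    return w

def fedavg(global_w, client_datasets):
    """Server round: collect client weights, average weighted by data size."""
    total = sum(len(d) for d in client_datasets)
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    return sum(w * n for w, n in updates) / total

# Two clients whose private data both follow y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = fedavg(w, clients)
```

After a few dozen rounds the global weight converges to the true slope 2.0, even though the server only ever sees per-client weights, never the (x, y) pairs themselves.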
no code implementations • 14 Feb 2024 • Tao Yu, Congzheng Song, Jianyu Wang, Mona Chitnis
Asynchronous protocols have been shown to improve the scalability of federated learning (FL) with a massive number of clients.
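One common way asynchronous FL achieves this scalability is to let the server apply each client update as it arrives, down-weighting stale updates that were computed against an older model version rather than blocking on a full synchronous round. The polynomial staleness discount below is one conventional choice for illustration only, not the scheme proposed in the paper above.

```python
# Sketch of asynchronous FL aggregation with staleness weighting.
# The server keeps a version counter; updates based on older model
# versions are applied with a smaller weight.

def staleness_weight(staleness, alpha=0.5):
    """Discount an update by how many versions behind it was computed."""
    return alpha / (1.0 + staleness)

def apply_async_update(global_w, client_delta, client_version, server_version):
    """Apply one client's update immediately; return new model and version."""
    staleness = server_version - client_version
    w = staleness_weight(staleness)
    return global_w + w * client_delta, server_version + 1
```

A fresh update (staleness 0) moves the model by half its delta here; an update one version stale moves it by a quarter, so slow clients still contribute without stalling fast ones.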
no code implementations • 27 Jul 2023 • Kunal Talwar, Shan Wang, Audra McMillan, Vojta Jina, Vitaly Feldman, Bailey Basile, Aine Cahill, Yi Sheng Chan, Mike Chatzidakis, Junye Chen, Oliver Chick, Mona Chitnis, Suman Ganta, Yusuf Goren, Filip Granqvist, Kristine Guo, Frederic Jacobs, Omid Javidbakht, Albert Liu, Richard Low, Dan Mascenik, Steve Myers, David Park, Wonhee Park, Gianni Parsa, Tommy Pauly, Christian Priebe, Rehan Rishi, Guy Rothblum, Michael Scaria, Linmao Song, Congzheng Song, Karl Tarbe, Sebastian Vogt, Luke Winstrom, Shundong Zhou
We revisit the problem of designing scalable protocols for private statistics and private federated learning when each device holds its private data.
no code implementations • 14 Jul 2023 • Tatsuki Koga, Congzheng Song, Martin Pelikan, Mona Chitnis
Federated learning (FL) combined with differential privacy (DP) enables machine learning (ML) training across distributed devices with a formal privacy guarantee.
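The standard mechanism for combining FL with DP is to clip each client's model update to a fixed L2 bound and add Gaussian noise calibrated to that bound during aggregation. The sketch below shows this pattern with conventional parameter names (`clip_norm`, `noise_multiplier`); it is a hedged illustration of the generic technique, not the method from the paper above.

```python
import math
import random

def clip_update(update, clip_norm):
    """Scale the update down so its L2 norm is at most clip_norm."""
    norm = math.sqrt(sum(u * u for u in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [u * scale for u in update]

def dp_aggregate(updates, clip_norm, noise_multiplier, rng):
    """Average clipped client updates and add Gaussian noise for DP."""
    clipped = [clip_update(u, clip_norm) for u in updates]
    n = len(updates)
    avg = [sum(vals) / n for vals in zip(*clipped)]
    sigma = noise_multiplier * clip_norm / n  # per-coordinate noise stddev
    return [a + rng.gauss(0.0, sigma) for a in avg]

rng = random.Random(0)
updates = [[3.0, 4.0], [0.6, 0.8]]  # L2 norms 5.0 and 1.0
noisy_avg = dp_aggregate(updates, clip_norm=1.0, noise_multiplier=1.0, rng=rng)
```

Clipping bounds any single client's influence on the average, which is what lets the added Gaussian noise translate into a formal (epsilon, delta)-DP guarantee via standard accounting.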