1 code implementation • 9 Apr 2024 • Filip Granqvist, Congzheng Song, Áine Cahill, Rogier Van Dalen, Martin Pelikan, Yi Sheng Chan, Xiaojun Feng, Natarajan Krishnaswami, Vojta Jina, Mona Chitnis
Federated learning (FL) is an emerging machine learning (ML) training paradigm in which clients own their data and collaborate to train a global model, without revealing any data to the server or to other participants.
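The paradigm described above can be sketched with a minimal federated averaging (FedAvg-style) loop: each client runs local SGD on its own data, and the server only ever sees model updates, never raw data. This is an illustrative toy (a least-squares objective with synthetic clients), not the method of any of the listed papers; all names here are hypothetical.

```python
import numpy as np

def local_update(model, data, lr=0.1, steps=5):
    """Run a few SGD steps on one client's private data (least squares)."""
    w = model.copy()
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5 * ||Xw - y||^2 / n
        w -= lr * grad
    return w

def fed_avg(global_model, client_datasets, rounds=10):
    """Server averages client updates; raw data never leaves the clients."""
    w = global_model
    for _ in range(rounds):
        updates = [local_update(w, d) for d in client_datasets]
        w = np.mean(updates, axis=0)  # simple unweighted average
    return w

# Synthetic clients sharing the same underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = fed_avg(np.zeros(2), clients, rounds=20)
print(np.round(w, 2))  # converges close to true_w = [2.0, -1.0]
```

Real systems add secure aggregation, client sampling, and weighted averaging by dataset size; this sketch keeps only the core server/client split.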
1 code implementation • 18 Jul 2022 • Congzheng Song, Filip Granqvist, Kunal Talwar
We believe FLAIR can serve as a challenging benchmark for advancing the state-of-the-art in federated learning.
no code implementations • 6 Aug 2020 • Filip Granqvist, Matt Seigel, Rogier Van Dalen, Áine Cahill, Stephen Shum, Matthias Paulik
From these features, the model predicts speaker-characteristic labels that are considered useful as side information.
no code implementations • 16 Feb 2021 • Matthias Paulik, Matt Seigel, Henry Mason, Dominic Telaar, Joris Kluivers, Rogier Van Dalen, Chi Wai Lau, Luke Carlson, Filip Granqvist, Chris Vandevelde, Sudeep Agarwal, Julien Freudiger, Andrew Byde, Abhishek Bhowmick, Gaurav Kapoor, Si Beaumont, Áine Cahill, Dominic Hughes, Omid Javidbakht, Fei Dong, Rehan Rishi, Stanley Hung
We describe the design of our federated task processing system.
no code implementations • 17 Sep 2021 • Borja Rodríguez-Gálvez, Filip Granqvist, Rogier Van Dalen, Matt Seigel
This paper introduces an algorithm to enforce group fairness in private federated learning, where users' data does not leave their devices.
no code implementations • 18 Jul 2022 • MingBin Xu, Congzheng Song, Ye Tian, Neha Agrawal, Filip Granqvist, Rogier Van Dalen, Xiao Zhang, Arturo Argueta, Shiyi Han, Yaqiao Deng, Leo Liu, Anmol Walia, Alex Jin
Our goal is to train a large neural network language model (NNLM) on compute-constrained devices while preserving privacy using FL and DP.
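A common way to combine FL with DP, as this goal implies, is to clip each client's model update and add calibrated Gaussian noise before averaging (DP-FedAvg-style aggregation). The sketch below is a generic illustration under assumed parameters (`clip_norm`, `noise_mult`), not the specific mechanism of the paper above.

```python
import numpy as np

def clip_and_noise_aggregate(updates, clip_norm=1.0, noise_mult=0.5, rng=None):
    """Clip each client update to L2 norm clip_norm, sum, add Gaussian
    noise scaled to the clipping bound, then average (DP-FedAvg style)."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for u in updates:
        norm = np.linalg.norm(u)
        clipped.append(u * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise std is proportional to the per-client sensitivity (clip_norm).
    noise = rng.normal(scale=noise_mult * clip_norm, size=total.shape)
    return (total + noise) / len(updates)

updates = [np.array([3.0, 4.0]), np.array([0.3, -0.4])]
# With noise_mult=0 the result is just the average of the clipped updates:
# [3, 4] has norm 5, so it is scaled to [0.6, 0.8]; [0.3, -0.4] is unchanged.
agg = clip_and_noise_aggregate(updates, clip_norm=1.0, noise_mult=0.0)
print(agg)  # [0.45  0.2 ]
```

The privacy guarantee then follows from accounting over rounds (e.g. with the moments accountant); the clipping bound trades off bias against the noise needed for a given privacy budget.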
no code implementations • 27 Jul 2023 • Kunal Talwar, Shan Wang, Audra McMillan, Vojta Jina, Vitaly Feldman, Bailey Basile, Aine Cahill, Yi Sheng Chan, Mike Chatzidakis, Junye Chen, Oliver Chick, Mona Chitnis, Suman Ganta, Yusuf Goren, Filip Granqvist, Kristine Guo, Frederic Jacobs, Omid Javidbakht, Albert Liu, Richard Low, Dan Mascenik, Steve Myers, David Park, Wonhee Park, Gianni Parsa, Tommy Pauly, Christian Priebe, Rehan Rishi, Guy Rothblum, Michael Scaria, Linmao Song, Congzheng Song, Karl Tarbe, Sebastian Vogt, Luke Winstrom, Shundong Zhou
We revisit the problem of designing scalable protocols for private statistics and private federated learning when each device holds its own private data.