1 code implementation • 12 Oct 2022 • Bart Cox, Lydia Y. Chen, Jérémie Decouchant
Federated Learning (FL) is a popular approach for distributed deep learning that prevents the pooling of large amounts of data in a central server.
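The server-side averaging that keeps raw data on the clients can be sketched as follows (a minimal FedAvg-style illustration on a linear model; function names and hyperparameters here are illustrative, not from the paper):

```python
import numpy as np

def client_update(weights, X, y, lr=0.1, epochs=5):
    """One client trains locally on its private data (linear regression, SGD)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def federated_averaging(global_w, client_data, rounds=10):
    """Server averages locally trained weights; raw data never leaves the clients."""
    for _ in range(rounds):
        local_ws = [client_update(global_w, X, y) for X, y in client_data]
        global_w = np.mean(local_ws, axis=0)
    return global_w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))  # each client holds its own shard

w = federated_averaging(np.zeros(2), clients)
print(np.round(w, 2))  # converges toward the true weights [2., -1.]
```

Only model parameters cross the network; the per-client datasets `X, y` stay local, which is the property the abstract refers to.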
no code implementations • 22 Aug 2022 • Rui Wang, Xingkai Wang, Huanhuan Chen, Jérémie Decouchant, Stjepan Picek, Nikolaos Laoutaris, Kaitai Liang
It is therefore currently impossible to ensure Byzantine robustness and confidentiality of updates without assuming a semi-honest majority.
no code implementations • 17 Aug 2022 • Túlio Pascoal, Jérémie Decouchant, Antoine Boutet, Marcus Völp
We introduce I-GWAS, a novel framework that securely computes and releases the results of multiple, possibly interdependent, GWASes.
no code implementations • 28 Apr 2022 • Jin Xu, Chi Hong, Jiyue Huang, Lydia Y. Chen, Jérémie Decouchant
Recent reconstruction attacks apply gradient-inversion optimization to the gradient update of a single minibatch to reconstruct the private data used by clients during training.
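To see why shared gradients leak private inputs at all, consider the well-known special case of a single fully-connected layer, where the input can be recovered from the gradients in closed form (a toy illustration, not the paper's attack; iterative gradient-inversion generalizes this idea to deep networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Client's private input and a linear layer z = W @ x + b trained with MSE loss.
x_private = rng.normal(size=4)
W = rng.normal(size=(3, 4))
b = rng.normal(size=3)
target = rng.normal(size=3)

# Gradients the client would share after one step on this example.
z = W @ x_private + b
dz = 2 * (z - target)             # dL/dz
grad_W = np.outer(dz, x_private)  # dL/dW = (dL/dz) x^T
grad_b = dz                       # dL/db = dL/dz

# Attacker: row i of dL/dW equals (dL/db_i) * x, so one division recovers x.
x_recovered = grad_W[0] / grad_b[0]
print(np.allclose(x_recovered, x_private))  # True
```

For deeper models no such closed form exists, which is why the attacks mentioned above instead optimize a dummy input so that its gradient matches the observed update.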
no code implementations • 23 Apr 2022 • Federico Lucchetti, Jérémie Decouchant, Maria Fernandes, Lydia Y. Chen, Marcus Völp
Federated learning allows clients to collaboratively train models on datasets that are acquired in different locations and that cannot be exchanged because of their size or because of regulatory constraints.