Search Results for author: Roman Rischke

Found 3 papers, 1 paper with code

FedAUXfdp: Differentially Private One-Shot Federated Distillation

no code implementations • 30 May 2022 • Haley Hoech, Roman Rischke, Karsten Müller, Wojciech Samek

Federated learning suffers in the case of non-iid local datasets, i.e., when the distributions of the clients' data are heterogeneous.

Federated Learning

FedAUX: Leveraging Unlabeled Auxiliary Data in Federated Learning

1 code implementation • 4 Feb 2021 • Felix Sattler, Tim Korjakow, Roman Rischke, Wojciech Samek

Federated Distillation (FD) is a popular novel algorithmic paradigm for Federated Learning. By distilling the client predictions on an unlabeled auxiliary dataset into a student model, it achieves training performance competitive with prior parameter-averaging methods while additionally allowing the clients to train different model architectures.
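A minimal sketch of the Federated Distillation idea described in this abstract, using plain PyTorch and synthetic data. The model sizes, the plain averaging of client predictions, and the KL-based distillation loss are illustrative assumptions, not the FedAUX method itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
NUM_CLIENTS, NUM_CLASSES, DIM = 3, 5, 20

# Clients may train different architectures; here they are small MLPs (assumption).
clients = [nn.Sequential(nn.Linear(DIM, 32), nn.ReLU(), nn.Linear(32, NUM_CLASSES))
           for _ in range(NUM_CLIENTS)]

# Shared unlabeled auxiliary dataset (synthetic stand-in).
aux_data = torch.randn(256, DIM)

# 1) Each client produces soft predictions on the auxiliary set.
with torch.no_grad():
    client_probs = torch.stack([F.softmax(m(aux_data), dim=1) for m in clients])

# 2) The server aggregates the client predictions (plain average here).
consensus = client_probs.mean(dim=0)

# 3) The server distills the consensus soft labels into a student model.
student = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for epoch in range(5):
    log_probs = F.log_softmax(student(aux_data), dim=1)
    loss = F.kl_div(log_probs, consensus, reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"distillation epoch {epoch}: KL loss {loss.item():.4f}")
```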

Federated Learning · Unsupervised Pre-training

Communication-Efficient Federated Distillation

no code implementations • 1 Dec 2020 • Felix Sattler, Arturo Marban, Roman Rischke, Wojciech Samek

Communication constraints are one of the major challenges preventing the widespread adoption of Federated Learning systems.

Federated Learning · Image Classification · +2
