Search Results for author: Tim Korjakow

Found 3 papers, 2 papers with code

Explanation Strategies as an Empirical-Analytical Lens for Socio-Technical Contextualization of Machine Learning Interpretability

1 code implementation • 24 Sep 2021 • Jesse Josua Benjamin, Christoph Kinkeldey, Claudia Müller-Birn, Tim Korjakow, Eva-Maria Herbst

During a research project in which we developed a machine learning (ML) driven visualization system for non-ML experts, we reflected on interpretability research in ML, computer-supported cooperative work and human-computer interaction.

Philosophy

FedAUX: Leveraging Unlabeled Auxiliary Data in Federated Learning

1 code implementation • 4 Feb 2021 • Felix Sattler, Tim Korjakow, Roman Rischke, Wojciech Samek

Federated Distillation (FD) is a popular novel algorithmic paradigm for Federated Learning: by distilling client predictions on an unlabeled auxiliary dataset into a student model, it achieves training performance competitive with prior parameter-averaging methods while additionally allowing clients to train different model architectures (see the sketch after the tags below).

Federated Learning • Unsupervised Pre-training
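
A minimal sketch of the federated-distillation idea summarized in the abstract above: clients share only their predictions on a common unlabeled auxiliary set, and a student model is trained to match the aggregated predictions. All names (`aggregate_soft_labels`, `distill_into_student`, the toy models) are hypothetical illustrations, not the FedAUX implementation.

```python
# Sketch of federated distillation on an unlabeled auxiliary set (not the FedAUX code).
import torch
import torch.nn.functional as F

def aggregate_soft_labels(client_models, aux_data):
    """Average each client's softmax predictions on the shared unlabeled auxiliary data."""
    with torch.no_grad():
        probs = [F.softmax(m(aux_data), dim=1) for m in client_models]
    return torch.stack(probs).mean(dim=0)

def distill_into_student(student, aux_data, soft_labels, epochs=10, lr=1e-3):
    """Train the student to match the aggregated client predictions via a KL-divergence loss."""
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        log_probs = F.log_softmax(student(aux_data), dim=1)
        loss = F.kl_div(log_probs, soft_labels, reduction="batchmean")
        loss.backward()
        opt.step()
    return student

if __name__ == "__main__":
    # Clients may use different architectures; only predictions are communicated.
    aux_data = torch.randn(128, 20)                       # unlabeled auxiliary set
    client_models = [torch.nn.Linear(20, 5) for _ in range(3)]
    student = torch.nn.Sequential(torch.nn.Linear(20, 16),
                                  torch.nn.ReLU(),
                                  torch.nn.Linear(16, 5))
    soft_labels = aggregate_soft_labels(client_models, aux_data)
    distill_into_student(student, aux_data, soft_labels)
```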
