Search Results for author: Jingwen Shi

Found 1 paper, 1 paper with code

Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion

1 code implementation • 23 Jul 2022 • Xu Zhou, Xinyu Lei, Cong Yang, Yichun Shi, Xiao Zhang, Jingwen Shi

The key idea in FedKF is to let the server return the global knowledge to be fused with the local knowledge in each training round so that the local model can be regularized towards the global optima.
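The fusion described above can be sketched as a distillation-regularized local objective: the client minimizes its task loss plus a divergence between its own predictions and the global knowledge returned by the server. This is a minimal illustrative sketch, not the paper's actual FedKF implementation; the function names and the `alpha`/`temperature` weighting scheme are assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Numerically stable softmax with optional temperature softening."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_regularized_loss(local_logits, global_logits, label,
                        alpha=0.5, temperature=2.0):
    """Hypothetical fused objective for one client sample.

    Combines the local task loss with a distillation term that pulls the
    local model toward the server-provided global knowledge.
    """
    # Task loss: cross-entropy of the local prediction vs. the true label.
    p = softmax(local_logits)
    ce = -np.log(p[label] + 1e-12)
    # Distillation loss: KL divergence from the softened global (teacher)
    # distribution to the softened local (student) distribution.
    pt = softmax(global_logits, temperature)
    ps = softmax(local_logits, temperature)
    kl = np.sum(pt * (np.log(pt + 1e-12) - np.log(ps + 1e-12)))
    # Fused objective: local knowledge regularized toward the global optima.
    return (1 - alpha) * ce + alpha * (temperature ** 2) * kl
```

With `alpha=0`, this reduces to ordinary local training; larger `alpha` values weight the global-knowledge regularizer more heavily, which is the mechanism the abstract attributes to FedKF.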

Tasks: Data-free Knowledge Distillation, Fairness, +2
