no code implementations • 4 Feb 2023 • Xiangrong Zhu, Guangyao Li, Wei Hu
To cope with the drift between local optimization and global convergence caused by data heterogeneity, we propose mutual knowledge distillation, which transfers local knowledge to the global model and absorbs global knowledge back into the local models.