no code implementations • 2 Feb 2024 • Liping Yi, Han Yu, Chao Ren, Heng Zhang, Gang Wang, Xiaoguang Liu, Xiaoxiao Li
It assigns a shared homogeneous small feature extractor and a local gating network to each client's local heterogeneous large model.
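The described architecture can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer shapes, the two-way softmax gate, and the fusion rule are all assumptions, and random matrices stand in for trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(in_dim, out_dim):
    # Hypothetical helper: random weights standing in for trained layers.
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Shared homogeneous small extractor (same shape on every client).
W_small = layer(16, 8)
# Client-specific heterogeneous large extractor (its internals may differ per client).
W_large = layer(16, 8)
# Local gating network: maps the input to mixing weights over the two extractors.
W_gate = layer(16, 2)

def forward(x):
    h_small = x @ W_small              # generalized representation
    h_large = x @ W_large              # personalized representation
    g = np.exp(x @ W_gate)
    g = g / g.sum()                    # softmax gate over the two branches
    return g[0] * h_small + g[1] * h_large  # gated fusion of both representations

x = rng.standard_normal(16)
rep = forward(x)                       # fused representation of dimension 8
```

Only the small extractor's (homogeneous) weights would be shared with the server; the large model and the gate stay local, which is what keeps the per-client models heterogeneous and personalized.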
1 code implementation • 14 Dec 2023 • Liping Yi, Han Yu, Zhuan Shi, Gang Wang, Xiaoguang Liu, Lizhen Cui, Xiaoxiao Li
Existing MHPFL approaches often rely on a public dataset of the same nature as the learning task, or incur high computation and communication costs.
no code implementations • 12 Nov 2023 • Liping Yi, Han Yu, Gang Wang, Xiaoguang Liu
To allow each data owner (a.k.a. FL client) to train a heterogeneous, personalized local model based on its local data distribution, system resources, and model-structure requirements, the field of model-heterogeneous personalized federated learning (MHPFL) has emerged.
no code implementations • 20 Oct 2023 • Liping Yi, Han Yu, Gang Wang, Xiaoguang Liu, Xiaoxiao Li
Federated learning (FL) is an emerging machine learning paradigm in which a central server coordinates multiple participants (clients) to collaboratively train models on decentralized data.
3 code implementations • 23 Mar 2023 • Liping Yi, Gang Wang, Xiaoguang Liu, Zhuan Shi, Han Yu
It is a communication- and computation-efficient model-heterogeneous FL framework that trains a shared generalized global prediction header at the FL server, using representations extracted by the heterogeneous feature extractors of clients' models.
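The server-side step described above can be sketched as follows. This is an illustrative toy, not the paper's code: the representation dimension, the single-layer softmax header, the SGD update, and the synthetic class prototypes standing in for client-uploaded representations are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Clients have heterogeneous extractors, but all emit representations of the
# same dimension, so one shared global header fits every client.
REP_DIM, N_CLASSES = 8, 3
W_header = np.zeros((REP_DIM, N_CLASSES))  # shared global prediction header

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def server_update(rep, label, lr=0.1):
    """One SGD step on the global header from an uploaded (representation, label) pair."""
    global W_header
    probs = softmax(rep @ W_header)
    # Cross-entropy gradient w.r.t. the header weights.
    grad = np.outer(rep, probs - np.eye(N_CLASSES)[label])
    W_header -= lr * grad

# Synthetic stand-in for client uploads: noisy representations around class prototypes.
prototypes = rng.standard_normal((N_CLASSES, REP_DIM))
for _ in range(200):
    label = int(rng.integers(N_CLASSES))
    rep = prototypes[label] + 0.1 * rng.standard_normal(REP_DIM)
    server_update(rep, label)

# After training, the header classifies prototype representations by their class.
preds = [int(np.argmax(prototypes[c] @ W_header)) for c in range(N_CLASSES)]
```

Because only low-dimensional representations travel to the server and only the small header is trained there, both communication and server-side computation stay cheap, which matches the efficiency claim above.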