no code implementations • 16 Aug 2023 • Qinglun Li, Li Shen, Guanghao Li, Quanjun Yin, Dacheng Tao
To address the communication burden issues associated with federated learning (FL), decentralized federated learning (DFL) discards the central server and establishes a decentralized communication network, where each client communicates only with neighboring clients.
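The server-free communication pattern described above can be illustrated with a minimal gossip-averaging sketch. This is a hypothetical toy example (the topology, round count, and function names are assumptions, not the paper's algorithm): each client averages its parameters only with its neighbors, and repeated rounds drive all clients toward consensus.

```python
import numpy as np

def dfl_round(models, neighbors):
    """One decentralized round: each client averages its parameters
    with those of its neighbors (plus itself); no central server."""
    return [
        np.mean([models[i]] + [models[j] for j in neighbors[i]], axis=0)
        for i in range(len(models))
    ]

# Illustrative setup: 4 clients on a ring, each talking only to its
# left and right neighbor.
models = [np.array([float(i)]) for i in range(4)]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(50):
    models = dfl_round(models, ring)
# After many rounds all clients converge to the global mean (1.5).
```

Each round exchanges parameters only along graph edges, which is the communication saving that motivates DFL over a central-server design.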
no code implementations • 15 Mar 2023 • Guanghao Li, Wansen Wu, Yan Sun, Li Shen, Baoyuan Wu, Dacheng Tao
The local model is then trained on inputs that combine the raw data with a visual prompt, so that it learns the distribution information carried by the prompt.
no code implementations • 24 Feb 2023 • Guanghao Li, Li Shen, Yan Sun, Yue Hu, Han Hu, Dacheng Tao
Federated learning (FL) enables multiple clients to train a machine learning model collaboratively without exchanging their local data.
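The standard FL training loop this sentence refers to can be sketched as FedAvg-style averaging. This is a generic illustration of the FL setup, not the paper's specific method; the linear-regression task, learning rate, and function names are assumptions.

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    """One gradient step of least-squares regression on a client's
    private data; raw data never leaves the client."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg_round(w, client_data):
    """Server averages the clients' updated models, weighted by
    local dataset size (the FedAvg aggregation rule)."""
    updates = [local_step(w.copy(), X, y) for X, y in client_data]
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

# Illustrative data: 3 clients sharing the same underlying model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = [(X := rng.normal(size=(20, 2)), X @ true_w) for _ in range(3)]

w = np.zeros(2)
for _ in range(200):
    w = fedavg_round(w, clients)
# w approaches true_w without any client revealing its raw data.
```

Only model parameters cross the network; the per-client datasets stay local, which is the privacy property the abstract highlights.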
no code implementations • 21 Jun 2022 • Guanghao Li, Yue Hu, Miao Zhang, Ji Liu, Quanjun Yin, Yong Peng, Dejing Dou
Because training efficiency in a ring topology favors devices with homogeneous resources, grouping clients by computing capacity mitigates straggler effects.
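The capacity-based grouping idea can be sketched as follows. This is a hypothetical illustration (the client records, capacities, and ring size are invented for the example): sorting clients by compute capacity and slicing them into rings keeps each ring's devices homogeneous, so no fast device waits on a much slower one.

```python
def group_by_capacity(clients, ring_size):
    """Sort clients by computing capacity, then slice the sorted list
    into rings of similar-speed devices to mitigate stragglers."""
    ordered = sorted(clients, key=lambda c: c["capacity"])
    return [ordered[i:i + ring_size] for i in range(0, len(ordered), ring_size)]

# Illustrative fleet: two slow devices and two fast ones.
clients = [
    {"id": "a", "capacity": 1.0},
    {"id": "b", "capacity": 9.0},
    {"id": "c", "capacity": 1.2},
    {"id": "d", "capacity": 8.5},
]
rings = group_by_capacity(clients, ring_size=2)
# Slow devices (a, c) share one ring; fast devices (d, b) share another.
```

Within each ring, the round time is set by the slowest member, so homogeneous rings minimize idle waiting overall.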
no code implementations • 25 Jan 2021 • Ning Ge, Guanghao Li, Li Zhang, Yi Liu
Data protection across organizations is limiting the application of centralized learning (CL) techniques.
no code implementations • 1 Dec 2020 • Yi Liu, Li Zhang, Ning Ge, Guanghao Li
In this process, the server uses an incentive mechanism to encourage clients to contribute high-quality and large-volume data to improve the global model.
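One simple way such an incentive mechanism could work is a proportional reward: pay each client in proportion to its data volume scaled by a quality score. This scoring rule and all names here are assumptions for illustration, not the paper's actual mechanism.

```python
def allocate_rewards(budget, contributions):
    """Split a fixed reward budget across clients in proportion to
    (data volume x quality score), so contributing more high-quality
    data earns a larger share. Illustrative rule, not the paper's."""
    scores = {cid: c["volume"] * c["quality"] for cid, c in contributions.items()}
    total = sum(scores.values())
    return {cid: budget * s / total for cid, s in scores.items()}

rewards = allocate_rewards(100.0, {
    "c1": {"volume": 1000, "quality": 0.9},  # large, high-quality dataset
    "c2": {"volume": 500, "quality": 0.5},   # smaller, noisier dataset
})
# c1 receives the larger share of the budget.
```

Because the payout grows with both volume and quality, clients are nudged toward exactly the behavior the abstract describes.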