no code implementations • 16 May 2024 • Tianqu Kang, Lumin Liu, Hengtao He, Jun Zhang, S. H. Song, Khaled B. Letaief
To enhance privacy, FL can be combined with Differential Privacy (DP), which involves adding Gaussian noise to the model weights.
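For intuition, here is a minimal sketch of the Gaussian mechanism as it is typically applied in DP-FL: each client clips its update to a sensitivity bound and adds calibrated Gaussian noise before uploading. The function name and parameters (`clip_norm`, `noise_multiplier`) are illustrative, not the paper's settings.

```python
import numpy as np

def gaussianize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client's model update to an L2 bound, then add Gaussian noise.

    A sketch of the standard Gaussian mechanism used in DP federated
    learning; clip_norm and noise_multiplier are illustrative values.
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    # Bound each client's sensitivity by clipping the update's L2 norm.
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    # Noise scale is proportional to the sensitivity bound (clip_norm).
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise
```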
no code implementations • 7 Aug 2023 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
To improve communication efficiency and achieve a better privacy-utility trade-off, we propose a communication-efficient FL training algorithm with a differential privacy guarantee.
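The snippet does not spell out the algorithm, so the following is only a generic illustration of how communication saving and DP are often combined, here via top-k sparsification followed by clipping and Gaussian noise; it is not the paper's method, and the function name and parameters are hypothetical.

```python
import numpy as np

def sparsify_then_privatize(update, k_frac=0.01, clip_norm=1.0,
                            noise_multiplier=1.0, rng=None):
    """Illustrative pairing of top-k sparsification (communication saving)
    with the Gaussian mechanism (privacy). A generic sketch only."""
    rng = rng or np.random.default_rng()
    flat = update.ravel().copy()
    k = max(1, int(k_frac * flat.size))
    # Keep only the k largest-magnitude coordinates; zero the rest.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    # Clip the transmitted (sparse) update, then add noise to the kept entries.
    norm = np.linalg.norm(sparse)
    sparse *= min(1.0, clip_norm / (norm + 1e-12))
    sparse[idx] += rng.normal(0.0, noise_multiplier * clip_norm, size=k)
    return sparse.reshape(update.shape)
```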
no code implementations • 20 Jul 2023 • Jiawei Shao, Zijian Li, Wenqiang Sun, Tailin Zhou, Yuchang Sun, Lumin Liu, Zehong Lin, Yuyi Mao, Jun Zhang
Without data centralization, FL allows clients to share local information in a privacy-preserving manner.
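As a concrete instance of such sharing, a minimal FedAvg-style aggregation sketch: clients upload model weights (never raw data) and the server computes a sample-weighted average. Names and signatures here are illustrative, not from the paper.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """Weighted average of client models, as in FedAvg.

    client_weights: list of 1-D numpy arrays (flattened models).
    client_sizes:   number of local samples per client (aggregation weights).
    """
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))
```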
no code implementations • 14 Mar 2022 • Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief
Federated Distillation (FD) is a recently proposed alternative that enables communication-efficient and robust FL: it reduces the communication overhead by orders of magnitude compared with FedAvg and can flexibly handle heterogeneous models at the clients.
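A sketch of the logit-sharing setup commonly associated with FD, in which clients exchange predictions on a shared public dataset rather than model weights; that is where the communication saving comes from, since the payload scales with the public set and class count rather than model size. This is one common FD variant under assumed names, not necessarily this paper's exact protocol.

```python
import numpy as np

def fd_round(client_logit_fns, public_x, temperature=1.0):
    """One Federated Distillation round under the logit-averaging setup.

    client_logit_fns: callables mapping inputs -> logits (one per client).
    Returns the averaged "teacher" logits each client would distill from.
    """
    # Each client uploads only its logits on the public data; the model
    # weights never leave the client, and heterogeneous architectures are
    # fine as long as they produce logits over the same classes.
    all_logits = np.stack([f(public_x) for f in client_logit_fns])
    avg_logits = all_logits.mean(axis=0)
    # Clients then minimize a distillation loss locally, e.g. KL divergence
    # between their softened predictions and these softened teacher logits.
    return avg_logits / temperature
```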
no code implementations • 26 Mar 2021 • Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief
Hierarchical FL, with a client-edge-cloud aggregation hierarchy, can effectively leverage both the cloud server's access to many clients' data and the edge servers' closeness to the clients to achieve high communication efficiency.
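In FedAvg-style, sample-weighted notation (an assumption; the paper's exact weighting may differ), the two aggregation levels can be written as:

```latex
% n_i: samples at client i;  \mathcal{C}_\ell: clients attached to edge server \ell
w^{\mathrm{edge}}_{\ell} \;=\;
  \frac{\sum_{i \in \mathcal{C}_{\ell}} n_i\, w_i}{\sum_{i \in \mathcal{C}_{\ell}} n_i},
\qquad
w^{\mathrm{cloud}} \;=\;
  \frac{\sum_{\ell} \bigl(\sum_{i \in \mathcal{C}_{\ell}} n_i\bigr)\, w^{\mathrm{edge}}_{\ell}}
       {\sum_{\ell} \sum_{i \in \mathcal{C}_{\ell}} n_i}.
```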
1 code implementation • 16 May 2019 • Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief
To combine their advantages, we propose a client-edge-cloud hierarchical Federated Learning system, supported by the HierFAVG algorithm, which allows multiple edge servers to perform partial model aggregation.
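A sketch of one HierFAVG-style cloud round under the two-level schedule above: clients run local SGD, each edge server partially aggregates its own clients after every block of local steps, and after `kappa2` edge aggregations the cloud aggregates the edge models. This is an illustration, not the authors' implementation; `kappa1`, `kappa2`, and `local_train` are hypothetical stand-ins.

```python
import numpy as np

def hierfavg_round(edge_groups, local_train, kappa2=2):
    """One cloud round of a HierFAVG-style loop (sketch, not the authors' code).

    edge_groups: list of edges; each edge is a list of (weights, n_samples).
    local_train: function weights -> weights, running kappa1 local SGD steps.
    """
    def weighted_avg(models, sizes):
        sizes = np.asarray(sizes, dtype=float)
        coeffs = sizes / sizes.sum()
        return sum(c * m for c, m in zip(coeffs, models))

    edge_models, edge_sizes = [], []
    for clients in edge_groups:
        models = [w for (w, _) in clients]
        sizes = [n for (_, n) in clients]
        for _ in range(kappa2):          # kappa2 edge aggregations per cloud round
            models = [local_train(m) for m in models]     # kappa1 local steps each
            edge_model = weighted_avg(models, sizes)      # partial (edge) aggregation
            models = [edge_model.copy() for _ in models]  # broadcast back to clients
        edge_models.append(edge_model)
        edge_sizes.append(sum(sizes))
    # Cloud aggregation over edge models, weighted by total samples per edge.
    return weighted_avg(edge_models, edge_sizes)
```

Clients thus communicate only with their nearby edge server most of the time, and the expensive edge-to-cloud exchange happens once every `kappa2` edge rounds, which is the source of the communication saving the entry describes.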