Search Results for author: Lumin Liu

Found 5 papers, 1 paper with code

Binary Federated Learning with Client-Level Differential Privacy

no code implementations · 7 Aug 2023 · Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief

To improve communication efficiency and achieve a better privacy-utility trade-off, we propose a communication-efficient FL training algorithm with a differential privacy guarantee.
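Client-level differential privacy in FL is typically obtained by clipping each client's model update and adding calibrated Gaussian noise at aggregation time. The sketch below illustrates that standard mechanism only; it omits the binarization step that gives the paper its communication efficiency, and all parameter names (`clip_norm`, `noise_mult`) are illustrative, not the paper's.

```python
import numpy as np

def dp_aggregate(client_updates, clip_norm=1.0, noise_mult=1.0, rng=None):
    """Aggregate client updates with client-level DP (illustrative sketch).

    Each client's update is clipped to `clip_norm` in L2 norm, then Gaussian
    noise scaled to the clipping bound is added to the average, so no single
    client's contribution can dominate or be identified.
    """
    rng = rng or np.random.default_rng(0)
    clipped = []
    for u in client_updates:
        norm = np.linalg.norm(u)
        # Scale down any update whose L2 norm exceeds the clipping bound.
        clipped.append(u * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Noise standard deviation is calibrated to the per-client sensitivity.
    sigma = noise_mult * clip_norm / len(client_updates)
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```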

Federated Learning · Privacy Preserving

Communication-Efficient Federated Distillation with Active Data Sampling

no code implementations · 14 Mar 2022 · Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief

Federated Distillation (FD) is a recently proposed alternative that enables communication-efficient and robust FL: it achieves an orders-of-magnitude reduction in communication overhead compared with FedAvg and flexibly handles heterogeneous models at the clients.
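FD saves communication because clients exchange model outputs (logits on a shared dataset) rather than full weight vectors. A minimal sketch of the server-side step is below; the paper's active data sampling is omitted, and the `temperature` parameter is the usual distillation knob, not something specified by the abstract.

```python
import numpy as np

def federated_distillation_round(client_logits, temperature=1.0):
    """Server step of federated distillation (illustrative sketch).

    `client_logits` has shape (num_clients, num_samples, num_classes):
    each client's logits on a shared public dataset. The server averages
    them and converts the result to soft labels that clients then distill
    from locally, instead of exchanging model weights as in FedAvg.
    """
    avg_logits = np.mean(client_logits, axis=0)     # (num_samples, num_classes)
    z = avg_logits / temperature
    z = z - z.max(axis=1, keepdims=True)            # numerical stability
    exp_z = np.exp(z)
    soft_labels = exp_z / exp_z.sum(axis=1, keepdims=True)
    return soft_labels
```

Since only `num_samples * num_classes` numbers cross the network per round, the cost is independent of model size, which is where the communication saving comes from.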

Federated Learning · Privacy Preserving · +1

Hierarchical Federated Learning with Quantization: Convergence Analysis and System Design

no code implementations · 26 Mar 2021 · Lumin Liu, Jun Zhang, Shenghui Song, Khaled B. Letaief

Hierarchical FL, with a client-edge-cloud aggregation hierarchy, can effectively leverage both the cloud server's access to many clients' data and the edge servers' closeness to the clients to achieve a high communication efficiency.
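On top of the hierarchy, the paper studies quantizing model updates before transmission. A common building block for such analyses is unbiased stochastic quantization, sketched below; `num_bits` and the min-max grid are illustrative choices, not necessarily the scheme analyzed in the paper.

```python
import numpy as np

def stochastic_quantize(x, num_bits=4, rng=None):
    """Unbiased stochastic quantization of a model update (sketch).

    Values are mapped onto a uniform grid of 2**num_bits levels spanning
    [x.min(), x.max()], then randomly rounded up or down with probabilities
    chosen so the quantized value equals x in expectation. Only the grid
    index and two floats need to be transmitted, cutting communication cost.
    """
    rng = rng or np.random.default_rng(0)
    levels = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    t = (x - lo) / scale
    floor = np.floor(t)
    # Randomized rounding: round up with probability equal to the fraction.
    q = floor + (rng.random(x.shape) < (t - floor))
    return lo + q * scale
```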

Federated Learning Quantization

Client-Edge-Cloud Hierarchical Federated Learning

1 code implementation · 16 May 2019 · Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief

To combine their advantages, we propose a client-edge-cloud hierarchical Federated Learning system, supported by the HierFAVG algorithm, which allows multiple edge servers to perform partial model aggregation.
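The partial-aggregation idea can be sketched as two levels of FedAvg-style weighted averaging: each edge server averages its own clients' models, and the cloud then averages the edge models. This is a simplified single-round sketch (the actual HierFAVG interleaves multiple local steps per edge round and multiple edge rounds per cloud round); variable names are illustrative.

```python
import numpy as np

def hierfavg_round(client_weights, client_to_edge, client_sizes):
    """One cloud round of hierarchical (client-edge-cloud) aggregation.

    client_weights: dict client_id -> model parameter array
    client_to_edge: dict client_id -> edge server id
    client_sizes:   dict client_id -> number of local samples
    Weighting is by local sample count, as in FedAvg.
    """
    # Edge level: each edge server partially aggregates its own clients.
    edge_sums, edge_sizes = {}, {}
    for cid, w in client_weights.items():
        e = client_to_edge[cid]
        n = client_sizes[cid]
        edge_sums[e] = edge_sums.get(e, 0.0) + n * w
        edge_sizes[e] = edge_sizes.get(e, 0) + n
    edge_models = {e: s / edge_sizes[e] for e, s in edge_sums.items()}

    # Cloud level: aggregate the edge models, weighted by edge data size.
    total = sum(edge_sizes.values())
    return sum(edge_sizes[e] * edge_models[e] for e in edge_models) / total
```

With sample-count weighting at both levels, the result matches flat FedAvg over all clients, while only edge-to-cloud traffic touches the WAN.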

Federated Learning
