Search Results for author: Duo Liu

Found 6 papers, 4 papers with code

Flexible Clustered Federated Learning for Client-Level Data Distribution Shift

1 code implementation • 22 Aug 2021 • Moming Duan, Duo Liu, Xinyuan Ji, Yu Wu, Liang Liang, Xianzhang Chen, Yujuan Tan

Federated Learning (FL) enables multiple participating devices to collaboratively contribute to a global neural network model while keeping their training data local.

Federated Learning
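As context for the FL setting described in this entry, here is a minimal federated-averaging sketch in Python: each client trains locally on data that never leaves the device, and the server only averages the returned weights. The linear-model local step, learning rate, and weighting by sample count are illustrative assumptions, not FlexCFL's actual algorithm.

```python
# Minimal federated-averaging sketch (generic FedAvg-style round, not FlexCFL itself):
# clients train on their own local data and share only model weights,
# which the server averages weighted by local dataset size.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """One client's local training step for a linear model (illustrative only)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_w, client_data):
    """Aggregate client models, weighted by local sample counts."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, dtype=float))

# Toy usage: three clients, each keeping its data local.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(5)
for _ in range(10):
    w = federated_round(w, clients)
```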

CSAFL: A Clustered Semi-Asynchronous Federated Learning Framework

no code implementations • 16 Apr 2021 • Yu Zhang, Moming Duan, Duo Liu, Li Li, Ao Ren, Xianzhang Chen, Yujuan Tan, Chengliang Wang

Asynchronous FL has a natural advantage in mitigating the straggler effect, but it risks model quality degradation and server crashes.

Federated Learning
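To make the straggler point above concrete, here is a small sketch of one common asynchronous-FL device: staleness-weighted server updates, where late-arriving client models are blended in with a smaller weight. This is an assumed generic mechanism shown for illustration only, not CSAFL's clustered semi-asynchronous scheme.

```python
# Generic staleness-weighted asynchronous aggregation sketch (an assumed
# illustration of how asynchronous FL tolerates stragglers; NOT CSAFL's method).
import numpy as np

def async_server_update(global_w, client_w, client_round, server_round, base_mix=0.5):
    """Blend a late-arriving client model into the global model,
    down-weighting it by how stale its base round is."""
    staleness = server_round - client_round
    alpha = base_mix / (1.0 + staleness)        # older updates count less
    return (1.0 - alpha) * global_w + alpha * client_w

# Toy usage: a fresh update (staleness 0) moves the model more than a stale one.
g = np.zeros(4)
fresh = async_server_update(g, np.ones(4), client_round=10, server_round=10)
stale = async_server_update(g, np.ones(4), client_round=6,  server_round=10)
```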

FedGroup: Efficient Clustered Federated Learning via Decomposed Data-Driven Measure

2 code implementations • 14 Oct 2020 • Moming Duan, Duo Liu, Xinyuan Ji, Renping Liu, Liang Liang, Xianzhang Chen, Yujuan Tan

In this paper, we propose FedGroup, a novel clustered federated learning (CFL) framework, in which we 1) group the training of clients based on the similarities between the clients' optimization directions to achieve high training performance; and 2) construct a new data-driven distance measure to improve the efficiency of the client clustering procedure.

Clustering, Federated Learning
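The grouping idea in the FedGroup abstract, clustering clients whose optimization directions are similar, can be sketched as below; the cosine-distance measure and greedy grouping threshold here are simplifying assumptions, not the paper's decomposed data-driven measure.

```python
# Rough sketch of grouping clients by the similarity of their update directions
# (the general CFL idea behind FedGroup; the paper's decomposed data-driven
# measure is not reproduced here).
import numpy as np

def cosine_distance_matrix(updates):
    """Pairwise cosine distance between clients' flattened model updates."""
    U = np.stack([u / (np.linalg.norm(u) + 1e-12) for u in updates])
    return 1.0 - U @ U.T

def greedy_cluster(dist, threshold=0.5):
    """Assign each client to the first group whose representative is within
    `threshold`; otherwise start a new group."""
    groups = []                      # each group is a list of client indices
    for i in range(len(dist)):
        for g in groups:
            if dist[i, g[0]] < threshold:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

# Toy usage: updates along two distinct directions fall into two groups.
rng = np.random.default_rng(1)
base_a, base_b = rng.normal(size=8), rng.normal(size=8)
updates = [base_a + 0.01 * rng.normal(size=8) for _ in range(3)] + \
          [base_b + 0.01 * rng.normal(size=8) for _ in range(3)]
print(greedy_cluster(cosine_distance_matrix(updates)))
```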

A Generic Network Compression Framework for Sequential Recommender Systems

1 code implementation • 21 Apr 2020 • Yang Sun, Fajie Yuan, Min Yang, Guoao Wei, Zhou Zhao, Duo Liu

Current state-of-the-art sequential recommender models are typically based on a sandwich-structured deep neural network, where one or more middle (hidden) layers are placed between the input embedding layer and the output softmax layer.

Sequential Recommendation
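The sandwich structure described in this abstract (input embedding, middle hidden layers, output softmax) can be sketched as a small PyTorch module; the GRU middle block and layer sizes are illustrative assumptions, not the paper's compression framework.

```python
# Minimal "sandwich" sequential recommender skeleton: embedding -> hidden
# middle block -> softmax head. Sizes and the GRU choice are illustrative only.
import torch
import torch.nn as nn

class SandwichRecommender(nn.Module):
    def __init__(self, num_items, hidden=64, num_layers=2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, hidden)            # input embedding layer
        self.middle = nn.GRU(hidden, hidden, num_layers=num_layers,
                             batch_first=True)                     # middle (hidden) layers
        self.out = nn.Linear(hidden, num_items)                    # output softmax layer

    def forward(self, item_seq):
        h, _ = self.middle(self.item_emb(item_seq))
        return self.out(h[:, -1, :])          # logits over the next item

# Toy usage: batch of 4 sessions, each a sequence of 10 item ids.
model = SandwichRecommender(num_items=1000)
logits = model(torch.randint(0, 1000, (4, 10)))
probs = torch.softmax(logits, dim=-1)
```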

Astraea: Self-balancing Federated Learning for Improving Classification Accuracy of Mobile Deep Learning Applications

1 code implementation • 2 Jul 2019 • Moming Duan, Duo Liu, Xianzhang Chen, Yujuan Tan, Jinting Ren, Lei Qiao, Liang Liang

However, unlike common training datasets, the data distribution in edge computing systems is imbalanced, which introduces bias into model training and decreases the accuracy of federated learning applications.

Data Augmentation, Edge-computing +2
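As a rough illustration of the imbalance problem Astraea's abstract points to, the sketch below scores how far a client's label distribution is from uniform using KL divergence; this is a generic measure chosen for illustration, not necessarily the global imbalance metric used in the paper.

```python
# Simple illustration of quantifying label imbalance across a client's data:
# KL divergence of the class distribution from uniform (generic measure,
# not necessarily Astraea's internal metric).
import numpy as np

def imbalance_score(labels, num_classes):
    """KL(client class distribution || uniform); 0 means perfectly balanced."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    p = counts / counts.sum()
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] * num_classes)))

# Toy usage: a skewed client scores much higher than a balanced one.
balanced = np.repeat(np.arange(10), 50)                         # 50 samples per class
skewed = np.concatenate([np.zeros(450, int), np.arange(10).repeat(5)])
print(imbalance_score(balanced, 10), imbalance_score(skewed, 10))
```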
