Search Results for author: Juncheng Jia

Found 4 papers, 0 papers with code

Efficient Asynchronous Federated Learning with Sparsification and Quantization

no code implementations • 23 Dec 2023 • Juncheng Jia, Ji Liu, Chendi Zhou, Hao Tian, Mianxiong Dong, Dejing Dou

As the bandwidth between the devices and the server is relatively low, the communication of intermediate data becomes a bottleneck.

Federated Learning • Quantization
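
The snippet above only motivates the communication bottleneck; as a rough illustration of the two compression ideas named in the title, the minimal sketch below applies top-k sparsification and uniform 8-bit quantization to a local update before it is uploaded asynchronously. Both operators and all function names are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch only: top-k sparsification + uniform 8-bit quantization
# of a local model update; the paper's compression operators may differ.
import numpy as np

def sparsify_topk(update, k):
    """Keep only the k largest-magnitude entries of a flattened update."""
    flat = update.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def quantize_uniform(values, num_bits=8):
    """Uniformly quantize values to num_bits-bit integer codes (num_bits <= 8 here)."""
    vmin, vmax = float(values.min()), float(values.max())
    scale = (vmax - vmin) / (2 ** num_bits - 1)
    if scale == 0.0:
        scale = 1.0
    codes = np.round((values - vmin) / scale).astype(np.uint8)
    return codes, vmin, scale

def dequantize(codes, vmin, scale):
    """Server-side decoding of the quantized sparse values."""
    return codes.astype(np.float32) * scale + vmin

# Example: compress one local update before an asynchronous upload.
update = np.random.randn(10_000).astype(np.float32)
idx, vals = sparsify_topk(update, k=100)      # sparsification
codes, vmin, scale = quantize_uniform(vals)   # quantization
recovered = dequantize(codes, vmin, scale)    # what the server reconstructs
```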

Multi-Job Intelligent Scheduling with Cross-Device Federated Learning

no code implementations • 24 Nov 2022 • Ji Liu, Juncheng Jia, Beichen Ma, Chendi Zhou, Jingbo Zhou, Yang Zhou, Huaiyu Dai, Dejing Dou

The system model enables parallel training of multiple jobs, with a cost model based on data fairness and the training time of the diverse devices taking part in that parallel training.

Bayesian Optimization • Fairness • +2
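
As a rough illustration of the cost model described in the snippet above, the sketch below scores a candidate device set for one job by combining an estimated round time with a fairness penalty. The variance-based fairness measure, the weight `alpha`, and all function names are illustrative assumptions, not the paper's formulation.

```python
# Illustrative sketch only: a per-job scheduling cost trading off estimated
# round time against participation fairness; the actual cost model differs.
import numpy as np

def round_time(selected, est_time_per_device):
    """A synchronous round is dominated by the slowest selected device."""
    return max(est_time_per_device[d] for d in selected)

def fairness_penalty(selected, selection_counts):
    """Penalize uneven participation via the variance of selection counts."""
    counts = selection_counts.copy()
    for d in selected:
        counts[d] += 1
    return float(np.var(counts))

def scheduling_cost(selected, est_time_per_device, selection_counts, alpha=0.5):
    """Weighted sum of round time and fairness penalty (alpha is an assumption)."""
    return round_time(selected, est_time_per_device) \
        + alpha * fairness_penalty(selected, selection_counts)

# Example: compare two candidate device sets for one job.
times = {0: 3.0, 1: 5.0, 2: 2.5, 3: 8.0}
counts = np.zeros(4)
print(scheduling_cost([0, 2], times, counts))
print(scheduling_cost([1, 3], times, counts))
```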

FedDUAP: Federated Learning with Dynamic Update and Adaptive Pruning Using Shared Data on the Server

no code implementations • 25 Apr 2022 • Hong Zhang, Ji Liu, Juncheng Jia, Yang Zhou, Huaiyu Dai, Dejing Dou

Despite achieving remarkable performance, Federated Learning (FL) suffers from two critical challenges, i.e., limited computational resources and low training efficiency.

Federated Learning
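
The title names two server-side components, a dynamic update using shared data held by the server and adaptive pruning; the minimal sketch below illustrates both with a plain linear model. The SGD refinement step and the magnitude-pruning criterion are illustrative assumptions, not FedDUAP's actual algorithm.

```python
# Illustrative sketch only: (1) refine the aggregated global model with a few
# gradient steps on a small shared dataset on the server, (2) prune the
# smallest-magnitude weights. FedDUAP's real update/pruning rules may differ.
import numpy as np

def server_update_on_shared_data(w, X_shared, y_shared, lr=0.01, steps=5):
    """A few least-squares gradient steps on the server's shared data."""
    for _ in range(steps):
        grad = X_shared.T @ (X_shared @ w - y_shared) / len(y_shared)
        w = w - lr * grad
    return w

def magnitude_prune(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of the weights."""
    k = int(sparsity * w.size)
    if k == 0:
        return w
    threshold = np.sort(np.abs(w))[k - 1]
    return np.where(np.abs(w) > threshold, w, 0.0)

# Example: refine, then prune, an aggregated global model.
rng = np.random.default_rng(0)
w_global = rng.normal(size=20)
X_s, y_s = rng.normal(size=(32, 20)), rng.normal(size=32)
w_global = server_update_on_shared_data(w_global, X_s, y_s)
w_global = magnitude_prune(w_global, sparsity=0.5)
```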
