Search Results for author: Laizhong Cui

Found 8 papers, 1 paper with code

Fed-CVLC: Compressing Federated Learning Communications with Variable-Length Codes

no code implementations · 6 Feb 2024 · Xiaoxin Su, Yipeng Zhou, Laizhong Cui, John C. S. Lui, Jiangchuan Liu

In the Federated Learning (FL) paradigm, a parameter server (PS) concurrently communicates with distributed participating clients for model collection, update aggregation, and model distribution over multiple rounds, without touching private data owned by individual clients.

Federated Learning · Model Compression +1
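The paper lists no public implementation. As a rough illustration of the general idea behind variable-length coding of model updates (frequent values get short codewords, so non-uniform updates compress below a fixed 32 bits per float), here is a minimal sketch that quantizes an update vector and Huffman-codes the levels. The uniform quantizer, level count, and synthetic Laplace updates are assumptions for illustration, not Fed-CVLC's actual scheme:

```python
import heapq
from collections import Counter

import numpy as np

def huffman_code(symbols):
    """Build a Huffman code (variable-length, prefix-free) from symbol counts."""
    counts = Counter(symbols)
    if len(counts) == 1:  # degenerate case: a single distinct symbol
        return {next(iter(counts)): "0"}
    # Each heap entry: (count, unique tie-breaker, {symbol: codeword-so-far})
    heap = [(c, i, {s: ""}) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        c1, _, t1 = heapq.heappop(heap)
        c2, i2, t2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in t1.items()}
        merged.update({s: "1" + w for s, w in t2.items()})
        heapq.heappush(heap, (c1 + c2, i2, merged))
    return heap[0][2]

def compress_update(update, n_levels=16):
    """Uniformly quantize a model update, then entropy-code the levels."""
    lo, hi = update.min(), update.max()
    levels = np.round((update - lo) / (hi - lo + 1e-12) * (n_levels - 1)).astype(int)
    code = huffman_code(levels.tolist())
    bitstream = "".join(code[v] for v in levels.tolist())
    return bitstream, code, (lo, hi)

update = np.random.laplace(scale=0.01, size=10_000)  # updates cluster near zero
bits, code, _ = compress_update(update)
print(f"{len(bits)} bits vs {update.size * 32} bits for float32")
```

Because model updates concentrate near zero, the quantization levels are heavily skewed, which is exactly the regime where a variable-length code beats a fixed-length one.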

DAGC: Data-Volume-Aware Adaptive Sparsification Gradient Compression for Distributed Machine Learning in Mobile Computing

no code implementations · 13 Nov 2023 · Rongwei Lu, Yutong Jiang, Yinan Mao, Chen Tang, Bin Chen, Laizhong Cui, Zhi Wang

Assigning varying compression ratios to workers with distinct data distributions and volumes is thus a promising solution.
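No code is released for DAGC either. The sketch below illustrates the underlying idea with Top-k sparsification, assigning each worker a keep-ratio that scales with its data volume under a shared global budget. The proportional rule and the Top-k operator are assumptions for illustration, not the paper's actual assignment:

```python
import numpy as np

def topk_sparsify(grad, ratio):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries."""
    k = max(1, int(ratio * grad.size))
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

def volume_aware_ratios(data_volumes, budget=0.05):
    """Split a global compression budget so data-rich workers keep more entries.
    This proportional rule is a placeholder, not DAGC's derived assignment."""
    volumes = np.asarray(data_volumes, dtype=float)
    return budget * volumes / volumes.mean()

volumes = [200, 1_000, 5_000]           # samples held by each worker
ratios = volume_aware_ratios(volumes)
grads = [np.random.randn(10_000) for _ in volumes]
sparse = [topk_sparsify(g, r) for g, r in zip(grads, ratios)]
for v, r, s in zip(volumes, ratios, sparse):
    print(f"volume={v:5d}  ratio={r:.4f}  nonzeros={np.count_nonzero(s)}")
```

The average keep-ratio across workers equals the budget, so total traffic is unchanged while data-rich workers transmit at higher fidelity.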

Boost Decentralized Federated Learning in Vehicular Networks by Diversifying Data Sources

no code implementations · 5 Sep 2022 · Dongyuan Su, Yipeng Zhou, Laizhong Cui

To boost the convergence of DFL, a vehicle tunes the aggregation weight of each data source by minimizing the KL divergence of its state vector, and the effectiveness of this strategy in diversifying data sources can be proved theoretically.

Federated Learning
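As a rough sketch of the weight-tuning idea (not the paper's algorithm), one can treat each data source's state vector as a label distribution and choose aggregation weights so that the weighted mixture is as close to uniform as possible, i.e. minimize a KL divergence. The uniform target and the SLSQP solver are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def kl(p, q):
    """KL divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def tune_weights(source_dists):
    """Pick aggregation weights so the mixed label distribution is as close
    to uniform as possible, i.e. the data sources are maximally diversified."""
    sources = np.asarray(source_dists, float)      # (n_sources, n_classes)
    uniform = np.full(sources.shape[1], 1.0 / sources.shape[1])
    n = sources.shape[0]
    res = minimize(
        lambda w: kl(w @ sources, uniform),
        x0=np.full(n, 1.0 / n),
        bounds=[(0.0, 1.0)] * n,
        constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
        method="SLSQP",
    )
    return res.x

# Three neighbours with skewed label distributions over 4 classes.
dists = [[0.7, 0.1, 0.1, 0.1],
         [0.1, 0.7, 0.1, 0.1],
         [0.1, 0.1, 0.4, 0.4]]
print(np.round(tune_weights(dists), 3))
```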

A Fast Blockchain-based Federated Learning Framework with Compressed Communications

no code implementations · 12 Aug 2022 · Laizhong Cui, Xiaoxin Su, Yipeng Zhou

Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because the training process is auditable and the architecture is serverless, avoiding the single point of failure of the parameter server in vanilla federated learning (VFL).

Federated Learning
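A toy sketch of why a hash-chained ledger makes training auditable: each block records digests of the client updates aggregated in a round, so tampering with any recorded update breaks every later hash. This is a plain illustration of the auditability property only, not the paper's BFL protocol (no consensus, no incentives):

```python
import hashlib
import json
import time

def block(prev_hash, round_id, update_digests):
    """A toy ledger block recording which client updates entered a round."""
    body = {
        "prev": prev_hash,
        "round": round_id,
        "updates": sorted(update_digests),
        "time": time.time(),
    }
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [block("0" * 64, 0, [])]
for rnd in range(1, 4):
    # hypothetical client update payloads for this round
    updates = [f"client{c}-round{rnd}".encode() for c in range(3)]
    digests = [hashlib.sha256(u).hexdigest() for u in updates]
    chain.append(block(chain[-1]["hash"], rnd, digests))

# Auditing: verify that each block points at the hash of its predecessor.
print(all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain))))
```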

Magic ELF: Image Deraining Meets Association Learning and Transformer

1 code implementation · 21 Jul 2022 · Kui Jiang, Zhongyuan Wang, Chen Chen, Zheng Wang, Laizhong Cui, Chia-Wen Lin

Convolutional neural networks (CNNs) and Transformers have achieved great success in multimedia applications.

Rain Removal

Optimal Rate Adaption in Federated Learning with Compressed Communications

no code implementations · 13 Dec 2021 · Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Jiangchuan Liu

Federated Learning (FL) incurs high communication overhead, which can be greatly alleviated by compressing model updates.

Federated Learning
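The paper optimizes the compression rate per round; as a hedged stand-in for that idea, the sketch below splits a fixed communication budget across rounds with a simple geometric schedule and converts each round's bit allowance into a Top-k sparsification level. The decay rule and the per-coordinate cost model are assumptions, not the paper's derived optimum:

```python
import numpy as np

def rate_schedule(total_budget_bits, n_rounds, dim, decay=0.7):
    """Split a total communication budget across rounds with a geometric
    schedule: spend more bits early, when updates are large, fewer later.
    This heuristic decay is an illustration, not the paper's optimal rates."""
    weights = decay ** np.arange(n_rounds)
    bits_per_round = total_budget_bits * weights / weights.sum()
    # Each kept coordinate costs ~32 bits for the value plus an index.
    cost = 32 + int(np.ceil(np.log2(dim)))
    return np.maximum(1, (bits_per_round / cost).astype(int))

dim, n_rounds = 100_000, 10
ks = rate_schedule(total_budget_bits=5e6, n_rounds=n_rounds, dim=dim)
for r, k in enumerate(ks):
    print(f"round {r}: keep top-{k} of {dim} coordinates "
          f"({k / dim:.2%} keep-ratio)")
```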

Slashing Communication Traffic in Federated Learning by Transmitting Clustered Model Updates

no code implementations · 10 May 2021 · Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Yi Pan

We further propose the boosted MUCSC (B-MUCSC) algorithm, a biased compression algorithm that achieves an extremely high compression rate by grouping insignificant model updates into a super cluster.

Federated Learning
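To make the clustered-update idea concrete, here is a minimal sketch (assumptions for illustration: k-means on scalar update values via scikit-learn, and a magnitude threshold to form the zero-valued "super cluster" of insignificant updates; this is not the authors' MUCSC/B-MUCSC construction). Each entry is replaced by a small cluster id, so only centroids plus ids are transmitted:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_compress(update, n_clusters=8, super_threshold=0.01):
    """Compress a model update by clustering its values and sending only
    centroids plus per-entry cluster ids. Entries below `super_threshold`
    in magnitude all map to one zero-valued 'super cluster'."""
    small = np.abs(update) < super_threshold
    ids = np.zeros(update.size, dtype=np.uint8)          # 0 = super cluster
    centroids = [0.0]
    if (~small).any():
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        labels = km.fit_predict(update[~small].reshape(-1, 1))
        ids[~small] = labels + 1                          # shift past cluster 0
        centroids += km.cluster_centers_.ravel().tolist()
    return ids, np.asarray(centroids)

def decompress(ids, centroids):
    return centroids[ids]

update = np.random.laplace(scale=0.02, size=50_000)
ids, centroids = cluster_compress(update)
restored = decompress(ids, centroids)
ratio = (ids.size * 8 + centroids.size * 32) / (update.size * 32)
print(f"compressed to {ratio:.1%} of original size, "
      f"mse={np.mean((update - restored) ** 2):.2e}")
```

Sending the super cluster as a single zero centroid is what makes the scheme biased: insignificant updates are deliberately flattened in exchange for a much higher compression rate.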
