Search Results for author: Xiaoxin Su

Found 5 papers, 0 papers with code

Fed-CVLC: Compressing Federated Learning Communications with Variable-Length Codes

no code implementations • 6 Feb 2024 • Xiaoxin Su, Yipeng Zhou, Laizhong Cui, John C. S. Lui, Jiangchuan Liu

In the Federated Learning (FL) paradigm, a parameter server (PS) concurrently communicates with distributed participating clients for model collection, update aggregation, and model distribution over multiple rounds, without touching the private data owned by individual clients.

Federated Learning • Model Compression • +1
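The abstract above outlines the standard FL exchange between the parameter server and clients. As a minimal sketch of that round structure only (not Fed-CVLC's variable-length coding), the snippet below shows model distribution, local updates, and weighted aggregation; the `local_update` helper, the least-squares objective, and all constants are illustrative assumptions.

```python
import numpy as np

def local_update(model, x, y, lr=0.1):
    # One gradient step on a least-squares objective (a stand-in for local training).
    grad = x.T @ (x @ model - y) / len(y)
    return -lr * grad  # the model update each client sends back to the PS

def fedavg_round(global_model, clients, lr=0.1):
    # Parameter server side: distribute the model, collect updates, aggregate.
    updates, sizes = [], []
    for x, y in clients:                      # model distribution to each client
        updates.append(local_update(global_model, x, y, lr))
        sizes.append(len(y))                  # weight clients by local data size
    weights = np.asarray(sizes, dtype=float) / sum(sizes)
    return global_model + sum(w * u for w, u in zip(weights, updates))

# Toy usage: a few communication rounds over synthetic clients.
rng = np.random.default_rng(0)
model = np.zeros(3)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
for _ in range(5):
    model = fedavg_round(model, clients)
```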

A Fast Blockchain-based Federated Learning Framework with Compressed Communications

no code implementations • 12 Aug 2022 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou

Recently, blockchain-based federated learning (BFL) has attracted intensive research attention because its training process is auditable and its architecture is serverless, avoiding the single point of failure of the parameter server in vanilla federated learning (VFL).

Federated Learning
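The entry above highlights that a blockchain ledger makes the training history auditable. Purely as an illustration of that property (not the paper's BFL framework or its compressed communications), here is a hash-chained log of per-round aggregated updates; the block fields and the `ledger_append` helper are hypothetical.

```python
import hashlib, json
import numpy as np

def ledger_append(ledger, round_id, aggregated_update):
    # Record one round as a block chained to the previous block's hash,
    # so altering any earlier update invalidates every later block.
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    block = {
        "round": round_id,
        "update_digest": hashlib.sha256(np.asarray(aggregated_update).tobytes()).hexdigest(),
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return ledger + [block]

# Toy usage: log two rounds and check that the chain links up.
ledger = ledger_append([], 0, np.array([0.1, -0.2]))
ledger = ledger_append(ledger, 1, np.array([0.05, -0.1]))
assert ledger[1]["prev_hash"] == ledger[0]["hash"]
```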

Optimal Rate Adaption in Federated Learning with Compressed Communications

no code implementations • 13 Dec 2021 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Jiangchuan Liu

Federated Learning (FL) incurs high communication overhead, which can be greatly alleviated by compressing model updates.

Federated Learning
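The excerpt above notes that compressing model updates cuts FL's communication overhead. As a generic baseline for such compression (top-k sparsification, not this paper's rate-adaptation method), the sketch below transmits only the largest-magnitude entries of an update; the function names and the choice of k are assumptions.

```python
import numpy as np

def topk_compress(update, k):
    # Keep only the k largest-magnitude entries; send their indices and values.
    idx = np.argsort(np.abs(update))[-k:]
    return idx, update[idx]

def topk_decompress(idx, values, dim):
    # Rebuild a full-size update with zeros in the dropped positions.
    full = np.zeros(dim)
    full[idx] = values
    return full

update = np.random.default_rng(1).normal(size=1000)
idx, vals = topk_compress(update, k=50)              # ~95% fewer entries sent
restored = topk_decompress(idx, vals, update.size)
```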

Slashing Communication Traffic in Federated Learning by Transmitting Clustered Model Updates

no code implementations • 10 May 2021 • Laizhong Cui, Xiaoxin Su, Yipeng Zhou, Yi Pan

We further propose boosted MUCSC (B-MUCSC), a biased compression algorithm that achieves an extremely high compression rate by grouping insignificant model updates into a super cluster.

Federated Learning
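B-MUCSC is described above as grouping insignificant model updates into a super cluster. The following sketch is only one plausible reading of cluster-based update compression (not the published MUCSC/B-MUCSC algorithm): near-zero entries share a single "super cluster" centroid, the remaining entries are bucketed by quantiles, and only cluster ids plus centroids would be transmitted; the threshold, bucketing rule, and helper names are assumptions.

```python
import numpy as np

def cluster_compress(update, threshold=0.05, n_clusters=8):
    # Entries with |u| < threshold form one "super cluster" (label 0);
    # the rest are bucketed by quantiles and represented by bucket centroids.
    small = np.abs(update) < threshold
    labels = np.zeros(update.size, dtype=int)
    centroids = [update[small].mean() if small.any() else 0.0]
    rest = update[~small]
    if rest.size:
        edges = np.quantile(rest, np.linspace(0, 1, n_clusters + 1)[1:-1])
        ids = np.digitize(rest, edges) + 1          # cluster ids 1..n_clusters
        labels[~small] = ids
        centroids += [rest[ids == c].mean() if (ids == c).any() else 0.0
                      for c in range(1, n_clusters + 1)]
    return labels, np.asarray(centroids)            # what would be transmitted

def cluster_decompress(labels, centroids):
    # Each entry is reconstructed as its cluster's centroid.
    return centroids[labels]

update = np.random.default_rng(2).normal(size=1000) * 0.2
labels, centroids = cluster_compress(update)
approx = cluster_decompress(labels, centroids)
```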
