Search Results for author: Sheng Sun

Found 13 papers, 7 papers with code

Privacy-Preserving Training-as-a-Service for On-Device Intelligence: Concept, Architectural Scheme, and Open Problems

no code implementations16 Apr 2024 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Tianliu He, Wen Wang

On-device intelligence (ODI) enables artificial intelligence (AI) applications to run on end devices, providing real-time and customized AI services without relying on remote servers.

Federated Learning Privacy Preserving +1

Personalized Federated Learning for Spatio-Temporal Forecasting: A Dual Semantic Alignment-Based Contrastive Approach

no code implementations4 Apr 2024 Qingxiang Liu, Sheng Sun, Yuxuan Liang, Jingjing Xue, Min Liu

From the spatial perspective, we design lightweight yet efficient prototypes as client-level semantic representations; based on these, the server evaluates spatial similarity and yields client-customized global prototypes for the supplementary inter-client contrastive task.

Contrastive Learning Personalized Federated Learning +3
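
The prototype exchange described in this excerpt can be pictured with a small sketch. The code below is illustrative only: it assumes prototypes are mean feature embeddings and that spatial similarity is cosine similarity with softmax weighting; the function names are hypothetical, not the paper's implementation.

```python
# Hypothetical sketch of the server-side step: each client uploads a
# mean-embedding "prototype"; the server weights all prototypes by
# cosine similarity to build a client-customized global prototype
# for the inter-client contrastive task.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def customized_global_prototypes(client_prototypes):
    """client_prototypes: list of (d,) arrays, one per client."""
    protos = np.stack(client_prototypes)  # (n_clients, d)
    n = len(protos)
    customized = []
    for i in range(n):
        # spatial similarity between client i and every client
        sims = np.array([cosine(protos[i], protos[j]) for j in range(n)])
        weights = np.exp(sims) / np.exp(sims).sum()  # softmax weighting
        customized.append(weights @ protos)          # (d,) global prototype
    return customized

# toy usage: 3 clients, 4-dim embeddings
rng = np.random.default_rng(0)
print(customized_global_prototypes([rng.normal(size=4) for _ in range(3)])[0])
```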

Logits Poisoning Attack in Federated Distillation

no code implementations8 Jan 2024 Yuhan Tang, Zhiyuan Wu, Bo Gao, Tian Wen, Yuwei Wang, Sheng Sun

Federated Distillation (FD) is a promising distributed machine learning paradigm in which knowledge distillation is leveraged to enable more efficient and flexible cross-device knowledge transfer in federated learning.

Federated Learning Knowledge Distillation +1
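
To make the FD setting concrete, here is a toy sketch of per-class logit aggregation with one corrupted upload. The poison function is a crude stand-in for illustration, not the attack studied in the paper.

```python
# Illustrative only: in FD, clients upload per-class average logits;
# a malicious client can corrupt its upload so that the distilled
# knowledge points toward wrong classes.
import numpy as np

def server_aggregate(logit_uploads):
    """Average the clients' (n_classes, n_classes) per-class logit tables."""
    return np.mean(np.stack(logit_uploads), axis=0)

def poison(logits, rng):
    # amplify and permute the class rows -- a crude stand-in for a
    # logit-poisoning attack, not the paper's actual method
    perm = rng.permutation(logits.shape[0])
    return 4.0 * logits[perm]

rng = np.random.default_rng(0)
honest = [np.eye(5) * 5.0 + rng.normal(scale=0.1, size=(5, 5)) for _ in range(3)]
uploads = honest + [poison(honest[0], rng)]
print(server_aggregate(honest).argmax(axis=1))   # clean: [0 1 2 3 4]
print(server_aggregate(uploads).argmax(axis=1))  # several rows point at wrong classes
```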

Federated Class-Incremental Learning with New-Class Augmented Self-Distillation

2 code implementations1 Jan 2024 Zhiyuan Wu, Tianliu He, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Xuefeng Jiang

Federated Learning (FL) enables collaborative model training among participants while guaranteeing the privacy of raw data.

Class Incremental Learning Federated Learning +2

Improving Communication Efficiency of Federated Distillation via Accumulating Local Updates

1 code implementation7 Dec 2023 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Tian Wen, Wen Wang

ALU drastically decreases the frequency of communication in federated distillation, thereby significantly reducing the communication overhead during the training process.

Federated Learning
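
A minimal sketch of the accumulate-then-upload idea behind ALU, under the assumption that the exchanged knowledge is a vector of soft logits; all names here are illustrative.

```python
# Instead of exchanging knowledge every local round, a client
# accumulates it over `accum_rounds` rounds and uploads a single
# averaged message, cutting communication frequency by that factor.
import numpy as np

def local_training_round(rng, n_classes=10):
    # stand-in for one round of local training producing soft logits
    return rng.normal(size=(n_classes,))

def accumulated_upload(accum_rounds, rng):
    buffer = [local_training_round(rng) for _ in range(accum_rounds)]
    # one upload replaces `accum_rounds` separate uploads
    return np.mean(buffer, axis=0)

rng = np.random.default_rng(0)
msg = accumulated_upload(accum_rounds=8, rng=rng)
print(msg.shape)  # a single (10,) message for 8 local rounds
```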

Agglomerative Federated Learning: Empowering Larger Model Training via End-Edge-Cloud Collaboration

1 code implementation1 Dec 2023 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Bo Gao, Quyang Pan, Tianliu He, Xuefeng Jiang

Federated Learning (FL) enables training Artificial Intelligence (AI) models over end devices without compromising their privacy.

Federated Learning

Federated Skewed Label Learning with Logits Fusion

no code implementations14 Nov 2023 Yuwei Wang, Runhan Li, Hao Tan, Xuefeng Jiang, Sheng Sun, Min Liu, Bo Gao, Zhiyuan Wu

By fusing the logits of the two models, the private weak learner can capture the variance across different data, regardless of category.

Federated Learning
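
A hypothetical sketch of the fusion step, assuming a simple convex combination of the two models' logits; the paper's actual fusion rule may differ.

```python
# Combine the logits of the locally trained model and a second
# (e.g. globally shared) model so the weak learner sees corrected
# class evidence under skewed labels. The weighting is an assumption.
import numpy as np

def fuse_logits(local_logits, global_logits, alpha=0.5):
    """Convex combination of the two models' logits."""
    return alpha * local_logits + (1.0 - alpha) * global_logits

rng = np.random.default_rng(0)
local = rng.normal(size=(4, 10))   # batch of 4 samples, 10 classes
glob = rng.normal(size=(4, 10))
fused = fuse_logits(local, glob, alpha=0.7)
# the fused logits would then supervise local training (e.g. via a KL loss)
print(fused.argmax(axis=1))
```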

FedBIAD: Communication-Efficient and Accuracy-Guaranteed Federated Learning with Bayesian Inference-Based Adaptive Dropout

no code implementations14 Jul 2023 Jingjing Xue, Min Liu, Sheng Sun, Yuwei Wang, Hui Jiang, Xuefeng Jiang

In this paper, we propose Federated learning with Bayesian Inference-based Adaptive Dropout (FedBIAD), which regards weight rows of local models as probability distributions and adaptively drops partial weight rows based on importance indicators correlated with the trend of local training loss.

Bayesian Inference Federated Learning +1
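
A minimal sketch of row-wise adaptive upload, using row update magnitude as a stand-in importance indicator; FedBIAD's actual indicator, correlated with the trend of the local training loss, is not reproduced here.

```python
# Score each weight row by an importance indicator and upload only
# the top-scoring rows, shrinking uplink traffic.
import numpy as np

def select_rows(weight, prev_weight, keep_ratio=0.5):
    importance = np.linalg.norm(weight - prev_weight, axis=1)  # per-row change
    k = max(1, int(keep_ratio * weight.shape[0]))
    kept = np.argsort(importance)[-k:]  # indices of the most important rows
    return kept, weight[kept]

rng = np.random.default_rng(0)
w_prev = rng.normal(size=(8, 16))
w_new = w_prev + rng.normal(scale=0.01, size=(8, 16))
w_new[2] += 1.0                       # one row changed substantially
idx, payload = select_rows(w_new, w_prev, keep_ratio=0.25)
print(idx, payload.shape)             # only 2 of 8 rows are uploaded
```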

Online Spatio-Temporal Correlation-Based Federated Learning for Traffic Flow Forecasting

no code implementations17 Feb 2023 Qingxiang Liu, Sheng Sun, Min Liu, Yuwei Wang, Bo Gao

In this paper, we perform the first study of traffic flow forecasting in an Online Learning (OL) manner within the FL framework, and propose a novel prediction method named Online Spatio-Temporal Correlation-based Federated Learning (FedOSTC), which aims to guarantee performance gains regardless of traffic fluctuations.

Federated Learning Graph Attention
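
A toy sketch of the online-learning flavor inside an FL loop, assuming a linear predictor updated by per-observation SGD and plain FedAvg aggregation; FedOSTC's graph-attention machinery is omitted entirely.

```python
# Each client fits a linear traffic predictor step-by-step as new
# observations stream in, and the server averages the client models
# (a standard FedAvg step). All details are illustrative.
import numpy as np

def online_step(w, x, y, lr=0.05):
    pred = w @ x
    return w - lr * (pred - y) * x  # squared-loss SGD update

def federated_round(client_ws):
    return np.mean(np.stack(client_ws), axis=0)  # FedAvg aggregation

rng = np.random.default_rng(0)
true_w = rng.normal(size=3)
ws = [np.zeros(3) for _ in range(4)]
for _ in range(200):                 # streaming observations
    for i in range(4):
        x = rng.normal(size=3)
        ws[i] = online_step(ws[i], x, true_w @ x + rng.normal(scale=0.01))
w_global = federated_round(ws)
print(np.round(w_global - true_w, 2))  # near-zero error
```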

Knowledge Distillation in Federated Edge Learning: A Survey

1 code implementation14 Jan 2023 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Xuefeng Jiang, Runhan Li, Bo Gao

The increasing demand for intelligent services and privacy protection of mobile and Internet of Things (IoT) devices motivates the wide application of Federated Edge Learning (FEL), in which devices collaboratively train on-device Machine Learning (ML) models without sharing their private data.

Knowledge Distillation

FedICT: Federated Multi-task Distillation for Multi-access Edge Computing

1 code implementation1 Jan 2023 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xuefeng Jiang, Bo Gao

Federated Multi-task Learning (FMTL) trains related but personalized ML models for different devices; however, previous works suffer from excessive communication overhead during training and neglect the model heterogeneity among devices in MEC.

Edge-computing Federated Learning +2

Towards Federated Learning against Noisy Labels via Local Self-Regularization

1 code implementation25 Aug 2022 Xuefeng Jiang, Sheng Sun, Yuwei Wang, Min Liu

Federated learning (FL) aims to learn joint knowledge from a large number of decentralized devices holding labeled data, in a privacy-preserving manner.

Federated Learning Privacy Preserving

Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation

2 code implementations14 Apr 2022 Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Junbo Zhang, Zeju Li, Qingxiang Liu

Federated distillation (FD) is proposed to address both problems simultaneously: it exchanges knowledge between the server and clients, supporting heterogeneous local models while significantly reducing communication overhead.

Federated Learning Privacy Preserving
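
A minimal sketch of proxy-data-free knowledge exchange, assuming each client summarizes its local knowledge as per-class mean logits computed on its own data; this is purely illustrative and does not reproduce the paper's knowledge-congruence mechanism.

```python
# Clients summarize local knowledge as per-class average logits on
# their *own* data (no shared proxy set); the server averages these
# summaries into global per-class soft knowledge.
import numpy as np

def client_knowledge(logits, labels, n_classes):
    """Per-class mean logits computed on the client's private data."""
    table = np.zeros((n_classes, n_classes))
    for c in range(n_classes):
        mask = labels == c
        if mask.any():
            table[c] = logits[mask].mean(axis=0)
    return table

def server_aggregate(tables):
    return np.mean(np.stack(tables), axis=0)  # global soft knowledge

rng = np.random.default_rng(0)
tables = []
for _ in range(3):                            # 3 heterogeneous clients
    labels = rng.integers(0, 5, size=100)
    logits = np.eye(5)[labels] * 4 + rng.normal(size=(100, 5))
    tables.append(client_knowledge(logits, labels, n_classes=5))
print(server_aggregate(tables).argmax(axis=1))  # ~ [0 1 2 3 4]
```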
