Search Results for author: Yuchang Sun

Found 12 papers, 1 paper with code

How to Collaborate: Towards Maximizing the Generalization Performance in Cross-Silo Federated Learning

no code implementations • 24 Jan 2024 • Yuchang Sun, Marios Kountouris, Jun Zhang

We show that the generalization performance of a client can be improved only by collaborating with other clients that have more training data and a similar data distribution.

Federated Learning, Privacy Preserving

Feature Matching Data Synthesis for Non-IID Federated Learning

no code implementations • 9 Aug 2023 • Zijian Li, Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang

For better privacy preservation, we propose a hard feature augmentation method to transfer real features towards the decision boundary, with which the synthetic data not only improve the model generalization but also erase the information of real features.

Data Augmentation, Federated Learning +1
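
A minimal sketch of the "hard feature augmentation" idea described above, under loose assumptions: a real feature is interpolated toward the centroid of its nearest other class so the synthetic feature sits near the decision boundary and carries less of the original sample's detail. The function name, the centroid construction, and the mixing weight `lam` are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch only: push a real feature toward the decision boundary
# by interpolating it with the centroid of its nearest other class.
import numpy as np

def harden_feature(feat, label, class_centroids, lam=0.5):
    """Return a boundary-adjacent synthetic feature for one real feature.

    feat:            (d,) real feature vector extracted by the local model
    label:           int, ground-truth class of `feat`
    class_centroids: dict {class_id: (d,) mean feature of that class}
    lam:             mixing weight (assumed); larger values move the synthetic
                     feature further from the original sample
    """
    # Find the closest centroid belonging to a *different* class.
    other = {c: mu for c, mu in class_centroids.items() if c != label}
    nearest = min(other, key=lambda c: np.linalg.norm(feat - other[c]))
    # Interpolate toward that centroid so the result sits near the boundary.
    return (1.0 - lam) * feat + lam * other[nearest]
```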

MimiC: Combating Client Dropouts in Federated Learning by Mimicking Central Updates

1 code implementation • 21 Jun 2023 • Yuchang Sun, Yuyi Mao, Jun Zhang

Federated learning (FL) is a promising framework for privacy-preserving collaborative learning, where model training tasks are distributed to clients and only the model updates need to be collected at a server.

Federated Learning, Privacy Preserving

Channel and Gradient-Importance Aware Device Scheduling for Over-the-Air Federated Learning

no code implementations • 26 May 2023 • Yuchang Sun, Zehong Lin, Yuyi Mao, Shi Jin, Jun Zhang

In this paper, we propose a probabilistic device scheduling framework for over-the-air FL, named PO-FL, to mitigate the negative impact of channel noise, where each device is scheduled according to a certain probability and its model update is reweighted using this probability in aggregation.

Federated Learning, Privacy Preserving +1
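
A minimal sketch of the scheduling-and-reweighting step described above, assuming per-device probabilities are already given: each device participates with its assigned probability, and a scheduled device's update is scaled by the inverse of that probability so the aggregate stays unbiased in expectation. The probability assignment itself (the channel- and gradient-importance-aware part) is not modeled here.

```python
# Sketch of probabilistic device scheduling with inverse-probability
# reweighting; the uniform final averaging is an assumption for illustration.
import numpy as np

def probabilistic_aggregate(updates, probs, rng=None):
    """updates: list of (d,) model-update arrays, one per device
    probs:      list of scheduling probabilities in (0, 1]
    Returns an aggregated update that is unbiased in expectation.
    """
    rng = rng or np.random.default_rng()
    total = np.zeros_like(updates[0], dtype=float)
    for delta, p in zip(updates, probs):
        if rng.random() < p:              # device is scheduled this round
            total += delta / p            # inverse-probability reweighting
    return total / len(updates)           # average over all devices
```

The 1/p scaling is what keeps the expected aggregate equal to the full-participation average even when only a random subset of devices reports.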

DABS: Data-Agnostic Backdoor attack at the Server in Federated Learning

no code implementations • 2 May 2023 • Wenqiang Sun, Sen Li, Yuchang Sun, Jun Zhang

Federated learning (FL) attempts to train a global model by aggregating local models from distributed devices under the coordination of a central server.

Backdoor Attack, Federated Learning

Stochastic Coded Federated Learning: Theoretical Analysis and Incentive Mechanism Design

no code implementations • 8 Nov 2022 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Songze Li, Jun Zhang

During training, the server computes gradients on the global coded dataset to compensate for the missing model updates of the straggling devices.

Federated Learning, Privacy Preserving
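
A minimal sketch of the compensation step described above, under stated assumptions: the server holds a coded dataset (shared by the devices before training) and, whenever some devices straggle, it computes a gradient on that dataset and counts it in place of the missing updates. The weighting rule and the `grad_fn` interface are illustrative assumptions rather than the paper's exact scheme.

```python
# Sketch of server-side straggler compensation with a coded dataset.
import numpy as np

def compensated_update(received, num_devices, coded_data, coded_labels,
                       model, grad_fn):
    """received:    list of update arrays from the devices that responded
    num_devices:    total number of devices scheduled this round
    coded_data/coded_labels: server-side coded dataset
    grad_fn:        callable(model, X, y) -> gradient with the update's shape
    """
    missing = num_devices - len(received)
    client_part = np.sum(received, axis=0)
    # Gradient on the coded dataset stands in for the missing device updates.
    coded_grad = grad_fn(model, coded_data, coded_labels)
    return (client_part + missing * coded_grad) / num_devices
```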

DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing

no code implementations • 6 Oct 2022 • Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang

Federated learning (FL) strives to enable collaborative training of machine learning models without centrally collecting clients' private data.

Federated Learning

Stochastic Coded Federated Learning with Convergence and Privacy Guarantees

no code implementations • 25 Jan 2022 • Yuchang Sun, Jiawei Shao, Songze Li, Yuyi Mao, Jun Zhang

Federated learning (FL) has attracted much attention as a privacy-preserving distributed machine learning framework, where many clients collaboratively train a machine learning model by exchanging model updates with a parameter server instead of sharing their raw data.

Federated Learning, Privacy Preserving

Semi-Decentralized Federated Edge Learning with Data and Device Heterogeneity

no code implementations • 20 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang

By exploiting the low-latency communication among edge servers for efficient model sharing, SD-FEEL can incorporate more training data, while enjoying much lower latency compared with conventional federated learning.

Federated Learning, Privacy Preserving
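
A minimal sketch of one semi-decentralized training round in the spirit of SD-FEEL: each edge server first averages the models of its associated clients, then edge servers mix their aggregates with neighboring servers over low-latency links. The cluster assignment and the doubly stochastic mixing matrix are illustrative assumptions.

```python
# Sketch of a two-level (client -> edge server -> neighboring servers) round.
import numpy as np

def sd_feel_round(client_models, clusters, mixing_matrix):
    """client_models: list of (d,) model arrays, one per client
    clusters:         list of client-index lists, one per edge server
    mixing_matrix:    (S, S) doubly stochastic matrix over edge servers
    Returns the per-edge-server models after intra-cluster aggregation and
    one inter-server mixing step.
    """
    # Step 1: each edge server averages the models of its own clients.
    edge_models = np.stack([
        np.mean([client_models[i] for i in cluster], axis=0)
        for cluster in clusters
    ])
    # Step 2: edge servers exchange and mix models over low-latency links.
    return mixing_matrix @ edge_models
```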

Asynchronous Semi-Decentralized Federated Edge Learning for Heterogeneous Clients

no code implementations • 9 Dec 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jun Zhang

Federated edge learning (FEEL) has drawn much attention as a privacy-preserving distributed learning framework for mobile edge networks.

Privacy Preserving

Semi-Decentralized Federated Edge Learning for Fast Convergence on Non-IID Data

no code implementations • 26 Apr 2021 • Yuchang Sun, Jiawei Shao, Yuyi Mao, Jessie Hui Wang, Jun Zhang

Federated edge learning (FEEL) has emerged as an effective approach to reduce the large communication latency in Cloud-based machine learning solutions, while preserving data privacy.

Federated Learning
