Search Results for author: Shusen Yang

Found 13 papers, 2 papers with code

FedLED: Label-Free Equipment Fault Diagnosis with Vertical Federated Transfer Learning

no code implementations • 29 Dec 2023 • Jie Shen, Shusen Yang, Cong Zhao, Xuebin Ren, Peng Zhao, Yuqian Yang, Qing Han, Shuaijun Wu

Intelligent equipment fault diagnosis based on Federated Transfer Learning (FTL) attracts considerable attention from both academia and industry.

Transfer Learning

Generative Model-based Feature Knowledge Distillation for Action Recognition

1 code implementation • 14 Dec 2023 • Guiqin Wang, Peng Zhao, Yanjiang Shi, Cong Zhao, Shusen Yang

Addressing this gap, our paper introduces an innovative knowledge distillation framework that uses a generative model to train a lightweight student model.

Action Detection • Action Recognition +3
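
As a rough illustration of the feature-level knowledge distillation idea in the entry above (the paper's exact framework differs and may involve generated teacher features), a student can be trained on a task loss plus a feature-matching term; the function, shapes, and weighting below are hypothetical.

```python
import torch
import torch.nn.functional as F

def feature_kd_loss(student_feat, teacher_feat, student_logits, labels, alpha=0.5):
    """Hypothetical feature-level distillation objective: task cross-entropy
    plus an MSE term pulling student features toward teacher features."""
    task_loss = F.cross_entropy(student_logits, labels)
    distill_loss = F.mse_loss(student_feat, teacher_feat.detach())
    return (1 - alpha) * task_loss + alpha * distill_loss

# Example shapes: 8 clips, 256-d features, 10 action classes.
loss = feature_kd_loss(torch.randn(8, 256), torch.randn(8, 256),
                       torch.randn(8, 10), torch.randint(0, 10, (8,)))
```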

HiFlash: Communication-Efficient Hierarchical Federated Learning with Adaptive Staleness Control and Heterogeneity-aware Client-Edge Association

no code implementations • 16 Jan 2023 • Qiong Wu, Xu Chen, Tao Ouyang, Zhi Zhou, Xiaoxi Zhang, Shusen Yang, Junshan Zhang

Federated learning (FL) is a promising paradigm that enables massive numbers of clients to collaboratively learn a shared model while keeping their training data local.

Edge-computing • Federated Learning

ACE: Towards Application-Centric Edge-Cloud Collaborative Intelligence

1 code implementation • 24 Mar 2022 • Luhui Wang, Cong Zhao, Shusen Yang, Xinyu Yang, Julie McCann

Intelligent applications based on machine learning are impacting many parts of our lives.

Management

Towards Efficient and Stable K-Asynchronous Federated Learning with Unbounded Stale Gradients on Non-IID Data

no code implementations • 2 Mar 2022 • ZiHao Zhou, Yanan Li, Xuebin Ren, Shusen Yang

Federated learning (FL) is an emerging privacy-preserving paradigm that enables multiple participants to collaboratively train a global model without uploading their raw data.

Federated Learning • Privacy Preserving
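
For the entry above, the following is a minimal, hypothetical sketch of staleness-aware asynchronous aggregation; the `1 / (1 + staleness)` decay rule and all names are illustrative, not the algorithm proposed in the paper.

```python
import numpy as np

def async_aggregate(global_weights, client_updates, staleness, base_lr=1.0):
    """Hypothetical staleness-aware aggregation: each client update is
    down-weighted by 1 / (1 + staleness) before being applied."""
    new_weights = np.copy(global_weights)
    for update, tau in zip(client_updates, staleness):
        new_weights += base_lr / (1.0 + tau) * update
    return new_weights
```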

Eigenvalues of the Laplace operator with potential under the backward Ricci flow on locally homogeneous 3-manifolds

no code implementations • 29 Jan 2021 • Songbo Hou, Shusen Yang

Let $\lambda(t)$ be the first eigenvalue of $-\Delta+aR\, (a>0)$ under the backward Ricci flow on locally homogeneous 3-manifolds, where $R$ is the scalar curvature.

Differential Geometry (53E20, 58C40)
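
In the notation of the abstract above, $\lambda(t)$ is the first eigenvalue of the Schrödinger-type operator $-\Delta + aR$ on the evolving manifold $(M, g(t))$, which can be stated via the standard eigenvalue problem and Rayleigh quotient (a generic characterization, not a result of the paper):

$$
(-\Delta_{g(t)} + a R_{g(t)})\,\varphi = \lambda(t)\,\varphi,
\qquad
\lambda(t) = \inf_{\varphi \neq 0}
\frac{\int_M \big( |\nabla \varphi|^2 + a R\,\varphi^2 \big)\, dV_{g(t)}}
     {\int_M \varphi^2 \, dV_{g(t)}}.
$$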

Latent Dirichlet Allocation Model Training with Differential Privacy

no code implementations • 9 Oct 2020 • Fangyuan Zhao, Xuebin Ren, Shusen Yang, Qing Han, Peng Zhao, Xinyu Yang

To address the privacy issue in LDA, we systematically investigate the privacy protection of the mainstream LDA training algorithm based on Collapsed Gibbs Sampling (CGS) and propose several differentially private LDA algorithms for typical training scenarios.

Privacy Preserving
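
As a hedged illustration of the general idea in the entry above (not the authors' specific mechanisms), one common way to make a Collapsed Gibbs Sampling step differentially private is to perturb the count statistics before forming the topic-sampling distribution; the function below is a hypothetical sketch using Laplace noise.

```python
import numpy as np

def noisy_topic_distribution(doc_topic_counts, word_topic_counts, topic_counts,
                             alpha, beta, vocab_size, eps=1.0):
    """Hypothetical DP-flavored CGS step: Laplace noise is added to the
    sufficient statistics used in the standard CGS conditional."""
    noise = lambda shape: np.random.laplace(scale=1.0 / eps, size=shape)
    n_dk = np.maximum(doc_topic_counts + noise(doc_topic_counts.shape), 0)
    n_kw = np.maximum(word_topic_counts + noise(word_topic_counts.shape), 0)
    n_k = np.maximum(topic_counts + noise(topic_counts.shape), 1e-8)
    p = (n_dk + alpha) * (n_kw + beta) / (n_k + vocab_size * beta)
    return p / p.sum()
```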

CDC: Classification Driven Compression for Bandwidth Efficient Edge-Cloud Collaborative Deep Learning

no code implementations • 4 May 2020 • Yuanrui Dong, Peng Zhao, Hanqiao Yu, Cong Zhao, Shusen Yang

The emerging edge-cloud collaborative Deep Learning (DL) paradigm aims at improving the performance of practical DL implementations in terms of cloud bandwidth consumption, response latency, and data privacy preservation.

Classification • General Classification +1
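
The title of the entry above refers to compressing data in an edge-cloud collaborative pipeline; as a rough, hypothetical sketch of that general pattern (not the CDC method itself), an edge device can run the first layers of a model, quantize the intermediate feature map, and ship only the compressed bytes to the cloud. The `edge_model` callable and quantization scheme are assumptions for illustration.

```python
import zlib
import numpy as np

def edge_forward_and_compress(edge_model, x, quant_bits=8):
    """Hypothetical edge-side step: compute an intermediate feature map,
    uniformly quantize it, and compress the bytes before upload."""
    feat = edge_model(x)                      # any callable returning an ndarray
    lo, hi = feat.min(), feat.max()
    scale = (hi - lo) / (2 ** quant_bits - 1 + 1e-12)
    q = np.round((feat - lo) / (scale + 1e-12)).astype(np.uint8)
    payload = zlib.compress(q.tobytes())
    meta = (feat.shape, float(lo), float(scale))  # needed for cloud-side dequantization
    return payload, meta
```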

OL4EL: Online Learning for Edge-cloud Collaborative Learning on Heterogeneous Edges with Resource Constraints

no code implementations • 22 Apr 2020 • Qing Han, Shusen Yang, Xuebin Ren, Cong Zhao, Jingqi Zhang, Xinyu Yang

However, the heterogeneous and limited computation and communication resources on edge servers (or edges) pose great challenges to distributed ML and give rise to a new paradigm of Edge Learning (i.e., edge-cloud collaborative machine learning).

BIG-bench Machine Learning

Asynchronous Federated Learning with Differential Privacy for Edge Intelligence

no code implementations • 17 Dec 2019 • Yanan Li, Shusen Yang, Xuebin Ren, Cong Zhao

Formally, we give the first analysis of the model convergence of AFL under DP and propose a multi-stage adjustable private algorithm (MAPA) to improve the trade-off between model utility and privacy by dynamically adjusting both the noise scale and the learning rate.

Edge-computing • Federated Learning
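
The abstract above mentions dynamically adjusting both the noise scale and the learning rate; the schedule below is a purely hypothetical sketch of one such multi-stage adjustment (all names and the decay rule are illustrative, not the MAPA algorithm).

```python
def multistage_schedule(round_idx, num_rounds, sigma0=1.0, lr0=0.1,
                        num_stages=3, decay=0.5):
    """Hypothetical multi-stage schedule: both the DP noise scale and the
    learning rate are decayed by a fixed factor at each stage boundary."""
    stage = min(round_idx * num_stages // num_rounds, num_stages - 1)
    return sigma0 * (decay ** stage), lr0 * (decay ** stage)

# e.g. at round 42 of 300: sigma, lr = multistage_schedule(42, 300)
```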

Impact of Prior Knowledge and Data Correlation on Privacy Leakage: A Unified Analysis

no code implementations • 5 Jun 2019 • Yanan Li, Xuebin Ren, Shusen Yang, Xinyu Yang

Considering general correlations, a closed-form expression of privacy leakage is derived for continuous data, and a chain rule is presented for discrete data.

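If privacy leakage is measured by mutual information (one common metric; the paper's exact definition may differ), the kind of chain rule mentioned in the abstract decomposes the joint leakage of correlated records as

$$
I(X_1, \dots, X_n; Y) \;=\; \sum_{i=1}^{n} I\!\big(X_i; Y \mid X_1, \dots, X_{i-1}\big).
$$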

On Privacy Protection of Latent Dirichlet Allocation Model Training

no code implementations • 4 Jun 2019 • Fangyuan Zhao, Xuebin Ren, Shusen Yang, Xinyu Yang

Latent Dirichlet Allocation (LDA) is a popular topic modeling technique for discovering the hidden semantic structure of text datasets, and plays a fundamental role in many machine learning applications.

BIG-bench Machine Learning • Privacy Preserving
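
For readers unfamiliar with LDA itself (described in the abstract above), here is a minimal, generic topic-modeling example using scikit-learn; it illustrates plain LDA only and has nothing to do with the privacy mechanisms studied in the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["federated learning on edge devices",
        "differential privacy for topic models",
        "edge cloud collaborative inference"]

counts = CountVectorizer().fit_transform(docs)            # bag-of-words matrix
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)                     # per-document topic mixture
```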
