no code implementations • 23 Feb 2024 • Pengchao Han, Chao Huang, Geng Tian, Ming Tang, Xin Liu
We further extend the analysis to non-convex objectives and to settings where some clients may be unavailable during training.
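As a rough illustration of the partial-availability setting the analysis covers (a generic FedAvg-style sketch, not the paper's algorithm; the linear model, availability probability, and weighting below are assumptions):

```python
# Illustrative sketch only: FedAvg-style aggregation when some clients are
# unavailable in a round. Model, data, and availability pattern are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients = 5, 10
global_w = np.zeros(dim)

# Hypothetical local data: each client holds (X_i, y_i) for linear regression.
clients = []
for _ in range(num_clients):
    X = rng.normal(size=(20, dim))
    y = X @ rng.normal(size=dim) + 0.1 * rng.normal(size=20)
    clients.append((X, y))

def local_update(w, X, y, lr=0.01, steps=5):
    """A few local gradient steps on the squared loss."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

for round_ in range(20):
    # Each client is independently available with probability 0.7 (assumption).
    available = [i for i in range(num_clients) if rng.random() < 0.7]
    if not available:
        continue  # no aggregation this round
    updates, sizes = [], []
    for i in available:
        X, y = clients[i]
        updates.append(local_update(global_w.copy(), X, y))
        sizes.append(len(y))
    # Aggregate only the available clients, weighted by local data size.
    weights = np.array(sizes) / sum(sizes)
    global_w = sum(w_i * p for w_i, p in zip(updates, weights))
```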
no code implementations • 20 Dec 2023 • Pengchao Han, Shiqiang Wang, Yang Jiao, Jianwei Huang
To address these challenges, we propose an online approximation of the problem to reduce its complexity and optimize the resources to balance the needs of model training and inference.
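Purely as a toy illustration of the kind of per-slot decision such an online approximation makes (the concrete formulation is the paper's; the backlog model, budget, and proportional split below are assumptions):

```python
# Toy sketch (not the paper's method): per-time-slot greedy split of a
# resource budget between model training and inference workloads.
import random

random.seed(1)
budget = 10.0            # resource units available per slot (assumption)
train_backlog = 0.0      # pending training work
infer_backlog = 0.0      # pending inference requests

for slot in range(50):
    # Hypothetical arrivals of new work in this slot.
    train_backlog += random.uniform(2.0, 6.0)
    infer_backlog += random.uniform(2.0, 6.0)

    # Online approximation: allocate the budget proportionally to the current
    # backlogs instead of solving the long-term problem exactly.
    total = train_backlog + infer_backlog
    r_train = budget * train_backlog / total if total > 0 else 0.0
    r_infer = budget - r_train

    # Serve work with the allocated resources (1 unit serves 1 unit of work).
    train_backlog = max(0.0, train_backlog - r_train)
    infer_backlog = max(0.0, infer_backlog - r_infer)

print(f"final backlogs: train={train_backlog:.2f}, inference={infer_backlog:.2f}")
```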
no code implementations • 28 Nov 2023 • Pengchao Han, Xingyan Shi, Jianwei Huang
In this paper, we propose Federated knowledge distillation enabled by Adversarial Learning (FedAL) to address the data heterogeneity among clients.
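As an illustrative sketch of coupling distillation with an adversarial discriminator that tries to identify which client produced a given output (not the FedAL algorithm itself; the models, loss terms, and weights below are assumptions):

```python
# Illustrative sketch only (not FedAL): a client's local loss combining
# (i) distillation toward a stand-in for other clients' ensemble logits and
# (ii) an adversarial term that tries to fool a discriminator predicting
# which client produced the output. All modules and weights are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes, num_clients, batch = 10, 4, 8
student = nn.Linear(32, num_classes)                 # hypothetical client model
discriminator = nn.Linear(num_classes, num_clients)  # guesses the client id

x = torch.randn(batch, 32)
labels = torch.randint(0, num_classes, (batch,))
peer_logits = torch.randn(batch, num_classes)        # stand-in for peers' ensemble
client_id = torch.full((batch,), 2)                  # this client's index

logits = student(x)
ce = F.cross_entropy(logits, labels)

# Distillation toward the peer ensemble (temperature-scaled KL).
T = 2.0
kd = F.kl_div(F.log_softmax(logits / T, dim=1),
              F.softmax(peer_logits / T, dim=1),
              reduction="batchmean") * T * T

# Adversarial term: the client wants the discriminator to fail to identify it,
# pushing client outputs toward a client-indistinguishable distribution.
adv = -F.cross_entropy(discriminator(F.softmax(logits, dim=1)), client_id)

loss = ce + 1.0 * kd + 0.1 * adv  # weights are illustrative
loss.backward()
```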
no code implementations • 13 Oct 2023 • Chao Huang, Pengchao Han, Jianwei Huang
To this end, we propose an alternating algorithm that iteratively updates each learner's training data size and reward.
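A toy numeric sketch of such an alternating scheme, assuming a hypothetical concave learner payoff and a proportional reward adjustment (neither is taken from the paper):

```python
# Toy alternating update (not the paper's algorithm): learners pick data sizes
# given a per-sample reward; the reward is then adjusted toward a target total
# amount of contributed data. All functions and constants are hypothetical.
costs = [0.5, 1.0, 1.5]     # per-sample collection cost of each learner
reward = 1.0                # per-sample reward paid to learners
target_total = 60.0         # total data the platform would like to collect

for it in range(100):
    # Step 1: each learner best-responds to the current reward. With payoff
    # reward*m - 0.5*cost*m^2, the maximizer is m = reward / cost.
    sizes = [reward / c for c in costs]

    # Step 2: the reward is nudged so total contributions approach the target
    # (simple proportional adjustment).
    total = sum(sizes)
    reward = max(reward + 0.05 * (target_total - total), 0.0)

print("reward:", round(reward, 3), "data sizes:", [round(m, 2) for m in sizes])
```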
1 code implementation • 16 May 2023 • Xucong Wang, Pengchao Han, Lei Guo
Specifically, we introduce a Distillation with Reverse Guidance (DRG) method that considers different levels of information extracted by the model, including edge, shape, and detail of the input data, to construct a more informative teacher.
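A heavily hedged sketch of the general idea of building a teacher from multiple levels of the same network (the released DRG code defines the actual method; the two-head architecture and mixing weight below are assumptions):

```python
# Hedged sketch (not the released DRG code): mix the prediction of a shallow
# auxiliary head (closer to low-level cues such as edges and shapes) with the
# deep head's prediction to form a richer teacher, then distill the deep head
# toward that mixture. Architecture and mixing weight are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoHeadNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.shallow_head = nn.Linear(16, num_classes)   # early, low-level view
        self.deep = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.deep_head = nn.Linear(32, num_classes)      # final prediction

    def forward(self, x):
        h = self.stem(x)
        shallow = self.shallow_head(h.mean(dim=(2, 3)))
        d = self.deep(h)
        deep = self.deep_head(d.mean(dim=(2, 3)))
        return shallow, deep

net = TwoHeadNet()
x = torch.randn(4, 3, 32, 32)
y = torch.randint(0, 10, (4,))
shallow_logits, deep_logits = net(x)

# Construct a more informative teacher by mixing the two views (alpha assumed).
alpha = 0.5
teacher_prob = alpha * F.softmax(shallow_logits.detach(), dim=1) \
             + (1 - alpha) * F.softmax(deep_logits.detach(), dim=1)

loss = F.cross_entropy(deep_logits, y) \
     + F.kl_div(F.log_softmax(deep_logits, dim=1), teacher_prob,
                reduction="batchmean")
loss.backward()
```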
no code implementations • 15 Mar 2023 • Bing Luo, Xiaomin Ouyang, Peng Sun, Pengchao Han, Ningning Ding, Jianwei Huang
With the rapid advancement of 5G networks, billions of smart Internet of Things (IoT) devices at the network edge generate an enormous amount of data.
1 code implementation • 7 Nov 2020 • Pengchao Han, Jihong Park, Shiqiang Wang, Yejun Liu
Knowledge distillation (KD) has enabled remarkable progress in model compression and knowledge transfer.
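For reference, the classic temperature-scaled distillation loss that this line alludes to, with conventional example values for the temperature and weighting (not values from this paper):

```python
# Minimal reference implementation of standard knowledge distillation:
# temperature-scaled soft targets plus hard-label cross-entropy. The
# temperature and weighting are conventional examples only.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """alpha weights the soft-target term; (1 - alpha) weights the hard labels."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Example usage with random stand-in logits.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
kd_loss(student_logits, teacher_logits, labels).backward()
```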
no code implementations • 14 Jan 2020 • Pengchao Han, Shiqiang Wang, Kin K. Leung
Then, with the goal of minimizing the overall training time, we propose a novel online learning formulation and algorithm for automatically determining the near-optimal communication and computation trade-off that is controlled by the degree of gradient sparsity.
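The degree of gradient sparsity referred to here is typically realized by top-k sparsification; the fragment below shows that standard mechanism (the online tuning of k itself is the paper's contribution and is not reproduced):

```python
# Standard top-k gradient sparsification: keep only the k largest-magnitude
# gradient entries before communication. The choice of k (the degree of
# sparsity) is the knob tuned online; the tuning itself is not shown.
import torch

def sparsify_topk(grad: torch.Tensor, k: int) -> torch.Tensor:
    """Return a tensor with all but the k largest-magnitude entries zeroed."""
    flat = grad.flatten()
    if k >= flat.numel():
        return grad.clone()
    _, idx = torch.topk(flat.abs(), k)
    sparse = torch.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.view_as(grad)

g = torch.randn(4, 5)
g_sparse = sparsify_topk(g, k=5)          # send only 5 of 20 entries
print((g_sparse != 0).sum().item(), "nonzero entries communicated")
```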