no code implementations • 30 Jan 2024 • Sheng Li, Geng Yuan, Yawen Wu, Yue Dai, Chao Wu, Alex K. Jones, Jingtong Hu, Yanzhi Wang, Xulong Tang
Emerging applications, such as robot-assisted eldercare and object recognition, generally employ deep neural networks (DNNs) and naturally require: i) handling streaming-in inference requests and ii) adapting to possible deployment scenario changes.
no code implementations • 16 Jan 2024 • Ching-Hao Chiu, Yu-Jen Chen, Yawen Wu, Yiyu Shi, Tsung-Yi Ho
To overcome this, we propose a method enabling fair predictions for sensitive attributes during the testing phase without using such information during training.
no code implementations • 31 May 2023 • Dewen Zeng, Yawen Wu, Xinrong Hu, Xiaowei Xu, Jingtong Hu, Yiyu Shi
This paper presents a new way to identify additional positive pairs for BYOL, a state-of-the-art (SOTA) self-supervised learning framework, to improve its representation learning ability.
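Loosely, BYOL pulls an online network's prediction toward a target network's projection of another view of the same image; additional positive pairs can be folded into the same regression loss. The sketch below is a hypothetical NumPy illustration of that idea — the function names, the extra-pair interface, and the averaging scheme are assumptions, not the paper's implementation:

```python
import numpy as np

def l2_normalize(x):
    # Normalize each row to unit length (small epsilon avoids division by zero).
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)

def byol_loss(online_pred, target_proj):
    # BYOL regression loss: 2 - 2 * cosine similarity, averaged over the batch.
    p = l2_normalize(online_pred)
    z = l2_normalize(target_proj)
    return float(np.mean(2.0 - 2.0 * np.sum(p * z, axis=-1)))

def byol_loss_with_extra_positives(online_pred, target_proj, extra_pairs):
    # Standard BYOL pairs (two views of the same image) plus additional
    # positive pairs (i, j) identified by some external similarity criterion.
    losses = [byol_loss(online_pred, target_proj)]
    for i, j in extra_pairs:
        losses.append(byol_loss(online_pred[i:i + 1], target_proj[j:j + 1]))
    return float(np.mean(losses))
```

The loss is 0 for perfectly aligned pairs and 2 for orthogonal ones, so folding in extra positives simply adds more alignment terms to the average.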
no code implementations • 2 Dec 2022 • Jiahe Shi, Yawen Wu, Dewen Zeng, Jun Tao, Jingtong Hu, Yiyu Shi
The ubiquity of edge devices has led to a growing amount of unlabeled data produced at the edge.
no code implementations • 25 Aug 2022 • Yue Tang, Yawen Wu, Peipei Zhou, Jingtong Hu
To enable W-TAL models to learn from a long, untrimmed streaming video, we propose an efficient video learning approach that can directly adapt to new environments.
Tasks: Action Detection • Weakly-supervised Temporal Action Localization • +1
no code implementations • 24 Aug 2022 • Yawen Wu, Dewen Zeng, Zhepeng Wang, Yi Sheng, Lei Yang, Alaina J. James, Yiyu Shi, Jingtong Hu
Self-supervised learning (SSL) methods such as contrastive learning (CL) and masked autoencoders (MAE) can leverage the unlabeled data to pre-train models, which are then fine-tuned with limited labels.
no code implementations • 23 Aug 2022 • Gelei Xu, Yawen Wu, Jingtong Hu, Yiyu Shi
The framework is divided into two stages: In the first in-FL stage, clients with different skin types are trained in a federated learning process to construct a global model for all skin types.
no code implementations • 7 Aug 2022 • Yawen Wu, Dewen Zeng, Zhepeng Wang, Yiyu Shi, Jingtong Hu
However, when adopting CL in FL, the limited data diversity on each site makes federated contrastive learning (FCL) ineffective.
no code implementations • 23 Apr 2022 • Yawen Wu, Dewen Zeng, Zhepeng Wang, Yiyu Shi, Jingtong Hu
However, in medical imaging analysis, each site may only have a limited amount of data and labels, which makes learning ineffective.
no code implementations • 4 Mar 2022 • Yawen Wu, Dewen Zeng, Xiaowei Xu, Yiyu Shi, Jingtong Hu
By pruning the parameters based on this importance difference, we can reduce the accuracy difference between the privileged group and the unprivileged group to improve fairness without a large accuracy drop.
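As a rough illustration of pruning by per-group importance difference, the sketch below assumes each parameter already carries an importance score for the privileged and the unprivileged group (how those scores are obtained is not shown), and prunes the parameters whose importance gap in favor of the privileged group is largest. `fairness_aware_prune_mask` and its criterion are hypothetical, not the paper's algorithm:

```python
import numpy as np

def fairness_aware_prune_mask(imp_privileged, imp_unprivileged, prune_ratio=0.1):
    # Per-parameter importance gap: parameters that matter far more to the
    # privileged group than to the unprivileged group are pruning candidates.
    diff = imp_privileged - imp_unprivileged
    k = int(len(diff) * prune_ratio)
    mask = np.ones_like(diff)          # 1 = keep, 0 = prune
    if k > 0:
        prune_idx = np.argsort(diff)[-k:]  # largest positive gaps
        mask[prune_idx] = 0.0
    return mask
```

Multiplying the weights by the mask removes the most group-skewed parameters, which is one way to narrow the accuracy gap between groups without retraining from scratch.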
no code implementations • 23 Feb 2022 • Yi Sheng, Junhuan Yang, Yawen Wu, Kevin Mao, Yiyu Shi, Jingtong Hu, Weiwen Jiang, Lei Yang
Results show that FaHaNa can identify a series of neural networks with higher fairness and accuracy on a dermatology dataset.
no code implementations • 14 Feb 2022 • Yawen Wu, Dewen Zeng, Zhepeng Wang, Yi Sheng, Lei Yang, Alaina J. James, Yiyu Shi, Jingtong Hu
The recently developed self-supervised learning approach, contrastive learning (CL), can leverage the unlabeled data to pre-train a model, after which the model is fine-tuned on limited labeled data for dermatological disease diagnosis.
no code implementations • 14 Feb 2022 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Yiyu Shi, Jingtong Hu
To tackle this problem, we propose a data generation framework with two methods to improve CL training by joint sample generation and contrastive learning.
no code implementations • 21 Nov 2021 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Meng Li, Yiyu Shi, Jingtong Hu
To tackle this problem, we propose a collaborative contrastive learning framework with two components, feature fusion and neighborhood matching, which learn a unified feature space across clients for better data representations.
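One simple way to picture feature fusion is that each client enlarges its contrastive negatives with feature vectors shared by other clients, compensating for its limited local data diversity. The NumPy sketch below is a minimal illustration under that assumption (the function name and the exact fusion rule are hypothetical):

```python
import numpy as np

def _norm(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def contrastive_loss_with_remote_negatives(anchors, positives, remote, t=0.1):
    # Each anchor is contrasted against its positive and against feature
    # vectors shared by other clients ("remote" negatives).
    a, p, r = _norm(anchors), _norm(positives), _norm(remote)
    pos = np.sum(a * p, axis=1, keepdims=True) / t   # (N, 1) positive logits
    neg = a @ r.T / t                                # (N, M) remote negatives
    logits = np.concatenate([pos, neg], axis=1)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob_pos = logits[:, 0] - np.log(np.exp(logits).sum(axis=1))
    return float(-np.mean(log_prob_pos))
```

A remote negative that coincides with the anchor raises the loss sharply, while an unrelated one barely affects it — exactly the extra discriminative signal a lone client would otherwise lack.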
no code implementations • 29 Sep 2021 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Meng Li, Yiyu Shi, Jingtong Hu
Federated learning (FL) enables distributed clients to learn a shared model for prediction while keeping the training data local on each client.
no code implementations • 29 Sep 2021 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Yiyu Shi, Jingtong Hu
In this way, the main model learns to cluster hard positives by pulling the representations of similar yet distinct samples together, yielding well-clustered representations and better-learned features.
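This "pulling together" of positive representations is commonly instantiated as an InfoNCE-style contrastive loss, where each anchor's positive sits on the diagonal of a similarity matrix. The NumPy version below is a minimal generic sketch, not the paper's exact objective:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    # Pull each anchor toward its positive (diagonal) and push it away
    # from all other samples in the batch (off-diagonal).
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                   # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))       # positives on the diagonal
```

When anchors already match their positives the loss is near zero; mismatched pairs drive it up, which is what pulls similar samples into the same cluster during training.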
1 code implementation • 16 Jun 2021 • Dewen Zeng, Yawen Wu, Xinrong Hu, Xiaowei Xu, Haiyun Yuan, Meiping Huang, Jian Zhuang, Jingtong Hu, Yiyu Shi
The success of deep learning heavily depends on the availability of large labeled training sets.
no code implementations • 7 Jun 2021 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Yiyu Shi, Jingtong Hu
After a model is deployed on edge devices, it is desirable for these devices to learn from unlabeled data to continuously improve accuracy.
no code implementations • 1 Jan 2021 • Yawen Wu, Zhepeng Wang, Dewen Zeng, Yiyu Shi, Jingtong Hu
In this paper, we propose a framework to automatically select the most representative data from an unlabeled input stream on-the-fly, requiring only a small data buffer for dynamic learning.
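A small on-device buffer can be kept representative with a diversity heuristic: admit a new sample only when it is farther from everything buffered than the buffer's most redundant pair is from each other, and evict one member of that pair. The sketch below is one such hypothetical criterion in feature space, not the paper's selection rule:

```python
import numpy as np

def update_buffer(buffer, x, capacity):
    # Diversity-driven buffer update (hypothetical criterion): replace one of
    # the two most-similar buffered samples when the newcomer adds diversity.
    if len(buffer) < capacity:
        buffer.append(x)
        return buffer
    feats = np.stack(buffer)
    # Pairwise distances within the buffer; ignore self-distances.
    d = np.linalg.norm(feats[:, None] - feats[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    i, j = np.unravel_index(np.argmin(d), d.shape)   # most redundant pair
    new_min = np.min(np.linalg.norm(feats - x, axis=-1))
    if new_min > d[i, j]:
        buffer[i] = x   # evict one of the redundant pair
    return buffer
```

The buffer size stays fixed, so memory cost is constant regardless of stream length — the property an edge device needs.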
no code implementations • 7 Jul 2020 • Yawen Wu, Zhepeng Wang, Yiyu Shi, Jingtong Hu
For example, when training ResNet-110 on CIFAR-10, we achieve 68% computation savings while preserving full accuracy, and 75% computation savings with a marginal accuracy loss of 1.3%.
no code implementations • 23 Apr 2020 • Yawen Wu, Zhepeng Wang, Zhenge Jia, Yiyu Shi, Jingtong Hu
This work aims to enable persistent, event-driven sensing and decision-making capabilities for energy-harvesting (EH)-powered devices by deploying lightweight DNNs onto them.