no code implementations • 21 Jan 2024 • Jichang Li, Guanbin Li, Yizhou Yu
However, existing SSDA work fails to make full use of label information from both the source and target domains for cross-domain feature alignment, resulting in label mismatch in the label space at test time.
Semi-supervised Domain Adaptation • Unsupervised Domain Adaptation
Once the graph has been refined, Adaptive Betweenness Clustering is introduced to facilitate semantic transfer by using across-domain betweenness clustering and within-domain betweenness clustering, thereby propagating semantic label information from labeled samples across domains to unlabeled target data.
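The propagation step described above — spreading semantic label information from labeled samples to unlabeled target data over a graph — can be illustrated with a toy label-propagation sketch. This is not the paper's Adaptive Betweenness Clustering objective, only a minimal stand-in on a kNN similarity graph; the function name and parameters (`k`, `n_iters`) are hypothetical.

```python
import numpy as np

def propagate_labels(features, labels, labeled_mask, k=3, n_iters=20):
    """Toy label propagation on a kNN cosine-similarity graph.

    Illustrative stand-in: labeled nodes are clamped to their labels,
    and scores diffuse to unlabeled target nodes over graph edges.
    """
    n = len(features)
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sim[i])[-k:]      # k nearest neighbours of node i
        W[i, nbrs] = np.maximum(sim[i, nbrs], 0.0)
    W = np.maximum(W, W.T)                  # symmetrise the graph
    D_inv = 1.0 / np.maximum(W.sum(1), 1e-12)
    n_classes = int(labels.max()) + 1
    Y = np.eye(n_classes)[labels] * labeled_mask[:, None]
    F = Y.copy()
    for _ in range(n_iters):
        F = (W * D_inv[:, None]) @ F        # diffuse neighbour label scores
        F[labeled_mask] = Y[labeled_mask]   # clamp labeled nodes
    return F.argmax(1)
```

With two well-separated clusters and one labeled seed per cluster, the unlabeled nodes inherit the seed's class through the graph.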
1 code implementation • 19 Dec 2023 • Jichang Li, Guanbin Li, Hui Cheng, Zicheng Liao, Yizhou Yu
However, these prior methods do not learn noise filters by exploiting knowledge across all clients, leading to sub-optimal noise filtering and thus unstable training.
1 code implementation • CVPR 2023 • Duojun Huang, Jichang Li, Weikai Chen, Junshi Huang, Zhenhua Chai, Guanbin Li
To accommodate active learning and domain adaptation, two naturally different tasks, within a collaborative framework, we advocate that a customized learning strategy for the target data is key to the success of ADA solutions.
1 code implementation • 5 Aug 2022 • Jichang Li, Guanbin Li, Feng Liu, Yizhou Yu
Specifically, our method consists of two steps: 1) Neighborhood Collective Noise Verification, which separates all training samples into clean and noisy subsets, and 2) Neighborhood Collective Label Correction, which relabels the noisy samples; auxiliary techniques are then used to assist further model optimization.
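The two-step pattern above — first verify which labels are noisy, then correct them — can be sketched with a simple neighborhood majority vote. This is an illustrative simplification, not the paper's exact algorithm; the function name and the choice of `k` are hypothetical.

```python
import numpy as np
from collections import Counter

def neighborhood_verify_and_correct(features, noisy_labels, k=3):
    """Illustrative sketch: a sample is 'clean' if its label agrees with
    the majority label of its k nearest neighbours (step 1, verification);
    samples flagged noisy are relabelled by that majority (step 2)."""
    n = len(features)
    d = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)             # exclude self-distance
    corrected = noisy_labels.copy()
    clean = np.zeros(n, dtype=bool)
    for i in range(n):
        nbrs = np.argsort(d[i])[:k]
        vote, _ = Counter(noisy_labels[nbrs].tolist()).most_common(1)[0]
        clean[i] = (vote == noisy_labels[i])  # step 1: verification
        if not clean[i]:
            corrected[i] = vote               # step 2: correction
    return clean, corrected
```

A mislabeled sample surrounded by neighbours of another class is flagged as noisy and relabelled, while consistently labeled samples pass verification unchanged.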
2 code implementations • CVPR 2021 • Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu
Pseudo labeling expands the number of "labeled" samples in each class in the target domain, and thus produces a more robust and powerful cluster core for each class to facilitate adversarial learning.
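The idea of growing the "labeled" target set to strengthen per-class cluster cores can be sketched as confidence-thresholded pseudo labeling. This is only an illustrative sketch, not the paper's method; the function name and the threshold `tau` are hypothetical.

```python
import numpy as np

def pseudo_label_prototypes(feat_l, y_l, feat_u, n_classes, tau=0.8):
    """Sketch: unlabeled target samples whose cosine similarity to the
    nearest class prototype exceeds `tau` are adopted as pseudo-labeled,
    yielding a more robust prototype (cluster core) per class."""
    def norm(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)
    # Initial prototypes from the labeled samples only
    protos = norm(np.stack([feat_l[y_l == c].mean(0) for c in range(n_classes)]))
    sims = norm(feat_u) @ protos.T               # similarity to each core
    conf, pseudo = sims.max(1), sims.argmax(1)
    keep = conf > tau                            # keep confident samples only
    feats = np.concatenate([feat_l, feat_u[keep]])
    ys = np.concatenate([y_l, pseudo[keep]])
    # Recompute cluster cores over labeled + pseudo-labeled samples
    new_protos = norm(np.stack([feats[ys == c].mean(0) for c in range(n_classes)]))
    return pseudo[keep], keep, new_protos
```

Samples close to a class core are absorbed into that class, while ambiguous samples (similar to several cores) are left out, so the recomputed cores stay clean.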