1 code implementation • NeurIPS 2023 • Wenxuan Ma, Shuang Li, Lincan Cai, Jingxuan Kang
Therefore, to achieve data-efficient learning, researchers typically explore approaches that can leverage more related or unlabeled data without necessitating additional manual labeling efforts, such as Semi-Supervised Learning (SSL), Transfer Learning (TL), and Data Augmentation (DA).
1 code implementation • ICCV 2023 • Wenxuan Ma, Shuang Li, Jinming Zhang, Chi Harold Liu, Jingxuan Kang, Yulin Wang, Gao Huang
To address this issue, this paper presents a novel approach that seeks to leverage linguistic knowledge for data-efficient visual learning.
1 code implementation • 24 Dec 2022 • Wenxuan Ma, Xing Yan, Kun Zhang
A tree is built from the training data, whose leaf nodes represent different regions; within each region, a region-specific neural network is trained to predict both the mean and the variance for quantifying uncertainty.
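The idea of a region-partitioning tree with per-leaf mean/variance estimators can be sketched as follows. This is a minimal toy version, not the paper's implementation: splits are on the median of the widest feature, and each leaf stores simple Gaussian statistics as a stand-in for the paper's region-specific neural networks.

```python
import numpy as np

def build_tree(X, y, depth=0, max_depth=2, min_leaf=8):
    """Recursively partition the input space; each leaf stores a
    region-specific mean and variance (toy stand-in for the paper's
    region-specific neural networks)."""
    if depth == max_depth or len(y) < 2 * min_leaf:
        return {"mean": float(np.mean(y)), "var": float(np.var(y) + 1e-6)}
    f = int(np.argmax(X.max(axis=0) - X.min(axis=0)))  # split the widest feature
    t = float(np.median(X[:, f]))
    left = X[:, f] <= t
    if left.all() or (~left).all():  # degenerate split -> make a leaf
        return {"mean": float(np.mean(y)), "var": float(np.var(y) + 1e-6)}
    return {"feat": f, "thr": t,
            "left": build_tree(X[left], y[left], depth + 1, max_depth, min_leaf),
            "right": build_tree(X[~left], y[~left], depth + 1, max_depth, min_leaf)}

def predict(node, x):
    """Route a point to its region's leaf and return (mean, variance)."""
    while "feat" in node:
        node = node["left"] if x[node["feat"]] <= node["thr"] else node["right"]
    return node["mean"], node["var"]
```

On data whose conditional mean differs by region, each leaf recovers its own region's statistics, which is the mechanism the abstract describes.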
1 code implementation • 26 Nov 2022 • Xing Yan, Yonghua Su, Wenxuan Ma
We seek an adaptive balance between structural integrity and flexibility for $\mathbb{P}(\mathbf{y}|\mathbf{X}=x)$: the Gaussian assumption lacks the flexibility needed for real data, while highly flexible approaches (e.g., estimating the quantiles separately without a distributional structure) have their own drawbacks and may not generalize well.
1 code implementation • 2 Aug 2022 • Wenxuan Ma, Jinming Zhang, Shuang Li, Chi Harold Liu, Yulin Wang, Wei Li
To alleviate these issues, we propose to simultaneously conduct feature alignment in two individual spaces focusing on different domains, and create for each space a domain-oriented classifier tailored specifically for that domain.
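Structurally, the dual-space design amounts to two feature extractors, one oriented toward each domain, each paired with its own classifier. The sketch below shows only that forward structure with randomly initialized linear heads; all names and dimensions are illustrative, and the paper trains these components with its own alignment objectives rather than leaving them random.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, C = 16, 8, 3  # illustrative input dim, feature dim, number of classes

# Two projection heads, one per feature space (source- and target-oriented),
# each with its own domain-oriented classifier (weights random here).
W_src, W_tgt = rng.normal(size=(D, H)), rng.normal(size=(D, H))
C_src, C_tgt = rng.normal(size=(H, C)), rng.normal(size=(H, C))

def forward(x):
    """Map inputs into two individual feature spaces and classify
    each with its domain-oriented classifier."""
    f_src = np.tanh(x @ W_src)  # feature in the source-oriented space
    f_tgt = np.tanh(x @ W_tgt)  # feature in the target-oriented space
    return f_src @ C_src, f_tgt @ C_tgt
```

Feature alignment is then performed separately in each space, so each classifier can specialize in its own domain.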
no code implementations • 25 Nov 2021 • Wenxuan Ma, Jinming Zhang, Shuang Li, Chi Harold Liu, Yulin Wang, Wei Li
Unsupervised Domain Adaptation (UDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain.
1 code implementation • CVPR 2021 • Shuang Li, Jinming Zhang, Wenxuan Ma, Chi Harold Liu, Wei Li
Domain adaptation (DA) enables knowledge transfer from a labeled source domain to an unlabeled target domain by reducing the cross-domain distribution discrepancy.
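One standard measure of the cross-domain distribution discrepancy that DA methods minimize is the maximum mean discrepancy (MMD); the papers listed here use their own alignment objectives, so the RBF-kernel MMD below is only an illustrative example of the quantity being reduced.

```python
import numpy as np

def rbf_mmd2(Xs, Xt, sigma=1.0):
    """Squared maximum mean discrepancy between source samples Xs and
    target samples Xt under an RBF kernel: a common estimate of the
    cross-domain distribution discrepancy."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return k(Xs, Xs).mean() + k(Xt, Xt).mean() - 2.0 * k(Xs, Xt).mean()
```

The value is near zero when the two domains share a distribution and grows as they diverge, which is why minimizing it over learned features encourages domain-invariant representations.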