1 code implementation • 22 Jan 2025 • Wei Tang, Yin-Fang Yang, Zhaofei Wang, Weijia Zhang, Min-Ling Zhang
Multi-instance partial-label learning (MIPL) is an emerging learning framework where each training sample is represented as a multi-instance bag associated with a candidate label set.
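As a quick illustration of this setup, a minimal sketch in Python; all names and sizes are illustrative, not from the paper:

```python
import numpy as np

# MIPL data setup: each training sample is a bag of instance feature
# vectors paired with a candidate label set that contains the (unknown)
# ground-truth bag label.
rng = np.random.default_rng(0)

bag = rng.normal(size=(5, 16))          # 5 instances, 16-dim features
candidate_labels = {2, 4, 7}            # exactly one of these is correct
num_classes = 10

# A common starting point for disambiguation: spread uniform confidence
# over the candidate set and zero elsewhere.
confidence = np.zeros(num_classes)
confidence[list(candidate_labels)] = 1.0 / len(candidate_labels)
print(confidence)
```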
1 code implementation • 19 Oct 2024 • Xin Liu, Weijia Zhang, Min-Ling Zhang
In this paper, we introduce HACSurv, a survival analysis method that learns hierarchical Archimedean copula structures and cause-specific survival functions from data with competing risks.
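For intuition, a sketch of one standard Archimedean building block, the Clayton copula, coupling two marginal survival probabilities; HACSurv itself learns a hierarchy of such copulas, which this snippet does not attempt:

```python
import numpy as np

# Clayton copula, one standard Archimedean family, coupling two
# cause-specific survival probabilities.
def clayton_copula(u, v, theta=2.0):
    """C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

# Joint survival of two competing risks at some time t, given the
# marginal survival probabilities S1(t) and S2(t).
s1, s2 = 0.8, 0.6
print(clayton_copula(s1, s2))           # > s1 * s2: positive dependence
```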
1 code implementation • 8 Oct 2024 • Zi-Hao Zhou, Siyuan Fang, Zi-Jing Zhou, Tong Wei, Yuanyu Wan, Min-Ling Zhang
By progressively estimating the underlying label distribution and optimizing its alignment with model predictions, we tackle the diverse distribution of unlabeled data in real-world scenarios.
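A hedged sketch of the alignment idea, assuming a simple reweight-and-renormalize scheme rather than the authors' exact algorithm:

```python
import torch

# Rescale pseudo-label probabilities so the marginal over the unlabeled
# batch matches a running estimate of the underlying class distribution.
def align_pseudo_labels(probs, est_prior, eps=1e-8):
    """probs: (N, C) model predictions; est_prior: (C,) estimated prior."""
    marginal = probs.mean(dim=0)                      # current marginal
    aligned = probs * (est_prior / (marginal + eps))  # reweight per class
    return aligned / aligned.sum(dim=1, keepdim=True) # renormalize rows

probs = torch.softmax(torch.randn(4, 3), dim=1)
prior = torch.tensor([0.5, 0.3, 0.2])
print(align_pseudo_labels(probs, prior).sum(dim=1))   # rows sum to 1
```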
1 code implementation • 29 Sep 2024 • Tong Wei, Hao-Tian Li, Chun-Shu Li, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang
The proposed framework establishes a noisy label detector by learning positive and negative textual prompts for each class.
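A rough sketch of how such a detector could flag a noisy label, with random tensors standing in for a real vision-language encoder such as CLIP; the actual prompt learning is omitted:

```python
import torch
import torch.nn.functional as F

# Each class has a learned "positive" and "negative" prompt embedding.
# A sample's given label is flagged as noisy when its image feature is
# closer to the negative prompt than to the positive one.
num_classes, dim = 5, 32
pos_prompts = torch.randn(num_classes, dim)   # learned in practice
neg_prompts = torch.randn(num_classes, dim)   # learned in practice

def is_label_noisy(image_feat, given_label):
    pos_sim = F.cosine_similarity(image_feat, pos_prompts[given_label], dim=0)
    neg_sim = F.cosine_similarity(image_feat, neg_prompts[given_label], dim=0)
    return bool(neg_sim > pos_sim)

print(is_label_noisy(torch.randn(dim), given_label=3))
```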
1 code implementation • 26 Aug 2024 • Wei Tang, Weijia Zhang, Min-Ling Zhang
To achieve this, we extract the label information embedded in both candidate and non-candidate label sets, incorporating the intrinsic properties of the label space.
no code implementations • 18 Aug 2024 • Xin Liu, Weijia Zhang, Min-Ling Zhang
Starting from the standard MIL assumptions, we propose a surprisingly simple yet effective instance-based MIL method for WSI classification (FocusMIL), based on max-pooling and forward amortized variational inference.
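A minimal sketch of the max-pooling, instance-based part of this idea; the forward amortized variational inference is omitted:

```python
import torch
import torch.nn as nn

# Score every instance, then take the bag score as the maximum instance
# score, matching the standard MIL assumption that a positive bag
# contains at least one positive instance.
class MaxPoolingMIL(nn.Module):
    def __init__(self, in_dim=64):
        super().__init__()
        self.instance_scorer = nn.Linear(in_dim, 1)

    def forward(self, bag):                    # bag: (num_instances, in_dim)
        scores = self.instance_scorer(bag)     # per-instance logits
        return scores.max(dim=0).values        # bag logit via max-pooling

bag = torch.randn(12, 64)                      # 12 instances (e.g., patches)
print(MaxPoolingMIL()(bag))
```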
1 code implementation • 19 Jun 2024 • Kai Gan, Tong Wei, Min-Ling Zhang
While long-tailed semi-supervised learning (LTSSL) has received tremendous attention in many real-world classification problems, existing LTSSL algorithms typically assume that the class distributions of labeled and unlabeled data are almost identical.
no code implementations • CVPR 2024 • Dong-Dong Wu, Chilin Fu, Weichang Wu, Wenwen Xia, Xiaolu Zhang, Jun Zhou, Min-Ling Zhang
With the escalating complexity and investment cost of training deep neural networks, safeguarding them from unauthorized usage and intellectual property theft has become imperative.
1 code implementation • 14 Dec 2023 • Tong Wei, Bo-Lin Wang, Min-Ling Zhang
The main difficulty lies in distinguishing OOD data from samples belonging to the tail classes, as the ability of a classifier to detect OOD instances is not strongly correlated with its accuracy on the in-distribution classes.
1 code implementation • 21 Sep 2023 • Bo Ye, Kai Gan, Tong Wei, Min-Ling Zhang
In open-world semi-supervised learning, a machine learning model is tasked with uncovering novel categories from unlabeled data while maintaining performance on seen categories from labeled data.
Novel Class Discovery
Open-World Semi-Supervised Learning
no code implementations • 31 Aug 2023 • Yu Shi, Dong-Dong Wu, Xin Geng, Min-Ling Zhang
This is known as Unreliable Partial Label Learning (UPLL), which introduces additional complexity due to the inherent unreliability and ambiguity of partial labels, often resulting in sub-optimal performance with existing methods.
1 code implementation • AAAI 2023 • Xin Cheng, Deng-Bao Wang, Lei Feng, Min-Ling Zhang, Bo An
Our proposed methods are theoretically grounded and compatible with any model, optimizer, and loss.
1 code implementation • 17 May 2023 • Yuheng Jia, Chongjie Si, Min-Ling Zhang
The non-candidate labels (i.e., complementary labels) accurately indicate a set of labels that do not belong to a sample.
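As an illustration of one simple way such labels can be used (not the paper's method), a loss that pushes the predicted probability of every complementary label toward zero:

```python
import torch

# Complementary labels are known *not* to belong to a sample, so a
# simple loss drives their predicted probabilities to zero.
def complementary_loss(probs, comp_mask, eps=1e-8):
    """probs: (N, C); comp_mask: (N, C) boolean, True = complementary."""
    return -(torch.log(1.0 - probs + eps) * comp_mask.float()).sum(dim=1).mean()

probs = torch.softmax(torch.randn(4, 6), dim=1)
comp_mask = torch.zeros(4, 6, dtype=torch.bool)
comp_mask[:, [0, 5]] = True               # labels 0 and 5 known to be wrong
print(complementary_loss(probs, comp_mask))
```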
no code implementations • 10 May 2023 • Hanwen Deng, Weijia Zhang, Min-Ling Zhang
Label noise widely exists in large-scale datasets and significantly degrades the performance of deep learning algorithms.
no code implementations • 27 Mar 2023 • Zhaofei Wang, Weijia Zhang, Min-Ling Zhang
However, since such approaches utilize only the highest-scoring proposal and discard the potentially useful information in the other proposals, their independent MIL backbones often limit models to the salient parts of an object or cause them to detect only one object per class.
no code implementations • 25 Feb 2023 • Yi Gao, Miao Xu, Min-Ling Zhang
Moreover, theoretical findings reveal that calculating a transition matrix from label correlations in multi-labeled CLL (ML-CLL) requires multi-labeled data, which is unavailable in the ML-CLL setting.
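For intuition about the transition-matrix machinery, a sketch with illustrative shapes, assuming a uniform off-diagonal transition matrix:

```python
import torch

# If T[i, j] = P(complementary label j | true label i), then predicted
# true-label probabilities p map to complementary-label probabilities
# q = T^T p, which can be fit against observed complementary labels.
num_classes = 4
# Uniform off-diagonal transition: every wrong class is equally likely
# to be reported as "not this sample's label".
T = (1.0 - torch.eye(num_classes)) / (num_classes - 1)

p = torch.softmax(torch.randn(2, num_classes), dim=1)  # model predictions
q = p @ T                                              # (T^T p) per row
comp_label = torch.tensor([0, 2])                      # observed comp. labels
loss = -torch.log(q[torch.arange(2), comp_label] + 1e-8).mean()
print(loss)
```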
1 code implementation • CVPR 2023 • Deng-Bao Wang, Lanqing Li, Peilin Zhao, Pheng-Ann Heng, Min-Ling Zhang
It has recently been found that models trained with mixup also perform well on uncertainty calibration.
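For reference, standard mixup in a few lines; the paper analyzes why models trained this way calibrate well, which a snippet cannot show:

```python
import torch

# Mixup: train on convex combinations of input pairs and their one-hot
# labels, sampled with a Beta-distributed mixing coefficient.
def mixup(x, y_onehot, alpha=0.4):
    lam = torch.distributions.Beta(alpha, alpha).sample()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix

x = torch.randn(8, 3, 32, 32)
y = torch.eye(10)[torch.randint(0, 10, (8,))]
x_mix, y_mix = mixup(x, y)
print(x_mix.shape, y_mix.sum(dim=1))   # soft labels still sum to 1
```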
1 code implementation • 18 Dec 2022 • Wei Tang, Weijia Zhang, Min-Ling Zhang
MIPLGP first assigns each instance a candidate label set in an augmented label space, then transforms the candidate label set into a logarithmic space to yield disambiguated and continuous labels via an exclusive disambiguation strategy, and finally induces a model based on Gaussian processes.
4 code implementations • 8 Oct 2022 • Tong Wei, Zhen Mao, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang
Multi-label learning has attracted significant attention from both academia and industry in recent decades.
2 code implementations • 9 Jul 2022 • Chongjie Si, Yuheng Jia, Ran Wang, Min-Ling Zhang, Yanghe Feng, Chongxiao Qu
Previous methods capture high-order label correlations mainly by transforming the label matrix into a latent label space via low-rank matrix factorization.
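For intuition, the low-rank view in a few lines of NumPy, using a truncated SVD as a stand-in for the learned factorization:

```python
import numpy as np

# Factorize the label matrix Y (n samples x q labels) as Y ~ U V, so the
# rows of V span a latent label space capturing label correlations.
rng = np.random.default_rng(0)
Y = (rng.random((100, 12)) < 0.2).astype(float)   # toy multi-label matrix

k = 4                                             # latent dimension
U_svd, s, Vt = np.linalg.svd(Y, full_matrices=False)
U = U_svd[:, :k] * s[:k]                          # latent representation
V = Vt[:k, :]                                     # latent label basis

print(np.linalg.norm(Y - U @ V))                  # rank-k reconstruction error
```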
1 code implementation • 1 Jun 2022 • Ning Xu, Congyu Qiao, Jiaqi Lv, Xin Geng, Min-Ling Zhang
To cope with this challenge, we investigate single-positive multi-label learning (SPMLL), where each example is annotated with only one relevant label, and show that one can successfully learn a theoretically grounded multi-label classifier for the problem.
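For context, a sketch of the naive "assumed negative" baseline for this setting, which treats every unobserved label as negative; the paper's estimator improves on this:

```python
import torch

# With one observed positive per example, the naive baseline treats all
# other labels as negative and applies binary cross-entropy.
def assumed_negative_loss(logits, pos_idx):
    """logits: (N, C); pos_idx: (N,) index of the single observed label."""
    probs = torch.sigmoid(logits)
    target = torch.zeros_like(probs)
    target[torch.arange(logits.size(0)), pos_idx] = 1.0
    return torch.nn.functional.binary_cross_entropy(probs, target)

logits = torch.randn(4, 6)
print(assumed_negative_loss(logits, torch.tensor([1, 3, 0, 5])))
```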
1 code implementation • 25 Feb 2022 • Weijia Zhang, Xuanhui Zhang, Han-Wen Deng, Min-Ling Zhang
Multi-instance learning (MIL) deals with objects represented as bags of instances and can predict instance labels from bag-level supervision.
Multiple Instance Learning
Out-of-Distribution Generalization
no code implementations • NeurIPS 2021 • Deng-Bao Wang, Lei Feng, Min-Ling Zhang
Accurate uncertainty quantification for the predictions of deep neural networks is important in many real-world decision-making applications.
1 code implementation • NeurIPS 2021 • Ning Xu, Congyu Qiao, Xin Geng, Min-Ling Zhang
In this paper, we consider instance-dependent PLL and assume that each example is associated with a latent label distribution constituted by the real-valued description degree of each label, representing the degree to which each label describes the feature.
no code implementations • 22 Oct 2021 • Tong Wei, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang
Deep neural networks have been shown to be very powerful methods for many supervised learning tasks.
no code implementations • 1 Apr 2021 • Hao Yang, Youzhi Jin, Ziyin Li, Deng-Bao Wang, Lei Miao, Xin Geng, Min-Ling Zhang
During the training process, DLT records the loss value of each sample and calculates dynamic loss thresholds.
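A hedged sketch of the thresholding idea, assuming a simple percentile rule rather than DLT's exact schedule:

```python
import torch

# Record per-sample losses each epoch and keep only samples whose loss
# falls below a dynamically chosen percentile, treating high-loss
# samples as likely mislabeled.
def select_clean(per_sample_losses, keep_ratio=0.7):
    threshold = torch.quantile(per_sample_losses, keep_ratio)
    return per_sample_losses <= threshold        # boolean mask of "clean"

losses = torch.rand(10) * 3.0                    # recorded during training
mask = select_clean(losses)
print(mask, losses[mask].max())
```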
no code implementations • NeurIPS 2020 • Wei Wang, Min-Ling Zhang
After that, a maximum margin formulation is introduced to jointly enable the induction of the predictive model and the estimation of labeling confidence over unlabeled data.
1 code implementation • 2018 • Jing Wang, Min-Ling Zhang
Partial label (PL) learning aims to induce a multi-class classifier from training examples where each of them is associated with a set of candidate labels, among which only one is valid.
1 code implementation • IJCAI 2014 • Min-Ling Zhang, Lei Wu
Existing approaches learn from multi-label data by manipulating an identical feature set, i.e., the same instance representation of each example is employed in the discrimination processes of all class labels.
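By contrast, a simplified sketch of one way to build label-specific features; the clustering-based construction here is illustrative, not the paper's exact algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans

# For one label, cluster its positive and negative instances separately
# and represent every example by its distances to the cluster centers,
# giving that label its own feature space.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                 # instances
y = rng.random(60) < 0.3                     # membership for one label

def label_specific_features(X, y, m=2):
    pos_centers = KMeans(m, n_init=10).fit(X[y]).cluster_centers_
    neg_centers = KMeans(m, n_init=10).fit(X[~y]).cluster_centers_
    centers = np.vstack([pos_centers, neg_centers])
    # Distance of every instance to every center -> new representation.
    return np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)

print(label_specific_features(X, y).shape)   # (60, 2*m)
```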