Search Results for author: Min-Ling Zhang

Found 21 papers, 13 papers with code

EAT: Towards Long-Tailed Out-of-Distribution Detection

1 code implementation14 Dec 2023 Tong Wei, Bo-Lin Wang, Min-Ling Zhang

The main difficulty lies in distinguishing OOD data from samples belonging to the tail classes, as the ability of a classifier to detect OOD instances is not strongly correlated with its accuracy on the in-distribution classes.

Long-tail Learning Out-of-Distribution Detection +1
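
A common reference point for the OOD-detection ability discussed above is the maximum softmax probability score; the sketch below shows that baseline only (it is not EAT itself), and the toy logits are made-up values.

```python
import numpy as np

# Hedged sketch: maximum-softmax-probability (MSP) OOD scoring, a standard
# baseline for the detection ability discussed above -- not the EAT method.
def msp_score(logits):
    z = logits - logits.max(axis=1, keepdims=True)   # stabilize the softmax
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)
    return p.max(axis=1)   # low confidence suggests an out-of-distribution input

# toy logits for two inputs over three in-distribution classes
print(msp_score(np.array([[4.0, 0.1, -1.0], [0.3, 0.2, 0.1]])))
```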

Bridging the Gap: Learning Pace Synchronization for Open-World Semi-Supervised Learning

1 code implementation21 Sep 2023 Bo Ye, Kai Gan, Tong Wei, Min-Ling Zhang

In open-world semi-supervised learning, a machine learning model is tasked with uncovering novel categories from unlabeled data while maintaining performance on seen categories from labeled data.

Novel Class Discovery Open-World Semi-Supervised Learning +1

Robust Representation Learning for Unreliable Partial Label Learning

no code implementations31 Aug 2023 Yu Shi, Dong-Dong Wu, Xin Geng, Min-Ling Zhang

This is known as Unreliable Partial Label Learning (UPLL), which introduces additional complexity due to the inherent unreliability and ambiguity of partial labels, often resulting in sub-optimal performance with existing methods.

Contrastive Learning Partial Label Learning +2

Partial-Label Regression

1 code implementation AAAI 2023 Xin Cheng, Deng-Bao Wang, Lei Feng, Min-Ling Zhang, Bo An

Our proposed methods are theoretically grounded and are compatible with any models, optimizers, and losses.

Partial Label Learning regression +1

Complementary Classifier Induced Partial Label Learning

1 code implementation17 May 2023 Yuheng Jia, Chongjie Si, Min-Ling Zhang

Non-candidate labels (i.e., complementary labels) accurately indicate a set of labels that do not belong to a sample.

Partial Label Learning
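
As a toy illustration of the complementary labels described above, the snippet below takes the complement of a hypothetical candidate set over a small label space; the class count and candidate set are made-up values.

```python
# Hedged toy example: for a 5-class partial-label instance, the labels
# outside the candidate set are its complementary labels, i.e. labels
# guaranteed not to belong to the sample.
num_classes = 5
candidates = {0, 3}                                  # hypothetical candidate set
complementary = set(range(num_classes)) - candidates
print(complementary)                                 # {1, 2, 4}
```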

Rethinking the Value of Labels for Instance-Dependent Label Noise Learning

no code implementations10 May 2023 Hanwen Deng, Weijia Zhang, Min-Ling Zhang

Label noise widely exists in large-scale datasets and significantly degrades the performance of deep learning algorithms.

Representation Learning

Transformer-based Multi-Instance Learning for Weakly Supervised Object Detection

no code implementations27 Mar 2023 Zhaofei Wang, Weijia Zhang, Min-Ling Zhang

However, since such approaches utilize only the highest-scoring proposal and discard the potentially useful information in other proposals, their independent MIL backbone often limits models to the salient parts of an object or causes them to detect only one object per class.

Object Detection +1
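
The criticized "highest-scoring proposal" behaviour can be made concrete with a small selection step; the proposal scores below are random toy values, not outputs of any real detector.

```python
import numpy as np

# Hedged sketch of the top-proposal selection step criticized above: a
# standard MIL head keeps only the single highest-scoring proposal per class
# and discards the information carried by all other proposals.
rng = np.random.default_rng(0)
scores = rng.random((300, 20))      # toy scores: 300 proposals x 20 classes

top_idx = scores.argmax(axis=0)     # index of the one proposal kept per class
top_scores = scores.max(axis=0)
print(top_idx[:5], top_scores[:5])
```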

Complementary to Multiple Labels: A Correlation-Aware Correction Approach

no code implementations25 Feb 2023 Yi Gao, Miao Xu, Min-Ling Zhang

Moreover, theoretical findings reveal that calculating a transition matrix from label correlations in multi-labeled CLL (ML-CLL) requires multi-labeled data, which is unavailable for ML-CLL.

Binary Classification

On the Pitfall of Mixup for Uncertainty Calibration

1 code implementation CVPR 2023 Deng-Bao Wang, Lanqing Li, Peilin Zhao, Pheng-Ann Heng, Min-Ling Zhang

It has recently been found that models trained with mixup also perform well on uncertainty calibration.
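
For reference, the standard mixup recipe whose calibration behaviour the paper studies looks roughly like the sketch below; the Beta parameter and in-batch pairing are the usual conventions, not details taken from this paper.

```python
import numpy as np

def mixup_batch(x, y_onehot, alpha=0.2, rng=np.random.default_rng(0)):
    """Hedged sketch of vanilla mixup: convexly combine a batch of inputs and
    their one-hot labels with a Beta-distributed coefficient."""
    lam = rng.beta(alpha, alpha)        # mixing coefficient
    perm = rng.permutation(len(x))      # random pairing within the batch
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix
```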

Multi-Instance Partial-Label Learning: Towards Exploiting Dual Inexact Supervision

1 code implementation18 Dec 2022 Wei Tang, Weijia Zhang, Min-Ling Zhang

MIPLGP first assigns each instance a candidate label set in an augmented label space, then transforms the candidate label set into a logarithmic space to yield disambiguated and continuous labels via an exclusive disambiguation strategy, and finally induces a model based on Gaussian processes.

Gaussian Processes Partial Label Learning

A Survey on Extreme Multi-label Learning

4 code implementations8 Oct 2022 Tong Wei, Zhen Mao, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang

Multi-label learning has attracted significant attention from both academia and industry in recent decades.

Multi-Label Learning

Multi-label Classification with High-rank and High-order Label Correlations

2 code implementations9 Jul 2022 Chongjie Si, Yuheng Jia, Ran Wang, Min-Ling Zhang, Yanghe Feng, Chongxiao Qu

Previous methods capture the high-order label correlations mainly by transforming the label matrix to a latent label space with low-rank matrix factorization.

Common Sense Reasoning Multi-Label Classification +1
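
The low-rank route taken by the previous methods mentioned above can be sketched with a truncated SVD of a toy label matrix; the matrix, its size, and the chosen rank are illustrative assumptions (the paper itself argues for high-rank correlations instead).

```python
import numpy as np

# Hedged sketch of the low-rank factorization used by the previous methods
# described above: project an n x q label matrix onto a rank-k latent space.
rng = np.random.default_rng(0)
Y = (rng.random((100, 10)) < 0.2).astype(float)   # toy n=100, q=10 label matrix

k = 3                                             # assumed latent dimensionality
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
Y_lowrank = U[:, :k] * s[:k] @ Vt[:k]             # rank-k reconstruction
print(np.linalg.norm(Y - Y_lowrank))              # reconstruction error
```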

One Positive Label is Sufficient: Single-Positive Multi-Label Learning with Label Enhancement

1 code implementation1 Jun 2022 Ning Xu, Congyu Qiao, Jiaqi Lv, Xin Geng, Min-Ling Zhang

To cope with the challenge, we investigate single-positive multi-label learning (SPMLL) where each example is annotated with only one relevant label, and show that one can successfully learn a theoretically grounded multi-label classifier for the problem.

Multi-Label Learning
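
A naive starting point for the single-positive setting described above is to treat the one observed label as positive and every other label as negative; the sketch below shows only that baseline loss (not the label-enhancement method of the paper), and the logits are toy values.

```python
import numpy as np

def assume_negative_bce(logits, pos_index):
    """Hedged sketch of the naive 'assume all unobserved labels are negative'
    baseline for single-positive multi-label learning."""
    p = 1.0 / (1.0 + np.exp(-logits))   # per-label sigmoid probabilities
    y = np.zeros_like(p)
    y[pos_index] = 1.0                  # the single observed positive label
    eps = 1e-12
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

print(assume_negative_bce(np.array([2.0, -1.0, 0.5, -3.0]), pos_index=0))
```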

Rethinking Calibration of Deep Neural Networks: Do Not Be Afraid of Overconfidence

no code implementations NeurIPS 2021 Deng-Bao Wang, Lei Feng, Min-Ling Zhang

Accurately quantifying the uncertainty of predictions from deep neural networks is important in many real-world decision-making applications.

Decision Making Uncertainty Quantification

Instance-Dependent Partial Label Learning

1 code implementation NeurIPS 2021 Ning Xu, Congyu Qiao, Xin Geng, Min-Ling Zhang

In this paper, we consider instance-dependent PLL and assume that each example is associated with a latent label distribution constituted by a real-valued degree for each label, representing how well each label describes the feature.

Partial Label Learning Weakly-supervised Learning
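
One simple way to picture the latent label distribution described above is to normalize real-valued label scores over the candidate set; the scores, candidate mask, and softmax recovery below are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def candidate_softmax(scores, candidate_mask):
    """Hedged sketch: turn real-valued per-label scores into a distribution
    supported only on the candidate labels (a softmax restricted to them)."""
    z = np.where(candidate_mask, scores, -np.inf)   # rule out non-candidates
    z = z - z.max()                                 # numerical stability
    p = np.exp(z)
    return p / p.sum()

print(candidate_softmax(np.array([1.0, 0.2, -0.5, 0.8]),
                        np.array([True, False, False, True])))
```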

Prototypical Classifier for Robust Class-Imbalanced Learning

no code implementations22 Oct 2021 Tong Wei, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang

Deep neural networks have been shown to be very powerful methods for many supervised learning tasks.

Learning with noisy labels

Learning from Noisy Labels via Dynamic Loss Thresholding

no code implementations1 Apr 2021 Hao Yang, Youzhi Jin, Ziyin Li, Deng-Bao Wang, Lei Miao, Xin Geng, Min-Ling Zhang

During the training process, DLT records the loss value of each sample and calculates dynamic loss thresholds.
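
The recorded-loss-plus-threshold idea can be sketched as below; the EMA smoothing and the percentile rule are assumptions for illustration, not the paper's exact thresholding schedule.

```python
import numpy as np

class DynamicLossThresholdSketch:
    """Hedged sketch of dynamic loss thresholding: keep a smoothed record of
    each sample's loss and flag samples whose smoothed loss stays below a
    percentile-based threshold as likely clean."""

    def __init__(self, n_samples, momentum=0.9, percentile=70):
        self.ema = np.zeros(n_samples)      # per-sample smoothed loss record
        self.momentum = momentum
        self.percentile = percentile        # assumed threshold rule

    def update(self, idx, losses):
        self.ema[idx] = self.momentum * self.ema[idx] + (1 - self.momentum) * losses

    def clean_mask(self, idx):
        threshold = np.percentile(self.ema, self.percentile)   # dynamic threshold
        return self.ema[idx] <= threshold
```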

Semi-Supervised Partial Label Learning via Confidence-Rated Margin Maximization

no code implementations NeurIPS 2020 Wei Wang, Min-Ling Zhang

After that, a maximum margin formulation is introduced to jointly enable the induction of the predictive model and the estimation of labeling confidence over unlabeled data.

Partial Label Learning

Towards Mitigating the Class-Imbalance Problem for Partial Label Learning

1 code implementation 2018 Jing Wang, Min-Ling Zhang

Partial label (PL) learning aims to induce a multi-class classifier from training examples, each of which is associated with a set of candidate labels, among which only one is valid.

Partial Label Learning
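
The candidate-label setup defined above can be turned into the simplest "average disambiguation" targets as sketched below; the toy candidate sets are assumptions, and the class-imbalance-aware method of the paper is not reproduced here.

```python
import numpy as np

def uniform_candidate_targets(candidate_sets, num_classes):
    """Hedged sketch: uniform (average) disambiguation over each example's
    candidate set -- the naive baseline for partial-label data."""
    targets = np.zeros((len(candidate_sets), num_classes))
    for i, cand in enumerate(candidate_sets):
        targets[i, list(cand)] = 1.0 / len(cand)
    return targets

# three toy examples over 4 classes; exactly one candidate per row is valid
print(uniform_candidate_targets([{0, 2}, {1}, {1, 2, 3}], num_classes=4))
```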

LIFT: Multi-Label Learning with Label-Specific Features

1 code implementation International Joint Conferences on Artificial Intelligence 2014 Min-Ling Zhang, Lei Wu

Existing approaches learn from multi-label data by manipulating an identical feature set, i.e., the same instance representation of each example is employed in the discrimination processes of all class labels.

Clustering Multi-Label Learning
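
One way to realize label-specific features along the clustering route suggested by the tags is to cluster a label's positive and negative instances separately and re-represent every instance by its distances to those centers; the cluster counts, toy data, and distance mapping below are assumptions, not LIFT's exact construction.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hedged sketch: per-label features built from distances to cluster centers
# of the label's positive and negative instances, instead of reusing one
# identical feature set for every label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))   # toy instances
y = rng.random(200) < 0.3        # toy relevance of a single label

k = 3                            # assumed number of clusters per group
centers = np.vstack([
    KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[y]).cluster_centers_,
    KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[~y]).cluster_centers_,
])
label_specific = np.linalg.norm(X[:, None, :] - centers[None], axis=2)  # (200, 2k)
print(label_specific.shape)
```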
