Search Results for author: Shikun Li

Found 7 papers, 6 papers with code

M3D: Dataset Condensation by Minimizing Maximum Mean Discrepancy

3 code implementations • 26 Dec 2023 • Hansong Zhang, Shikun Li, Pengju Wang, Dan Zeng, Shiming Ge

Optimization-oriented methods are currently the primary approach to dataset condensation for achieving SOTA results; M3D instead minimizes the Maximum Mean Discrepancy between the real and the condensed data (a minimal MMD sketch follows below).

Dataset Condensation
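For readers unfamiliar with the objective named in the title, here is a minimal NumPy sketch of the biased empirical squared Maximum Mean Discrepancy with an RBF kernel. It is an illustration only: the function names and the fixed bandwidth `sigma` are hypothetical choices, and this is not the paper's M3D objective.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel values between the rows of a and b.
    d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(real_feats, synth_feats, sigma=1.0):
    # Biased empirical estimate of squared MMD between two feature sets:
    # mean k(real, real) + mean k(synth, synth) - 2 * mean k(real, synth).
    k_rr = rbf_kernel(real_feats, real_feats, sigma).mean()
    k_ss = rbf_kernel(synth_feats, synth_feats, sigma).mean()
    k_rs = rbf_kernel(real_feats, synth_feats, sigma).mean()
    return k_rr + k_ss - 2.0 * k_rs
```

Minimizing a quantity of this kind over the synthetic set, with features taken from a network, is the general idea behind distribution-matching condensation.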

Coupled Confusion Correction: Learning from Crowds with Sparse Annotations

2 code implementations • 12 Dec 2023 • Hansong Zhang, Shikun Li, Dan Zeng, Chenggang Yan, Shiming Ge

Moreover, we cluster the "annotator groups" who share similar expertise so that their confusion matrices can be corrected together (a clustering sketch follows below).
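One way to read that clustering step, as a hypothetical sketch rather than the paper's Coupled Confusion Correction algorithm: flatten each annotator's estimated confusion matrix, cluster annotators with similar matrices, and share statistics within a group so that annotators with sparse annotations borrow strength from each other. The helper name and the use of k-means are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def group_and_average_confusions(confusions, n_groups=3, seed=0):
    # confusions: (n_annotators, n_classes, n_classes) estimated confusion matrices.
    # Cluster annotators by their flattened matrices, then replace each matrix
    # with its group average as a crude stand-in for joint correction.
    n_annotators = confusions.shape[0]
    flat = confusions.reshape(n_annotators, -1)
    groups = KMeans(n_clusters=n_groups, n_init=10, random_state=seed).fit_predict(flat)
    corrected = np.empty_like(confusions)
    for g in range(n_groups):
        corrected[groups == g] = confusions[groups == g].mean(axis=0)
    return groups, corrected
```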

Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm

1 code implementation • 22 Sep 2023 • Shikun Li, Xiaobo Xia, Hansong Zhang, Shiming Ge, Tongliang Liu

However, estimating multi-label noise transition matrices remains a challenging task, as most existing estimators in noisy multi-class learning rely on anchor points and accurate fitting of noisy class posteriors, which is hard to satisfy in noisy multi-label learning.

Multi-Label Learning
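For context, the "anchor points" mentioned above refer to the classical multi-class estimator that reads each row of the noise transition matrix off the fitted noisy posterior at an example that almost surely belongs to that class. A minimal sketch of that baseline (not the paper's multi-label estimator) might look like:

```python
import numpy as np

def estimate_transition_matrix(noisy_posteriors):
    # noisy_posteriors: (n_samples, n_classes) fitted P(noisy label | x).
    # For each clean class i, take the sample with the highest posterior for
    # class i as its anchor point; row i of T is that sample's posterior
    # over the noisy classes.
    n_classes = noisy_posteriors.shape[1]
    T = np.zeros((n_classes, n_classes))
    for i in range(n_classes):
        anchor = np.argmax(noisy_posteriors[:, i])
        T[i] = noisy_posteriors[anchor]
    return T
```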

Transferring Annotator- and Instance-dependent Transition Matrix for Learning from Crowds

1 code implementation • 5 Jun 2023 • Shikun Li, Xiaobo Xia, Jiankang Deng, Shiming Ge, Tongliang Liu

In real-world crowd-sourcing scenarios, noise transition matrices are both annotator- and instance-dependent.

Transfer Learning

Trustable Co-label Learning from Multiple Noisy Annotators

1 code implementation • 8 Mar 2022 • Shikun Li, Tongliang Liu, Jiyong Tan, Dan Zeng, Shiming Ge

This raises the following important question: how can we effectively use a small amount of trusted data to facilitate robust classifier learning from multiple annotators?

Selective-Supervised Contrastive Learning with Noisy Labels

1 code implementation • CVPR 2022 • Shikun Li, Xiaobo Xia, Shiming Ge, Tongliang Liu

In the selection process, by measuring the agreement between learned representations and given labels, we first identify confident examples that are exploited to build confident pairs.

Contrastive Learning • Learning with noisy labels • +1
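A hedged sketch of what such a selection step can look like: an example counts as confident when the majority label among its nearest neighbours in representation space agrees with its given label, and confident pairs are then formed from confident examples that share a label. The k-NN agreement rule below is an assumption for illustration, not the exact criterion of the CVPR 2022 paper.

```python
import numpy as np

def select_confident(features, labels, k=10):
    # features: (n, d) learned representations; labels: (n,) given, possibly noisy, integer labels.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ feats.T                       # cosine similarities
    np.fill_diagonal(sims, -np.inf)              # exclude self-matches
    neighbours = np.argsort(-sims, axis=1)[:, :k]
    confident = []
    for i, nbrs in enumerate(neighbours):
        votes = np.bincount(labels[nbrs], minlength=labels.max() + 1)
        if votes.argmax() == labels[i]:          # neighbourhood agrees with the given label
            confident.append(i)
    return np.array(confident)
```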

Student Network Learning via Evolutionary Knowledge Distillation

no code implementations • 23 Mar 2021 • Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge

Inspired by that, we propose an evolutionary knowledge distillation approach to improve the transfer effectiveness of teacher knowledge.

Knowledge Distillation • Transfer Learning
