Learning with coarse labels

4 papers with code • 4 benchmarks • 4 datasets

Learning fine-grained representations from a coarsely-labelled dataset, which can significantly reduce the labelling cost. As a simple example, for the task of differentiating between pets, distinguishing ‘British Shorthair’ from ‘Siamese’ requires a knowledgeable cat lover, but even a child annotator can discriminate between ‘cat’ and ‘non-cat’.
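In experiments, this setting is typically simulated by collapsing a dataset's fine labels into their coarse parent classes and training only on the coarse ones. A minimal sketch of that setup (the mapping and function names here are illustrative, not from any of the listed papers):

```python
# Hypothetical fine-to-coarse label mapping used to simulate coarse supervision.
FINE_TO_COARSE = {
    "british_shorthair": "cat",
    "siamese": "cat",
    "beagle": "dog",
    "poodle": "dog",
}

def coarsen(fine_labels):
    """Replace each fine-grained label with its coarse parent class."""
    return [FINE_TO_COARSE[label] for label in fine_labels]
```

The model then trains with `coarsen(...)` labels, while evaluation measures how well its learned representation separates the original fine classes.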

Most implemented papers

Weakly Supervised Representation Learning with Coarse Labels

idstcv/coins ICCV 2021

To mitigate this challenge, we propose an algorithm to learn the fine-grained patterns for the target task, when only its coarse-class labels are available.

Fine-grained Angular Contrastive Learning with Coarse Labels

guybuk/ANCOR CVPR 2021

A very practical example of C2FS is when the target classes are sub-classes of the training classes.

MaskCon: Masked Contrastive Learning for Coarse-Labelled Dataset

MrChenFeng/MaskCon_CVPR2023 CVPR 2023

More specifically, within the contrastive learning framework, for each sample our method generates soft labels with the aid of coarse labels, contrasting it against the other samples and against another augmented view of the sample in question.
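A rough sketch of that soft-label construction, simplified and with hypothetical function names (not the authors' implementation): candidates outside the anchor's coarse class are masked to zero weight, the augmented view is always kept as a positive, and the remaining similarities become a temperature-scaled softmax target.

```python
import numpy as np

def normalize(x):
    """L2-normalize feature vectors along the last axis."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def maskcon_soft_labels(anchor, aug_view, bank, coarse_anchor, coarse_bank, t=0.1):
    """Soft targets over [aug_view] + bank for one anchor sample.

    Candidates whose coarse label differs from the anchor's are masked out;
    the rest receive a temperature softmax over cosine similarities.
    (A simplified, hypothetical rendering of the MaskCon idea.)
    """
    cand = np.vstack([aug_view[None, :], bank])           # candidate set
    sims = normalize(cand) @ normalize(anchor)            # cosine similarities
    same = np.concatenate([[True], coarse_bank == coarse_anchor])
    logits = np.where(same, sims / t, -np.inf)            # mask other coarse classes
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                                # soft-label distribution
```

The resulting distribution can serve as the target for a cross-entropy-style contrastive loss on the other augmented view.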

Weakly supervised segmentation of intracranial aneurysms using a novel 3D focal modulation UNet

HealthX-Lab/HGI-SAM 6 Aug 2023

In the paper, we propose FocalSegNet, a novel 3D focal modulation UNet, to detect an aneurysm and offer an initial, coarse segmentation of it from time-of-flight MRA image patches, which is further refined with a dense conditional random field (CRF) post-processing layer to produce a final segmentation map.