Multi-Complementary and Unlabeled Learning for Arbitrary Losses and Models

13 Jan 2020 · Yuzhou Cao, Shuqi Liu, Yitian Xu

A weakly-supervised learning framework named complementary-label learning has been proposed recently, where each sample is equipped with a single complementary label that denotes one of the classes the sample does not belong to. However, existing complementary-label learning methods cannot learn from the easily accessible unlabeled samples, nor from samples with multiple complementary labels, which are more informative...
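
To make the labeling protocol concrete, here is a minimal Python sketch (not the authors' code) of how complementary labels could be generated for a sample whose true class is known: one or several classes that the sample does not belong to are drawn at random. The uniform-sampling assumption and the function name sample_complementary_labels are ours, used purely for illustration; num_complementary=1 corresponds to the standard single-complementary-label setting, while larger values correspond to the multi-complementary setting discussed in the abstract.

import numpy as np

def sample_complementary_labels(true_label, num_classes, num_complementary=1, rng=None):
    """Draw `num_complementary` distinct classes the sample does NOT belong to.

    Assumption: complementary labels are sampled uniformly from the
    (num_classes - 1) incorrect classes.
    """
    rng = np.random.default_rng() if rng is None else rng
    candidates = [k for k in range(num_classes) if k != true_label]
    return rng.choice(candidates, size=num_complementary, replace=False)

# Example: a 10-class problem where the true class is 3.
comp = sample_complementary_labels(true_label=3, num_classes=10, num_complementary=2)
print(comp)  # e.g. [7 0] -- two classes the sample is known not to belong to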


Code

No code implementations yet.
TASK                  DATASET          MODEL                  METRIC    METRIC VALUE  GLOBAL RANK
Image Classification  Kuzushiji-MNIST  linear/flexible model  Accuracy  79.90         #10
Image Classification  Kuzushiji-MNIST  FWD                    Accuracy  79.5          #11

Methods used in the Paper

No methods listed.