OpenMatch: Open-Set Semi-supervised Learning with Open-set Consistency Regularization

NeurIPS 2021 · Kuniaki Saito, Donghyun Kim, Kate Saenko

Semi-supervised learning (SSL) is an effective means to leverage unlabeled data to improve a model's performance. Typical SSL methods like FixMatch assume that labeled and unlabeled data share the same label space. However, in practice, unlabeled data can contain categories unseen in the labeled set, i.e., outliers, which can significantly harm the performance of SSL algorithms. To address this problem, we propose a novel Open-set Semi-Supervised Learning (OSSL) approach called OpenMatch. Learning representations of inliers while rejecting outliers is essential for the success of OSSL. To this end, OpenMatch unifies FixMatch with novelty detection based on one-vs-all (OVA) classifiers. The OVA-classifier outputs the confidence score of a sample being an inlier, providing a threshold to detect outliers. Another key contribution is an open-set soft-consistency regularization loss, which enhances the smoothness of the OVA-classifier with respect to input transformations and greatly improves outlier detection. OpenMatch achieves state-of-the-art performance on three datasets, and even outperforms a fully supervised model in detecting outliers unseen in unlabeled data on CIFAR10. The code is available at https://github.com/VisionLearningGroup/OP_Match.
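To make the two core ideas concrete, below is a minimal PyTorch-style sketch (not the authors' exact implementation) of how an OVA head can be used to score outliers and how a soft consistency term can be applied between two augmented views. The tensor layout (`ova_logits` of shape (B, 2, K), one inlier/outlier pair per known class) and the function names `outlier_score` and `open_set_soft_consistency` are illustrative assumptions, not identifiers from the released repository.

```python
import torch
import torch.nn.functional as F

# Sketch under assumed shapes:
#   ova_logits:    (B, 2, K)  -> one (inlier, outlier) logit pair per known class
#   closed_logits: (B, K)     -> standard closed-set classifier logits

def ova_probs(ova_logits):
    """Per-class (inlier, outlier) probabilities via a 2-way softmax."""
    return F.softmax(ova_logits, dim=1)

def outlier_score(ova_logits, closed_logits):
    """Probability that a sample is an outlier w.r.t. its predicted known class.
    Thresholding this score separates inliers from outliers."""
    probs = ova_probs(ova_logits)                  # (B, 2, K)
    pred = closed_logits.argmax(dim=1)             # predicted known class, (B,)
    idx = torch.arange(pred.size(0), device=pred.device)
    return probs[idx, 1, pred]                     # p(outlier | predicted class)

def open_set_soft_consistency(ova_logits_a, ova_logits_b):
    """Soft consistency: encourage OVA probabilities to agree across two
    augmented views of the same unlabeled image (squared difference)."""
    p_a = ova_probs(ova_logits_a)
    p_b = ova_probs(ova_logits_b)
    return ((p_a - p_b) ** 2).sum(dim=1).mean()
```

In this sketch the consistency term smooths the OVA outputs with respect to input transformations, which is the role the abstract attributes to the open-set soft-consistency regularization loss; the exact augmentation pairing and weighting in the paper may differ.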

Task                                  | Dataset                             | Model     | Accuracy | Global Rank
Semi-Supervised Image Classification | CIFAR-10, 50 Labels (OpenSet, 6/4)  | OpenMatch | 89.6     | #2
Semi-Supervised Image Classification | CIFAR-10, 100 Labels (OpenSet, 6/4) | OpenMatch | 92.9     | #2
Semi-Supervised Image Classification | CIFAR-10, 400 Labels (OpenSet, 6/4) | OpenMatch | 94.1     | #2
