Outlier Exposure with Confidence Control for Out-of-Distribution Detection

Deep neural networks have achieved great success in classification tasks in recent years. However, a major obstacle on the path toward artificial intelligence is the inability of neural networks to reliably detect samples from novel class distributions; consequently, most existing classification algorithms assume that all classes are known prior to the training stage. In this work, we propose a methodology for training a neural network so that it can efficiently detect out-of-distribution (OOD) examples without significantly compromising its classification accuracy on test examples from known classes. We propose a novel loss function that gives rise to a novel method, Outlier Exposure with Confidence Control (OECC), which achieves superior results in OOD detection with Outlier Exposure (OE) on both image and text classification tasks without requiring access to OOD samples. Additionally, we show experimentally that combining OECC with state-of-the-art post-training OOD detection methods, such as the Mahalanobis Detector (MD) and the Gramian Matrices (GM) method, further improves their performance on the OOD detection task, demonstrating the potential of combining training and post-training methods for OOD detection.
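The abstract describes training with an outlier-exposure-style objective: a standard classification loss on in-distribution data plus a term that pushes the network's softmax output on auxiliary outlier samples toward the uniform distribution. The sketch below is a generic, hypothetical illustration of that idea, not the paper's exact OECC loss; the function name, the L1 distance to uniform, and the `lam` weighting are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def oe_style_loss(logits_in, labels_in, logits_out, lam=0.5):
    """Generic outlier-exposure-style objective (illustrative only):
    cross-entropy on in-distribution samples plus a penalty on the
    L1 distance between outlier softmax outputs and the uniform
    distribution over the K known classes."""
    p_in = softmax(logits_in)
    ce = -np.mean(np.log(p_in[np.arange(len(labels_in)), labels_in] + 1e-12))
    K = logits_out.shape[1]
    p_out = softmax(logits_out)
    uniform_penalty = np.mean(np.abs(p_out - 1.0 / K))
    return ce + lam * uniform_penalty
```

With this kind of objective, confident (peaked) predictions on outlier inputs are penalized, while predictions near uniform on outliers add nothing beyond the in-distribution cross-entropy term.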


Results from the Paper


 Ranked #1 on Out-of-Distribution Detection on CIFAR-100 vs SVHN (using extra training data)

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank | Uses Extra Training Data |
|---|---|---|---|---|---|---|
| Out-of-Distribution Detection | 20 Newsgroups | 2-layer GRUs + OECC | AUROC | 99.18 | # 2 | |
| Out-of-Distribution Detection | CIFAR-10 | ResNet 34 + OECC+GM | AUROC | 99.7 | # 3 | |
| Out-of-Distribution Detection | CIFAR-100 | WRN 40-2 + OECC | FPR95 | 28.89 | # 2 | |
| Out-of-Distribution Detection | CIFAR-100 vs CIFAR-10 | WRN 40-2 + OECC | AUROC | 78.7 | # 11 | |
| Out-of-Distribution Detection | CIFAR-100 vs CIFAR-10 | WRN 40-2 + OECC | AUPR | 35.2 | # 5 | |
| Out-of-Distribution Detection | CIFAR-100 vs SVHN | OECC + MD | AUROC | 98.7 | # 1 | ✓ |
| Out-of-Distribution Detection | CIFAR-10 vs CIFAR-100 | Wide 40-2 + OECC | AUPR | 82.0 | # 5 | |
| Out-of-Distribution Detection | CIFAR-10 vs CIFAR-100 | Wide 40-2 + OECC | AUROC | 94.9 | # 8 | |
| Out-of-Distribution Detection | ImageNet dogs vs ImageNet non-dogs | ResNet 34 + OE | AUROC | 92.5 | # 2 | |
| Out-of-Distribution Detection | MS-1M vs. IJB-C | ResNeXt 50 + OE | AUROC | 52.6 | # 4 | |

Methods