PANDA: Adapting Pretrained Features for Anomaly Detection and Segmentation

CVPR 2021  ·  Tal Reiss, Niv Cohen, Liron Bergman, Yedid Hoshen ·

Anomaly detection methods require high-quality features. In recent years, the anomaly detection community has attempted to obtain better features using advances in deep self-supervised feature learning. Surprisingly, a very promising direction, using pretrained deep features, has been mostly overlooked. In this paper, we first empirically establish the perhaps expected but previously unreported result that combining pretrained features with simple anomaly detection and segmentation methods convincingly outperforms much more complex state-of-the-art methods. To obtain further performance gains in anomaly detection, we adapt the pretrained features to the target distribution. Although transfer learning methods are well established in multi-class classification problems, the one-class classification (OCC) setting is not as well explored. It turns out that naive adaptation methods, which typically work well in supervised learning, often result in catastrophic collapse (feature deterioration) and reduce performance in OCC settings. A popular OCC method, DeepSVDD, advocates using specialized architectures, but this limits the adaptation performance gain. We propose two methods for combating collapse: (i) a variant of early stopping that dynamically learns the stopping iteration, and (ii) elastic regularization inspired by continual learning. Our method, PANDA, outperforms the state of the art in the OCC, outlier exposure, and anomaly segmentation settings by large margins.
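The "pretrained features plus a simple detector" recipe the abstract refers to can be as simple as a k-nearest-neighbor distance in feature space: a test sample is anomalous if it lies far from the features of the normal training set. Below is a minimal numpy sketch of that idea (illustrative only; it assumes features have already been extracted, e.g. by a pretrained ResNet, and uses plain L2 distance):

```python
import numpy as np

def knn_anomaly_score(train_feats, test_feats, k=2):
    """Score each test sample by its mean L2 distance to the k nearest
    training (normal) features. Higher score = more anomalous.

    train_feats: (n_train, d) features of normal data
    test_feats:  (n_test, d) features to score
    """
    # Pairwise L2 distances, shape (n_test, n_train)
    dists = np.linalg.norm(
        test_feats[:, None, :] - train_feats[None, :, :], axis=-1
    )
    # Average of the k smallest distances per test sample
    k_nearest = np.sort(dists, axis=1)[:, :k]
    return k_nearest.mean(axis=1)
```

With good pretrained features, normal test samples land near the training cluster and receive low scores, while anomalies land far away and receive high ones; thresholding the score yields the detector.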

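The elastic regularization mentioned in the abstract is inspired by continual-learning methods such as EWC: during adaptation, each weight is pulled back toward its pretrained value, with the pull strength scaled by an importance estimate (e.g. Fisher information), which counteracts catastrophic collapse. A minimal numpy sketch of the mechanism follows; the function names, toy task gradient, and hyperparameters are illustrative, not the paper's implementation:

```python
import numpy as np

def adapt_with_elastic_reg(w_pre, grad_task, fisher, lam=5.0, lr=0.1, steps=500):
    """Toy EWC-style adaptation by gradient descent.

    Each step follows the task gradient plus an elastic term
    lam * fisher * (w - w_pre) that penalizes drifting away from the
    pretrained weights, per-parameter, by estimated importance.

    w_pre:     pretrained weight vector
    grad_task: callable returning the task-loss gradient at w
    fisher:    per-parameter importance estimate (e.g. Fisher diagonal)
    """
    w = w_pre.copy()
    for _ in range(steps):
        g = grad_task(w) + lam * fisher * (w - w_pre)  # task pull + elastic pull
        w = w - lr * g
    return w
```

Weights with high importance stay close to their pretrained values (preserving the feature quality), while unimportant weights are free to adapt to the target distribution.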

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Anomaly Detection | Cats-and-Dogs | Self-Supervised One-class SVM, RBF kernel | ROC AUC | 51.7 | #3 |
| Anomaly Detection | Cats-and-Dogs | PANDA | ROC AUC | 97.3 | #1 |
| Anomaly Detection | Cats-and-Dogs | PANDA-OE | ROC AUC | 94.5 | #2 |
| Anomaly Detection | Cats-and-Dogs | Self-Supervised DeepSVDD | ROC AUC | 50.5 | #4 |
| Anomaly Detection | DIOR | Self-Supervised DeepSVDD | ROC AUC | 70 | #4 |
| Anomaly Detection | DIOR | Self-Supervised One-class SVM, RBF kernel | ROC AUC | 70.7 | #3 |
| Anomaly Detection | DIOR | PANDA | ROC AUC | 94.3 | #2 |
| Anomaly Detection | DIOR | PANDA-OE | ROC AUC | 95.9 | #1 |
| Anomaly Detection | Fashion-MNIST | Self-Supervised DeepSVDD | ROC AUC | 84.8 | #11 |
| Anomaly Detection | Fashion-MNIST | PANDA-OE | ROC AUC | 91.8 | #10 |
| Anomaly Detection | Fashion-MNIST | PANDA | ROC AUC | 95.6 | #2 |
| Anomaly Detection | Fashion-MNIST | Self-Supervised One-class SVM, RBF kernel | ROC AUC | 92.8 | #6 |
| Anomaly Detection | Hyper-Kvasir Dataset | PANDA | AUC | 0.937 | #3 |
| Anomaly Detection | One-class CIFAR-10 | Self-Supervised DeepSVDD | AUROC | 64.8 | #32 |
| Anomaly Detection | One-class CIFAR-10 | Self-Supervised One-class SVM, RBF kernel | AUROC | 64.7 | #33 |
| Anomaly Detection | One-class CIFAR-10 | PANDA-OE | AUROC | 98.9 | #3 |
| Anomaly Detection | One-class CIFAR-10 | PANDA | AUROC | 96.2 | #8 |
| Anomaly Detection | One-class CIFAR-100 | PANDA-OE | AUROC | 97.3 | #2 |
| Anomaly Detection | One-class CIFAR-100 | Self-Supervised Multi-Head RotNet | AUROC | 80.1 | #11 |
| Anomaly Detection | One-class CIFAR-100 | Self-Supervised DeepSVDD | AUROC | 67 | #13 |
| Anomaly Detection | One-class CIFAR-100 | Self-Supervised One-class SVM, RBF kernel | AUROC | 62.6 | #14 |
| Anomaly Detection | One-class CIFAR-100 | PANDA | AUROC | 94.1 | #4 |

Methods