FOOD: Fast Out-Of-Distribution Detector

16 Aug 2020 · Guy Amit, Moshe Levy, Ishai Rosenberg, Asaf Shabtai, Yuval Elovici

Deep neural networks (DNNs) perform well at classifying inputs associated with the classes they have been trained on, known as in-distribution inputs. However, out-of-distribution (OOD) inputs pose a great challenge to DNNs and consequently represent a major risk when DNNs are deployed in safety-critical systems. Extensive research has been performed in the domain of OOD detection. However, current state-of-the-art methods for OOD detection suffer from at least one of the following limitations: (1) increased inference time, which limits existing methods' applicability to many real-world applications, and (2) the need for OOD training data, which can be difficult to acquire and may not be representative enough, thus limiting the OOD detector's ability to generalize. In this paper, we propose FOOD -- Fast Out-Of-Distribution detector -- an extended DNN classifier capable of efficiently detecting OOD samples with minimal inference time overhead. Our architecture features a DNN with a final Gaussian layer combined with a log likelihood ratio statistical test and an additional output neuron for OOD detection. Instead of using real OOD data, we use a novel method to craft artificial OOD samples from in-distribution data, which are used to train our OOD detector neuron. We evaluate FOOD's detection performance on the SVHN, CIFAR-10, and CIFAR-100 datasets. Our results demonstrate that in addition to achieving state-of-the-art performance, FOOD is fast and applicable to real-world applications.
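The abstract only outlines the architecture, so the following is a minimal, hypothetical PyTorch sketch rather than the authors' implementation: the names `GaussianLayer`, `ood_score`, `backbone`, and `threshold` are assumptions, the per-class Gaussians are taken to be diagonal, and the likelihood ratio is illustrated by contrasting the top two class log-likelihoods.

```python
import torch
import torch.nn as nn


class GaussianLayer(nn.Module):
    """Hypothetical final layer: one learned Gaussian per class.

    Scores a penultimate feature vector by its log-likelihood under each
    class Gaussian (diagonal covariance assumed for this sketch).
    """

    def __init__(self, feature_dim: int, num_classes: int):
        super().__init__()
        self.means = nn.Parameter(torch.randn(num_classes, feature_dim))
        self.log_vars = nn.Parameter(torch.zeros(num_classes, feature_dim))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (B, D) -> per-class log-likelihoods (up to a constant): (B, C)
        diff = features.unsqueeze(1) - self.means.unsqueeze(0)   # (B, C, D)
        inv_var = torch.exp(-self.log_vars).unsqueeze(0)         # (1, C, D)
        log_det = self.log_vars.sum(dim=1).unsqueeze(0)          # (1, C)
        mahalanobis = (diff * diff * inv_var).sum(dim=2)         # (B, C)
        return -0.5 * (mahalanobis + log_det)


def ood_score(log_likelihoods: torch.Tensor) -> torch.Tensor:
    """Illustrative log-likelihood-ratio score: best class vs. runner-up.

    Higher values suggest in-distribution; a threshold calibrated on
    in-distribution data flags OOD samples.
    """
    top2 = log_likelihoods.topk(2, dim=1).values
    return top2[:, 0] - top2[:, 1]


# Usage sketch (assumed backbone producing (B, D) penultimate features):
#   features = backbone(x)
#   scores = ood_score(gaussian_layer(features))
#   is_ood = scores < threshold
```

In the paper, the threshold-based test is further replaced by an additional output neuron trained on artificially crafted OOD samples, which is what keeps inference overhead minimal; the sketch above only illustrates the Gaussian-layer scoring idea.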

