Out of Distribution (OOD) Detection

224 papers with code • 3 benchmarks • 8 datasets

Out of Distribution (OOD) Detection is the task of detecting instances that do not belong to the distribution the classifier has been trained on. OOD data is often referred to as "unseen" data, as the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. This can be done with a variety of techniques, such as thresholding a confidence score produced by the classifier itself (a minimal baseline is sketched below), training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD data.
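
A minimal sketch of the simplest score-based approach, the maximum softmax probability (MSP) baseline; the `model`, `classifier`, `batch`, and `threshold` names are placeholders, and most of the papers listed below refine this idea with better scores, losses, or training data.

```python
import torch
import torch.nn.functional as F

def msp_ood_score(model, x):
    """Maximum Softmax Probability (MSP) baseline: the lower the classifier's
    top softmax probability, the more likely the input is out-of-distribution."""
    model.eval()
    with torch.no_grad():
        logits = model(x)                  # (batch, num_classes)
        probs = F.softmax(logits, dim=-1)
        msp, _ = probs.max(dim=-1)         # top confidence per sample
    return -msp                            # higher score => more likely OOD

# Usage sketch: flag samples whose score exceeds a threshold chosen on
# held-out in-distribution data (e.g., its 95th percentile).
# is_ood = msp_ood_score(classifier, batch) > threshold
```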

Most implemented papers

Out of Distribution Detection via Neural Network Anchoring

llnl/amp 8 Jul 2022

Our goal in this paper is to exploit heteroscedastic temperature scaling as a calibration strategy for out of distribution (OOD) detection.
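
For context, a minimal sketch of plain temperature scaling of logits used as an OOD confidence score; the paper's heteroscedastic, anchoring-based variant is more involved, and the temperature value here is an arbitrary placeholder.

```python
import torch
import torch.nn.functional as F

def temperature_scaled_confidence(logits, temperature=1000.0):
    """Divide logits by a temperature before the softmax and use the maximum
    probability as a confidence score; a large temperature spreads the softmax
    and often separates ID from OOD inputs better than raw confidence."""
    probs = F.softmax(logits / temperature, dim=-1)
    return probs.max(dim=-1).values  # low confidence => likely OOD
```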

A Multi-Head Model for Continual Learning via Out-of-Distribution Replay

k-gyuhak/more 20 Aug 2022

Instead of using the saved samples in memory to update the network for previous tasks/classes in the existing approach, MORE leverages the saved samples to build a task specific classifier (adding a new classification head) without updating the network learned for previous tasks/classes.
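
A hedged sketch of the general multi-head pattern described above (one classification head per task on a shared, fixed network); MORE's actual training recipe, which also uses the replayed samples as out-of-distribution data for each head, is not reproduced here.

```python
import torch
import torch.nn as nn

class MultiHeadClassifier(nn.Module):
    """One classification head per task on top of a shared feature extractor.
    Adding a task adds a head; the backbone and earlier heads stay untouched."""

    def __init__(self, backbone, feature_dim):
        super().__init__()
        self.backbone = backbone          # shared network learned on earlier tasks
        self.feature_dim = feature_dim
        self.heads = nn.ModuleList()

    def add_task_head(self, num_classes):
        # Train only this head on the new task's data (and replayed samples);
        # previously learned parameters are not updated.
        head = nn.Linear(self.feature_dim, num_classes)
        self.heads.append(head)
        return head

    def forward(self, x, task_id):
        with torch.no_grad():             # keep the shared network fixed
            features = self.backbone(x)
        return self.heads[task_id](features)
```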

Input complexity and out-of-distribution detection with likelihood-based generative models

anonconfsubaccount/tilted_prior ICLR 2020

Likelihood-based generative models are a promising resource to detect out-of-distribution (OOD) inputs which could compromise the robustness or reliability of a machine learning system.
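
A hedged sketch of one way such likelihoods are combined with an input-complexity estimate for OOD scoring: the model's negative log-likelihood (in bits) is adjusted by a generic compressor's code length, roughly a log likelihood-ratio test; the use of gzip and the sign convention are assumptions, not this paper's exact procedure.

```python
import gzip
import numpy as np

def complexity_adjusted_score(nll_bits: float, x_uint8: np.ndarray) -> float:
    """Adjust the model's negative log-likelihood (in bits) by an input-
    complexity estimate, here the gzip-compressed length in bits, so that
    visually simple inputs are not mistaken for in-distribution data just
    because any model assigns them high likelihood."""
    compressed_bits = 8 * len(gzip.compress(np.ascontiguousarray(x_uint8).tobytes()))
    # Roughly a log likelihood-ratio against a generic compressor:
    # larger values suggest the input is more likely out-of-distribution.
    return nll_bits - compressed_bits
```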

Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data

sayakpaul/Generalized-ODIN-TF CVPR 2020

Deep neural networks have attained remarkable performance when applied to data that comes from the same distribution as that of the training set, but can significantly degrade otherwise.

Entropic Out-of-Distribution Detection: Seamless Detection of Unknown Examples

dlmacedo/entropic-out-of-distribution-detection 7 Jun 2020

In this paper, we argue that the unsatisfactory out-of-distribution (OOD) detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss and its propensity to produce low-entropy probability distributions, in disagreement with the principle of maximum entropy.
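
The paper's contribution is a training-time loss that replaces the SoftMax loss; the sketch below only illustrates the related inference-time idea of using softmax entropy as an OOD score, not the full method.

```python
import torch
import torch.nn.functional as F

def softmax_entropy_score(logits, eps=1e-12):
    """Use the entropy of the softmax output as an OOD score: in-distribution
    inputs tend to produce low-entropy (confident) predictions, while OOD
    inputs tend to produce higher-entropy ones."""
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * (probs + eps).log()).sum(dim=-1)
    return entropy  # higher entropy => more likely OOD
```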

Can Autonomous Vehicles Identify, Recover From, and Adapt to Distribution Shifts?

OATML/oatomobile ICML 2020

Out-of-training-distribution (OOD) scenarios are a common challenge for learning agents at deployment, typically leading to arbitrary deductions and poorly-informed decisions.

Certifiably Adversarially Robust Detection of Out-of-Distribution Data

Bitterwolf/GOOD NeurIPS 2020

Deep neural networks are known to be overconfident when applied to out-of-distribution (OOD) inputs which clearly do not belong to any class.

Out-of-Distribution Detection Using Union of 1-Dimensional Subspaces

zaeemzadeh/OOD CVPR 2021

In this paper, we argue that OOD samples can be detected more easily if the training data is embedded into a low-dimensional space, such that the embedded training samples lie on a union of 1-dimensional subspaces.
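
A hedged sketch of scoring a test feature against per-class 1-dimensional subspaces; `class_directions` is assumed to hold one unit vector per class (for example, the leading principal direction of that class's training features), which simplifies the paper's full procedure.

```python
import torch

def subspace_cosine_score(feature, class_directions):
    """Score a test feature by its largest absolute cosine similarity to a set
    of per-class unit vectors (1-dimensional subspaces); features that are
    nearly orthogonal to every class direction are treated as OOD."""
    feature = feature / feature.norm()
    cosines = class_directions @ feature      # (num_classes,)
    return cosines.abs().max()                # low value => likely OOD

# class_directions has shape (num_classes, feature_dim), one unit vector per
# class, e.g. estimated from that class's training features.
```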

Semantically Coherent Out-of-Distribution Detection

Jingkang50/ICCV21_SCOOD ICCV 2021

The proposed UDG can not only enrich the semantic knowledge of the model by exploiting unlabeled data in an unsupervised manner, but also distinguish ID/OOD samples to enhance ID classification and OOD detection tasks simultaneously.

On the Out-of-distribution Generalization of Probabilistic Image Modelling

zmtomorrow/nelloc NeurIPS 2021

Out-of-distribution (OOD) detection and lossless compression constitute two problems that can be solved by the training of probabilistic models on a first dataset with subsequent likelihood evaluation on a second dataset, where data distributions differ.
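
A hedged sketch of the shared machinery: train a density model on one dataset, then evaluate its likelihood (here as bits per dimension) on another, with a threshold chosen on in-distribution data used to flag OOD inputs; the `log_prob` interface is an assumed API for the density model, not this paper's code.

```python
import math
import torch

def bits_per_dim(model, x):
    """Evaluate a trained density model's negative log-likelihood on new data,
    expressed in bits per dimension; samples whose value exceeds a threshold
    fit on in-distribution data can be flagged as OOD."""
    with torch.no_grad():
        nll_nats = -model.log_prob(x)          # assumed log_prob interface
    num_dims = x[0].numel()                    # dimensions per sample
    return nll_nats / (num_dims * math.log(2.0))
```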