Out of Distribution (OOD) Detection

250 papers with code • 3 benchmarks • 9 datasets

Out of Distribution (OOD) Detection is the task of detecting instances that do not belong to the distribution on which the classifier was trained. OOD data is often referred to as "unseen" data, since the model has not encountered it during training.

OOD detection is typically performed by training a model to distinguish between in-distribution (ID) data, which the model has seen during training, and OOD data, which it has not. This can be done with a variety of techniques, such as training a separate OOD detector or modifying the model's architecture or loss function to make it more sensitive to OOD inputs.
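
The simplest such technique is a post-hoc score computed from a trained classifier's outputs: the maximum softmax probability (MSP) baseline flags inputs on which the model is least confident. A minimal sketch (the threshold of 0.5 is an illustrative assumption; in practice it is tuned on held-out data):

```python
import numpy as np

def msp_score(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability: a common post-hoc OOD score.
    High values suggest in-distribution; low values suggest OOD."""
    z = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)

def flag_ood(logits: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Flag inputs whose top-class confidence falls below the threshold."""
    return msp_score(logits) < threshold

# Peaked logits -> confident prediction; flat logits -> likely OOD.
confident = np.array([[8.0, 0.5, 0.2]])
ambiguous = np.array([[1.0, 0.9, 1.1]])
print(flag_ood(confident)[0], flag_ood(ambiguous)[0])  # False True
```

Many of the methods listed below refine this idea by replacing the softmax score with a statistic that separates ID from OOD inputs more reliably.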


Most implemented papers

Deep Anomaly Detection with Outlier Exposure

hendrycks/outlier-exposure ICLR 2019

We also analyze the flexibility and robustness of Outlier Exposure, and identify characteristics of the auxiliary dataset that improve performance.
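
Outlier Exposure trains the classifier on its usual objective plus a term that pushes predictions on an auxiliary outlier dataset toward the uniform distribution. A NumPy sketch of the combined loss (the weight `lam=0.5` follows the paper's default; the toy batches are illustrative assumptions):

```python
import numpy as np

def log_softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=-1, keepdims=True))

def outlier_exposure_loss(id_logits, id_labels, ood_logits, lam=0.5):
    """Cross-entropy on in-distribution data plus a term encouraging
    uniform (maximally uncertain) predictions on auxiliary outliers."""
    n = len(id_labels)
    ce = -log_softmax(id_logits)[np.arange(n), id_labels].mean()
    # Cross-entropy between the uniform distribution and the model's
    # prediction reduces to the negative mean log-probability.
    oe = -log_softmax(ood_logits).mean(axis=-1).mean()
    return ce + lam * oe

id_logits = np.array([[10.0, 0.0, 0.0]])   # confident, correct ID example
ood_logits = np.array([[0.0, 0.0, 0.0]])   # uniform prediction on an outlier
loss = outlier_exposure_loss(id_logits, np.array([0]), ood_logits)
```

When the model is already uniform on outliers and confident on ID data, the loss bottoms out near `lam * log(num_classes)`, the entropy of the uniform target.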

Detecting Out-of-Distribution Examples with In-distribution Examples and Gram Matrices

VectorInstitute/gram-ood-detection 28 Dec 2019

We find that characterizing activity patterns by Gram matrices and identifying anomalies in Gram matrix values can yield high OOD detection rates.
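
A simplified, single-layer sketch of this idea: record the element-wise range of Gram-matrix values over training data, then score a test input by how far its Gram values fall outside that range (normalizing by the bound magnitude, in the spirit of the paper's deviation measure; the toy activations are made up for illustration):

```python
import numpy as np

def gram(features):
    """features: (channels, spatial) activations from one layer."""
    return features @ features.T

def deviation(g, mins, maxs, eps=1e-8):
    """Total normalized deviation of Gram values outside training bounds."""
    below = np.clip(mins - g, 0, None) / (np.abs(mins) + eps)
    above = np.clip(g - maxs, 0, None) / (np.abs(maxs) + eps)
    return float((below + above).sum())

# Element-wise bounds from (a sample of) training activations.
train = [np.array([[1.0, 0.0], [0.0, 1.0]]),
         np.array([[2.0, 0.0], [0.0, 2.0]])]
grams = np.stack([gram(f) for f in train])
mins, maxs = grams.min(axis=0), grams.max(axis=0)

in_dist = gram(np.array([[1.5, 0.0], [0.0, 1.5]]))    # inside the bounds
ood     = gram(np.array([[10.0, 0.0], [0.0, 10.0]]))  # far outside
```

The full method aggregates such deviations over many layers and Gram orders; this sketch only shows the per-layer scoring step.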

Likelihood Ratios for Out-of-Distribution Detection

google-research/google-research NeurIPS 2019

We propose a likelihood ratio method for deep generative models which effectively corrects for these confounding background statistics.
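
The score is the difference between the input's log-likelihood under the full generative model and under a background model that captures only population-level statistics, so that the shared background term cancels. A toy 1-D illustration with Gaussian densities standing in for the two deep generative models (the parameters are arbitrary assumptions):

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def llr_score(x, full=(0.0, 1.0), background=(0.0, 3.0)):
    """log p_full(x) - log p_background(x): higher suggests in-distribution.
    The broad 'background' density stands in for a model of confounding
    statistics shared by ID and OOD inputs."""
    return gaussian_logpdf(x, *full) - gaussian_logpdf(x, *background)
```

For example, `llr_score(0.0)` is positive (the full model explains the point better than the background), while a far-away point such as `llr_score(5.0)` scores much lower.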

Improved Contrastive Divergence Training of Energy Based Models

yilundu/improved_contrastive_divergence 2 Dec 2020

Contrastive divergence is a popular method of training energy-based models, but is known to have difficulties with training stability.

Hierarchical VAEs Know What They Don't Know

vlievin/biva-pytorch 16 Feb 2021

Deep generative models have been demonstrated as state-of-the-art density estimators.

A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection

google/uncertainty-baselines 16 Jun 2021

Mahalanobis distance (MD) is a simple and popular post-processing method for detecting out-of-distribution (OOD) inputs in neural networks.
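
The proposed fix is the relative Mahalanobis distance: subtract from the usual minimum class-conditional distance the distance under a single class-agnostic Gaussian, discounting directions of variation that carry no class information. A sketch with toy 2-D feature statistics (the means and covariances are made up for illustration):

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(d @ cov_inv @ d)

def rmd_score(x, class_means, cov_inv, mean0, cov0_inv):
    """Relative Mahalanobis distance: min per-class MD minus the MD under
    one class-agnostic Gaussian. Lower values suggest in-distribution."""
    md_k = min(mahalanobis(x, m, cov_inv) for m in class_means)
    return md_k - mahalanobis(x, mean0, cov0_inv)

# Toy feature-space statistics for a two-class problem.
class_means = [np.array([-2.0, 0.0]), np.array([2.0, 0.0])]
cov_inv = np.eye(2)                # shared within-class covariance (inverse)
mean0 = np.array([0.0, 0.0])       # class-agnostic Gaussian
cov0_inv = np.diag([0.2, 1.0])     # wider spread along the class axis

id_point = np.array([2.0, 0.0])    # sits on a class mean
ood_point = np.array([0.0, 4.0])   # far from every class
```

In practice the class means and covariances are estimated from the network's penultimate-layer features; here they are fixed by hand to keep the example self-contained.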

Generalized Out-of-Distribution Detection: A Survey

jingkang50/openood 21 Oct 2021

In this survey, we first present a unified framework called generalized OOD detection, which encompasses the five aforementioned problems, i.e., anomaly detection (AD), novelty detection (ND), open set recognition (OSR), OOD detection, and outlier detection (OD).

OpenOOD: Benchmarking Generalized Out-of-Distribution Detection

jingkang50/openood 13 Oct 2022

Out-of-distribution (OOD) detection is vital to safety-critical machine learning applications and has thus been extensively studied, with a plethora of methods developed in the literature.

SSD: A Unified Framework for Self-Supervised Outlier Detection

inspire-group/SSD ICLR 2021

We demonstrate that SSD outperforms most existing detectors based on unlabeled data by a large margin.

MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks

ENSTA-U2IS-AI/torch-uncertainty 2 Mar 2022

However, disentangling the different types and sources of uncertainty is non-trivial for most datasets, especially since there is no ground truth for uncertainty.