Out of Distribution (OOD) Detection

231 papers with code • 3 benchmarks • 8 datasets

Out of Distribution (OOD) Detection is the task of detecting instances that do not belong to the distribution the classifier has been trained on. OOD data is often referred to as "unseen" data, as the model has not encountered it during training.

OOD detection is typically performed by scoring how likely each input is to come from the in-distribution (ID) data the model has seen during training, and flagging low-scoring inputs as OOD. This can be done with a variety of techniques, such as computing a post-hoc confidence score from the trained classifier itself, training a separate OOD detector, or modifying the model's architecture or loss function to make it more sensitive to OOD data.
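
As a concrete illustration, the sketch below shows one common post-hoc score, the maximum softmax probability (MSP): the classifier's top softmax confidence serves as an ID score, and inputs below a threshold are flagged as OOD. This is a minimal sketch assuming PyTorch and a hypothetical pretrained classifier called model; it is not tied to any particular paper listed here.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def msp_score(model: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Maximum softmax probability: higher values suggest in-distribution inputs."""
    logits = model(x)                      # (batch, num_classes)
    probs = F.softmax(logits, dim=-1)      # predicted class probabilities
    return probs.max(dim=-1).values

def is_ood(model: torch.nn.Module, x: torch.Tensor, threshold: float = 0.5) -> torch.Tensor:
    # Flag inputs whose confidence falls below a threshold chosen on validation data.
    return msp_score(model, x) < threshold
```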

Most implemented papers

On the Practicality of Deterministic Epistemic Uncertainty

google/uncertainty-baselines 1 Jul 2021

We find that, while DUMs (Deterministic Uncertainty Methods) scale to realistic vision tasks and perform well on OOD detection, the practicality of current methods is undermined by poor calibration under distributional shifts.

Semantically Coherent Out-of-Distribution Detection

Jingkang50/ICCV21_SCOOD ICCV 2021

The proposed UDG can not only enrich the semantic knowledge of the model by exploiting unlabeled data in an unsupervised manner, but also distinguish ID/OOD samples to enhance ID classification and OOD detection tasks simultaneously.

On the Out-of-distribution Generalization of Probabilistic Image Modelling

zmtomorrow/nelloc NeurIPS 2021

Out-of-distribution (OOD) detection and lossless compression are two problems that can be solved by training probabilistic models on a first dataset and subsequently evaluating likelihoods on a second dataset whose distribution differs.

Trustworthy Long-Tailed Classification

lblaoke/tlc CVPR 2022

To address these issues, we propose a Trustworthy Long-tailed Classification (TLC) method to jointly conduct classification and uncertainty estimation to identify hard samples in a multi-expert framework.

Back to the Basics: Revisiting Out-of-Distribution Detection Baselines

cleanlab/cleanlab 7 Jul 2022

We study simple methods for out-of-distribution (OOD) image detection that are compatible with any already trained classifier, relying on only its predictions or learned representations.
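
The snippet below is an illustrative sketch of one such representation-based score, not necessarily this paper's exact recipe: the distance from a test sample's embedding to its k-th nearest neighbour among the training embeddings, where a larger distance suggests an OOD input. The feature-extraction step and the choice of k are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def fit_knn_detector(train_features: np.ndarray, k: int = 10) -> NearestNeighbors:
    # train_features: (n_train, d) embeddings from an already trained classifier.
    return NearestNeighbors(n_neighbors=k).fit(train_features)

def knn_ood_score(detector: NearestNeighbors, test_features: np.ndarray) -> np.ndarray:
    # Distance to the k-th nearest training embedding; larger = more likely OOD.
    distances, _ = detector.kneighbors(test_features)
    return distances[:, -1]
```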

Fine-Tuning Deteriorates General Textual Out-of-Distribution Detection by Distorting Task-Agnostic Features

lancopku/gnome 30 Jan 2023

We find that: (1) no existing method performs well in both settings (semantic and non-semantic shifts); (2) fine-tuning PLMs on in-distribution data helps detect semantic shifts but severely harms the detection of non-semantic shifts, which can be attributed to the distortion of task-agnostic features.

Beyond AUROC & co. for evaluating out-of-distribution detection performance

glhr/beyond-auroc 26 Jun 2023

While there has been a growing research interest in developing out-of-distribution (OOD) detection methods, there has been comparably little discussion around how these methods should be evaluated.

Density-based Feasibility Learning with Normalizing Flows for Introspective Robotic Assembly

DLR-RM/GRACE 3 Jul 2023

Machine Learning (ML) models in Robotic Assembly Sequence Planning (RASP) need to be introspective about their predicted solutions, i.e., whether they are feasible or not, in order to circumvent potential efficiency degradation.

VI-OOD: A Unified Representation Learning Framework for Textual Out-of-distribution Detection

liam0949/llm-ood 9 Apr 2024

Out-of-distribution (OOD) detection plays a crucial role in ensuring the safety and reliability of deep neural networks in various applications.

Rethinking Out-of-Distribution Detection for Reinforcement Learning: Advancing Methods for Evaluation and Detection

linasnas/dexter 10 Apr 2024

In this paper, we study the problem of out-of-distribution (OOD) detection in RL, which focuses on identifying situations at test time that RL agents have not encountered in their training environments.