OOD Detection

122 papers with code • 0 benchmarks • 2 datasets

Out-of-distribution (OOD) detection: identifying instances that do not belong to the distribution on which the classifier was trained.


Most implemented papers

Deep Anomaly Detection with Outlier Exposure

hendrycks/outlier-exposure ICLR 2019

We also analyze the flexibility and robustness of Outlier Exposure, and identify characteristics of the auxiliary dataset that improve performance.
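The Outlier Exposure objective itself is compact: standard cross-entropy on in-distribution data plus a term pushing predictions on auxiliary outlier data toward the uniform distribution. A minimal NumPy sketch, assuming logits are given (the `lam` weight and function names are illustrative, not from the paper's code):

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def outlier_exposure_loss(logits_in, labels_in, logits_out, lam=0.5):
    """Sketch of an Outlier-Exposure-style objective: cross-entropy on
    in-distribution samples plus cross-entropy to the uniform distribution
    on auxiliary outlier samples."""
    labels_in = np.asarray(labels_in)
    p_in = softmax(logits_in)
    ce = -np.mean(np.log(p_in[np.arange(len(labels_in)), labels_in] + 1e-12))
    p_out = softmax(logits_out)
    k = logits_out.shape[-1]
    # cross-entropy to uniform: -(1/K) * sum_k log p_k, averaged over samples
    uniform_ce = -np.mean(np.sum(np.log(p_out + 1e-12), axis=-1) / k)
    return ce + lam * uniform_ce
```

Confident predictions on outlier inputs raise the second term, so the model learns to be uncertain off-distribution.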

Likelihood Ratios for Out-of-Distribution Detection

google-research/google-research NeurIPS 2019

We propose a likelihood ratio method for deep generative models which effectively corrects for these confounding background statistics.
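The resulting score is just a difference of log-likelihoods: one from the full generative model and one from a "background" model trained on perturbed inputs. A trivial sketch, with both log-likelihoods taken as given (function name is mine):

```python
import numpy as np

def likelihood_ratio_score(logp_full, logp_background):
    """LLR(x) = log p_full(x) - log p_background(x).
    The background model captures population-level background statistics,
    so subtracting its log-likelihood cancels their confounding effect;
    higher scores indicate more in-distribution inputs."""
    return np.asarray(logp_full, dtype=float) - np.asarray(logp_background, dtype=float)
```

In the paper the background model is a second deep generative model trained on noise-perturbed inputs; fitting both models is omitted here.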

Detecting Out-of-Distribution Examples with In-distribution Examples and Gram Matrices

VectorInstitute/gram-ood-detection 28 Dec 2019

We find that characterizing activity patterns by Gram matrices and identifying anomalies in Gram-matrix values can yield high OOD detection rates.
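The core recipe can be sketched as: compute the Gram matrix of a layer's activations, record per-entry min/max bounds over the training set, and score a test input by how far its Gram entries fall outside those bounds. This is a simplified single-layer, first-order version (the paper also uses higher-order Gram matrices and aggregates deviations across layers):

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, spatial) activation map from one layer
    return features @ features.T

def fit_bounds(train_feature_maps):
    """Per-entry min/max of Gram values over in-distribution training data."""
    grams = np.stack([gram_matrix(f) for f in train_feature_maps])
    return grams.min(axis=0), grams.max(axis=0)

def deviation(features, lo, hi, eps=1e-6):
    """Total normalized amount by which Gram entries exceed the training
    bounds; 0 for inputs whose Gram values stay inside the observed range."""
    g = gram_matrix(features)
    below = np.clip(lo - g, 0.0, None) / (np.abs(lo) + eps)
    above = np.clip(g - hi, 0.0, None) / (np.abs(hi) + eps)
    return (below + above).sum()
```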

Hierarchical VAEs Know What They Don't Know

vlievin/biva-pytorch 16 Feb 2021

Deep generative models have been demonstrated as state-of-the-art density estimators.

Likelihood Regret: An Out-of-Distribution Detection Score For Variational Auto-encoder

XavierXiao/Likelihood-Regret NeurIPS 2020

An important application of generative modeling should be the ability to detect out-of-distribution (OOD) samples by setting a threshold on the likelihood.

Improved Contrastive Divergence Training of Energy Based Models

yilundu/improved_contrastive_divergence 2 Dec 2020

Contrastive divergence is a popular method of training energy-based models, but is known to have difficulties with training stability.

A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection

google/uncertainty-baselines 16 Jun 2021

Mahalanobis distance (MD) is a simple and popular post-processing method for detecting out-of-distribution (OOD) inputs in neural networks.
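The fix proposed here is the relative Mahalanobis distance: from the minimum per-class distance, subtract the distance under a single background Gaussian fit to all training data. A NumPy sketch with the means and inverse covariances taken as given (the fitting step is omitted):

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(d @ cov_inv @ d)

def relative_mahalanobis_score(x, class_means, cov_inv,
                               global_mean, global_cov_inv):
    """Relative Mahalanobis distance: min over per-class Gaussians minus
    the distance under one background Gaussian fit to all training data.
    Lower scores indicate more in-distribution inputs."""
    md_k = min(mahalanobis(x, m, cov_inv) for m in class_means)
    md_0 = mahalanobis(x, global_mean, global_cov_inv)
    return md_k - md_0
```

Subtracting the background term discounts directions of variation shared by all classes, which is what makes the score more reliable on near-OOD inputs.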

Input complexity and out-of-distribution detection with likelihood-based generative models

anonconfsubaccount/tilted_prior ICLR 2020

Likelihood-based generative models are a promising resource to detect out-of-distribution (OOD) inputs which could compromise the robustness or reliability of a machine learning system.
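The paper's observation is that raw likelihoods are confounded by input complexity, so its score subtracts an estimate of that complexity, such as the input's length under a lossless compressor. A sketch using zlib (the compressor choice and helper names are my assumptions; the paper measures both terms in bits and interprets the score as a likelihood-ratio test):

```python
import zlib

def complexity_bits(data: bytes) -> float:
    # compressed length in bits, as a crude upper bound on input complexity
    return 8.0 * len(zlib.compress(data, 9))

def s_score(neg_log_likelihood_bits: float, data: bytes) -> float:
    """S(x) = -log2 p(x) - L(x): the model's negative log-likelihood in bits
    minus the input's compressed length in bits. Under this convention,
    higher S suggests the input is OOD (the model explains it no better
    than a generic compressor)."""
    return neg_log_likelihood_bits - complexity_bits(data)
```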

Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data

sayakpaul/Generalized-ODIN-TF CVPR 2020

Deep neural networks have attained remarkable performance when applied to data that comes from the same distribution as that of the training set, but can significantly degrade otherwise.

Entropic Out-of-Distribution Detection: Seamless Detection of Unknown Examples

dlmacedo/entropic-out-of-distribution-detection 7 Jun 2020

In this paper, we argue that the unsatisfactory out-of-distribution (OOD) detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss and its propensity to produce low-entropy probability distributions, in disagreement with the principle of maximum entropy.
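The detection side of this line of work is typically an entropy-based score on the network's output distribution: low-entropy predictions are treated as in-distribution. The paper's actual contribution is a replacement training loss; the negative-entropy scoring rule below is only a minimal sketch of the detection step:

```python
import numpy as np

def softmax(logits):
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def entropic_score(logits):
    """Negative Shannon entropy of the predictive distribution.
    Confident (low-entropy) predictions score higher, i.e. look more
    in-distribution; a threshold on this score flags OOD inputs."""
    p = softmax(logits)
    return float(np.sum(p * np.log(p + 1e-12)))
```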