OOD Detection
122 papers with code • 0 benchmarks • 2 datasets
Out-of-Distribution Detection: detecting instances that do not belong to the distribution on which the classifier was trained.
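A common baseline for this task is to threshold the classifier's maximum softmax probability (MSP): confident predictions are treated as in-distribution, low-confidence ones as OOD. A minimal numpy sketch, assuming you already have the model's logits (the threshold value here is illustrative, not canonical):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability; low values suggest OOD inputs.
    return softmax(logits).max(axis=-1)

def is_ood(logits, threshold=0.5):
    # Flag inputs whose confidence falls below the threshold.
    return msp_score(logits) < threshold
```

In practice the threshold is chosen on held-out data for a target false-positive rate rather than fixed a priori.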
Benchmarks
These leaderboards are used to track progress in OOD Detection.
Libraries
Use these libraries to find OOD Detection models and implementations.
Most implemented papers
Deep Anomaly Detection with Outlier Exposure
We also analyze the flexibility and robustness of Outlier Exposure, and identify characteristics of the auxiliary dataset that improve performance.
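Outlier Exposure trains the classifier normally on in-distribution data while adding a term that pushes predictions toward the uniform distribution on an auxiliary outlier dataset. A hedged numpy sketch of that combined objective (function names and the weight `lam` are illustrative, not the paper's code):

```python
import numpy as np

def log_softmax(logits):
    # Numerically stable log-softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=-1, keepdims=True))

def outlier_exposure_loss(in_logits, in_labels, out_logits, lam=0.5):
    # Standard cross-entropy on in-distribution examples ...
    n = in_logits.shape[0]
    ce = -log_softmax(in_logits)[np.arange(n), in_labels].mean()
    # ... plus cross-entropy to the uniform distribution on auxiliary
    # outliers, encouraging low confidence off-distribution.
    uniform_ce = -log_softmax(out_logits).mean(axis=-1).mean()
    return ce + lam * uniform_ce
```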
Likelihood Ratios for Out-of-Distribution Detection
We propose a likelihood ratio method for deep generative models which effectively corrects for these confounding background statistics.
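The idea is to score an input by the difference between the log-likelihood under the full model and under a background model trained to capture only confounding population-level statistics. The paper uses deep generative models; as a toy stand-in, two univariate Gaussians make the scoring rule concrete:

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    # Log-density of a univariate Gaussian.
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def llr_score(x, full_params, background_params):
    # Likelihood-ratio score: high values mean the full model explains x
    # much better than the background model, i.e. x looks in-distribution.
    return gaussian_logpdf(x, *full_params) - gaussian_logpdf(x, *background_params)
```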
Detecting Out-of-Distribution Examples with In-distribution Examples and Gram Matrices
We find that characterizing activity patterns by Gram matrices and identifying anomalies in Gram matrix values can yield high OOD detection rates.
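Concretely, the method records the range of Gram matrix entries seen on training data and scores a test input by how far its Gram entries escape those ranges. A minimal sketch of that deviation score for one layer's feature map (the real method aggregates over layers and higher-order Gram matrices):

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, positions) activation map from one layer.
    return features @ features.T

def fit_ranges(train_feature_maps):
    # Record per-entry min/max of Gram values over training data.
    grams = np.stack([gram_matrix(f) for f in train_feature_maps])
    return grams.min(axis=0), grams.max(axis=0)

def gram_deviation(features, lo, hi):
    # Total amount by which Gram entries escape the training ranges;
    # zero for activity patterns consistent with training data.
    g = gram_matrix(features)
    return np.maximum(lo - g, 0).sum() + np.maximum(g - hi, 0).sum()
```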
Hierarchical VAEs Know What They Don't Know
Deep generative models have been demonstrated as state-of-the-art density estimators.
Likelihood Regret: An Out-of-Distribution Detection Score For Variational Auto-encoder
An important application of generative modeling should be the ability to detect out-of-distribution (OOD) samples by setting a threshold on the likelihood.
Improved Contrastive Divergence Training of Energy Based Models
Contrastive divergence is a popular method of training energy-based models, but is known to have difficulties with training stability.
A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection
Mahalanobis distance (MD) is a simple and popular post-processing method for detecting out-of-distribution (OOD) inputs in neural networks.
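The paper's fix is the relative Mahalanobis distance: subtract from the per-class distance the distance under a single class-agnostic Gaussian, which cancels features shared by all in-distribution data. A numpy sketch, assuming class means and (inverse) covariances have already been estimated from penultimate-layer features:

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    # Squared Mahalanobis distance of x to a Gaussian with given inverse covariance.
    d = x - mean
    return float(d @ cov_inv @ d)

def relative_mahalanobis_score(x, class_means, cov_inv, global_mean, global_cov_inv):
    # Per-class minimum distance minus the distance under one class-agnostic
    # Gaussian; removing the shared component helps near-OOD detection.
    md_k = min(mahalanobis(x, mu, cov_inv) for mu in class_means)
    md_0 = mahalanobis(x, global_mean, global_cov_inv)
    return md_k - md_0
```

Higher scores indicate more OOD-like inputs; as usual, a threshold is calibrated on held-out data.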
Input complexity and out-of-distribution detection with likelihood-based generative models
Likelihood-based generative models are a promising resource to detect out-of-distribution (OOD) inputs which could compromise the robustness or reliability of a machine learning system.
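The proposed score corrects the raw negative log-likelihood by an input-complexity estimate, since likelihood models tend to assign high likelihood to simple inputs regardless of distribution: S(x) = -log p(x) - L(x), with L(x) approximated by the length of a lossless compression of x. A stdlib sketch using zlib as the compressor (the paper also considers PNG and FLIF):

```python
import zlib

def complexity_bits(x_bytes):
    # Upper bound on input complexity via a lossless compressor, in bits.
    return 8 * len(zlib.compress(x_bytes, 9))

def input_complexity_score(neg_log_likelihood_bits, x_bytes):
    # S(x) = -log p(x) - L(x); subtracting the complexity estimate removes
    # the "simple inputs get high likelihood" bias. Higher S suggests OOD.
    return neg_log_likelihood_bits - complexity_bits(x_bytes)
```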
Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data
Deep neural networks have attained remarkable performance when applied to data that comes from the same distribution as that of the training set, but can significantly degrade otherwise.
Entropic Out-of-Distribution Detection: Seamless Detection of Unknown Examples
In this paper, we argue that the unsatisfactory out-of-distribution (OOD) detection performance of neural networks stems mainly from the anisotropy of the SoftMax loss and its propensity to produce low-entropy probability distributions, in disagreement with the principle of maximum entropy.
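In this line of work, OOD inputs are expected to yield near-uniform (high-entropy) predictive distributions, so the Shannon entropy of the softmax output serves directly as the detection score. A minimal numpy sketch of that entropic score (the paper additionally replaces the output layer with an isotropic, distance-based one):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropic_score(logits):
    # Shannon entropy of the predictive distribution; higher entropy
    # (flatter output) suggests an OOD input.
    p = softmax(logits)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)
```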