Out-of-Distribution Detection
326 papers with code • 50 benchmarks • 22 datasets
Detect out-of-distribution or anomalous examples.
Most implemented papers
Hierarchical VAEs Know What They Don't Know
Deep generative models have been shown to be state-of-the-art density estimators.
MOS: Towards Scaling Out-of-distribution Detection for Large Semantic Space
Detecting out-of-distribution (OOD) inputs is a central challenge for safely deploying machine learning models in the real world.
Open-set Label Noise Can Improve Robustness Against Inherent Label Noise
Learning with noisy labels is a practically challenging problem in weakly supervised learning.
A Gentle Introduction to Conformal Prediction and Distribution-Free Uncertainty Quantification
Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of machine-learning models.
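The core recipe behind split conformal prediction is short enough to sketch: score a held-out calibration set, take a finite-sample-corrected quantile of those scores, and include in each prediction set every label whose score clears that threshold. A minimal NumPy sketch for a classifier, assuming softmax outputs are available (the helper name `conformal_sets` is illustrative, not from the paper):

```python
import numpy as np

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split-conformal prediction sets for a classifier (illustrative sketch).

    cal_probs : (n, K) softmax outputs on held-out calibration data
    cal_labels: (n,)   true labels for that calibration data
    test_probs: (m, K) softmax outputs on new inputs
    Returns a boolean (m, K) mask; True marks labels included in the set.
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true label.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile level, clipped to 1 for small n.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(scores, level, method="higher")
    # Include every label whose nonconformity 1 - p is at most the threshold.
    return (1.0 - test_probs) <= q_hat
```

The resulting sets contain the true label with probability at least 1 - alpha on exchangeable data, which is what makes the guarantee distribution-free.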
Using Self-Supervised Learning Can Improve Model Robustness and Uncertainty
Self-supervision provides effective representations for downstream tasks without requiring labels.
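A common auxiliary task in this line of work is rotation prediction: each image is rotated by 0, 90, 180, or 270 degrees and the network must predict which rotation was applied, with no labels needed. A minimal sketch of the data-side construction, assuming square single-channel images (the helper name `add_rotation_task` is hypothetical):

```python
import numpy as np

def add_rotation_task(images):
    """Build a 4-way rotation-prediction auxiliary task (sketch).

    images: (n, H, W) array with H == W (square, so rotated copies
    can be concatenated). Returns all four rotated copies plus the
    rotation index (0-3) each copy should be classified as.
    """
    rotated, labels = [], []
    for k in range(4):  # 0, 90, 180, 270 degrees
        rotated.append(np.rot90(images, k=k, axes=(1, 2)))
        labels.append(np.full(len(images), k))
    return np.concatenate(rotated), np.concatenate(labels)
```

A classifier trained jointly on this auxiliary head and the main task tends to learn representations whose confidence degrades more gracefully off-distribution.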
Natural Adversarial Examples
We also curate an adversarial out-of-distribution detection dataset called ImageNet-O, which is the first out-of-distribution detection dataset created for ImageNet models.
Scaling Out-of-Distribution Detection for Real-World Settings
We conduct extensive experiments in these more realistic settings for out-of-distribution detection. A surprisingly simple detector based on the maximum logit outperforms prior methods across the large-scale multi-class, multi-label, and segmentation tasks, establishing a simple new baseline for future work.
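The max-logit baseline really is this simple: score each input by its largest pre-softmax logit and flag inputs whose maximum falls below a threshold. A sketch, assuming raw logits are available (the threshold is an assumption, tuned on in-distribution validation data in practice):

```python
import numpy as np

def max_logit_score(logits):
    """OOD score from raw (pre-softmax) logits: the smaller the maximum
    logit, the more anomalous the input. Negated so higher = more OOD."""
    return -np.max(logits, axis=-1)

def flag_ood(logits, threshold):
    # `threshold` is assumption-dependent: typically chosen on held-out
    # in-distribution data, e.g. to hit a target false-positive rate.
    return max_logit_score(logits) > threshold
```

Unlike the maximum softmax probability, the max logit is not squashed by normalization, which is what lets it keep separating scores in large label spaces.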
Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification
In this work, we develop the theory and methodology of IB-INNs, a class of conditional normalizing flows where INNs are trained with the Information Bottleneck (IB) objective: introducing a small amount of controlled information loss allows an asymptotically exact formulation of the IB while keeping the INN's generative capabilities intact.
A Benchmark of Medical Out of Distribution Detection
However, it is unclear which OoDD method should be used in practice.
Masksembles for Uncertainty Estimation
Our central intuition is that there is a continuous spectrum of ensemble-like models of which MC-Dropout and Deep Ensembles are extreme examples.
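That spectrum can be made concrete with a toy sketch: instead of sampling a fresh dropout mask on every forward pass (MC-Dropout) or training fully separate networks (Deep Ensembles), generate a few fixed binary masks once and reuse them, so each mask defines one deterministic ensemble member. The helpers below are hypothetical simplifications of the Masksembles idea, not the paper's mask-construction algorithm:

```python
import numpy as np

def make_masks(n_masks, n_features, keep_frac=0.5, seed=0):
    """Fixed binary feature masks (simplified sketch of the Masksembles idea).

    keep_frac controls member overlap: near 1.0 the members collapse into a
    single model, while small, nearly disjoint masks behave more like a
    deep ensemble.
    """
    rng = np.random.default_rng(seed)
    base = np.zeros(n_features)
    base[: int(n_features * keep_frac)] = 1.0
    return np.stack([rng.permutation(base) for _ in range(n_masks)])

def masked_predictions(features, weights, masks):
    # One forward pass of a (hypothetical) linear head per fixed mask;
    # the spread across members serves as the uncertainty estimate.
    outs = np.stack([(features * m) @ weights for m in masks])
    return outs.mean(axis=0), outs.std(axis=0)
```

Because the masks are fixed, inference cost is a known constant (one pass per member) rather than a sampling budget, which is the practical appeal over MC-Dropout.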