Out-of-Distribution Detection
326 papers with code • 50 benchmarks • 22 datasets
Detect out-of-distribution or anomalous examples.
Libraries
Use these libraries to find Out-of-Distribution Detection models and implementations.
Latest papers with no code
Gradient-Regularized Out-of-Distribution Detection
In this work, we propose the idea of leveraging the information embedded in the gradient of the loss function during training to enable the network to not only learn a desired OOD score for each sample but also to exhibit similar behavior in a local neighborhood around each sample.
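The snippet above describes training the network so that its OOD score varies smoothly in a local neighborhood around each sample. A minimal numpy sketch of that idea, using an energy score and a finite-difference smoothness penalty (the energy score, the toy linear model, and all names here are assumptions for illustration, not the paper's method):

```python
import numpy as np

def energy_score(logits):
    # Energy-based OOD score: -logsumexp(logits) (one common choice of score).
    m = logits.max()
    return float(-(m + np.log(np.exp(logits - m).sum())))

def local_smoothness_penalty(score_fn, x, eps=1e-2, n_dirs=8, seed=0):
    # Finite-difference proxy for a gradient penalty: measure how much the
    # OOD score changes in small random directions around x, so a training
    # loss could push the model toward similar behavior near each sample.
    rng = np.random.default_rng(seed)
    base = score_fn(x)
    diffs = []
    for _ in range(n_dirs):
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)
        diffs.append((score_fn(x + eps * d) - base) ** 2)
    return float(np.mean(diffs))

# Toy linear "network" (hypothetical): logits = W @ x.
W = np.array([[1.0, -0.5], [0.3, 0.8]])
x = np.array([0.2, -0.1])
penalty = local_smoothness_penalty(lambda v: energy_score(W @ v), x)
```

In an actual training loop, a term like `penalty` would be added to the loss so the learned score is stable around each training point.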
Toward a Realistic Benchmark for Out-of-Distribution Detection
Deep neural networks are increasingly used in a wide range of technologies and services, but remain highly susceptible to out-of-distribution (OOD) samples, that is, samples drawn from a distribution different from that of the original training set.
Epistemic Uncertainty Quantification For Pre-trained Neural Networks
Specifically, we propose a gradient-based approach to assess epistemic uncertainty, analyzing the gradients of outputs relative to model parameters, and thereby indicating necessary model adjustments to accurately represent the inputs.
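The gradient-of-outputs-with-respect-to-parameters idea above can be sketched for a one-layer logistic model, where the gradient is available in closed form. This is a hedged illustration of the general principle, not the paper's procedure; the model, weights, and function names are assumed:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_uncertainty(w, x):
    # Proxy score: norm of the gradient of the model output with respect to
    # the parameters. For logistic regression p = sigmoid(w . x), the
    # analytic gradient is dp/dw = p * (1 - p) * x.
    p = sigmoid(np.dot(w, x))
    grad = p * (1.0 - p) * x
    return float(np.linalg.norm(grad))

w = np.array([0.5, -1.0])  # hypothetical trained weights
score = gradient_uncertainty(w, np.array([1.0, 0.2]))
```

For deep networks the same quantity would be computed with automatic differentiation over all parameters rather than this closed form.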
On the Learnability of Out-of-distribution Detection
Based on this observation, we next give several necessary and sufficient conditions to characterize the learnability of OOD detection in some practical scenarios.
Focused Active Learning for Histopathological Image Classification
The lack of precise uncertainty estimations leads to the acquisition of images with a low informative value.
CT-3DFlow: Leveraging 3D Normalizing Flows for Unsupervised Detection of Pathological Pulmonary CT scans
Unsupervised pathology detection can be implemented by training a model on healthy data only and measuring the deviation from that training set at inference time, for example with CNN-based feature extraction and one-class classifiers, or with reconstruction-score-based methods such as AEs, GANs, and diffusion models.
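The reconstruction-score family mentioned above can be illustrated with a rank-k PCA model standing in for an autoencoder: fit on in-distribution data only, then score inputs by how poorly the learned subspace reconstructs them. A minimal sketch under that simplifying assumption (PCA is not what the listed methods use; it only mirrors the scoring principle):

```python
import numpy as np

def fit_pca(train, k=1):
    # Fit a rank-k linear "autoencoder" on in-distribution data only.
    mean = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
    return mean, vt[:k]

def reconstruction_score(x, mean, comps):
    # OOD score: reconstruction error after encoding/decoding through
    # the learned subspace. In-distribution points reconstruct well.
    z = (x - mean) @ comps.T
    recon = mean + z @ comps
    return float(np.linalg.norm(x - recon))

rng = np.random.default_rng(0)
# Training data lies near a 1-D line; the OOD point sits far off that line.
t = rng.standard_normal(200)
train = np.stack([t, 2 * t], axis=1) + 0.01 * rng.standard_normal((200, 2))
mean, comps = fit_pca(train, k=1)
s_in = reconstruction_score(np.array([1.0, 2.0]), mean, comps)
s_out = reconstruction_score(np.array([2.0, -4.0]), mean, comps)
```

Thresholding such a score separates inliers (low error) from anomalies (high error).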
Enabling Uncertainty Estimation in Iterative Neural Networks
Turning pass-through network architectures into iterative ones, which use their own output as input, is a well-known approach for boosting performance.
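The feedback idea above, an architecture that consumes its own output across passes, can be sketched with a toy contraction map; the spread of successive outputs gives a crude uncertainty signal. The feedback function and the use of the iterate spread are assumptions for illustration, not the paper's construction:

```python
import numpy as np

def iterative_refine(f, x, y0, n_iter=5):
    # Iterative architecture sketch: the model's own output y is fed back
    # as an extra input at each pass. The spread of successive outputs can
    # serve as an uncertainty signal (assumed setup).
    y = y0
    history = []
    for _ in range(n_iter):
        y = f(x, y)
        history.append(y)
    history = np.array(history)
    return float(history[-1]), float(history.std())

# Toy "network" (hypothetical): a contraction toward a target set by x.
f = lambda x, y: 0.5 * y + 0.5 * np.tanh(x)
y_final, spread = iterative_refine(f, x=1.2, y0=0.0)
```

Here the iterates converge toward `tanh(1.2)`; inputs on which the iterates fail to settle would show a larger spread.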
Out-of-Distribution Detection via Deep Multi-Comprehension Ensemble
Our experimental results demonstrate the superior performance of the MC Ensemble strategy in OOD detection compared to both the naive Deep Ensemble method and a standalone model of comparable size.
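A standard way ensembles of any kind yield an OOD score is the entropy of the averaged predictive distribution: members that disagree, or are individually unsure, raise it. This generic sketch does not reproduce the MC Ensemble method itself; the toy logits are made up:

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def ensemble_ood_score(logits_list):
    # Entropy of the mean predictive distribution across ensemble members.
    # Higher entropy -> disagreement or individual uncertainty -> more OOD.
    probs = np.mean([softmax(l) for l in logits_list], axis=0)
    return float(-(probs * np.log(probs + 1e-12)).sum())

# Members agree confidently -> low score; members disagree -> high score.
agree = [np.array([4.0, 0.0, 0.0])] * 3
disagree = [np.array([4.0, 0.0, 0.0]),
            np.array([0.0, 4.0, 0.0]),
            np.array([0.0, 0.0, 4.0])]
low = ensemble_ood_score(agree)
high = ensemble_ood_score(disagree)
```

The same scoring works for a naive deep ensemble baseline, which is why comparisons like the one above hold the ensemble size and scoring rule fixed.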
Hypothesis-Driven Deep Learning for Out of Distribution Detection
Given a trained DNN and some input, we first feed the input through the DNN and compute an ensemble of OoD metrics, which we term latent responses.
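One hedged reading of "an ensemble of OoD metrics" from a single forward pass is a vector of standard scores computed from the same logits, such as max softmax probability, energy, and entropy. The paper's actual latent responses may be defined differently; this only illustrates collecting several metrics at once:

```python
import numpy as np

def latent_responses(logits):
    # Ensemble of simple OoD metrics from one forward pass (assumed set):
    # max softmax probability, energy score, and predictive entropy.
    z = logits - logits.max()
    probs = np.exp(z) / np.exp(z).sum()
    msp = float(probs.max())
    energy = float(-(logits.max() + np.log(np.exp(z).sum())))
    entropy = float(-(probs * np.log(probs + 1e-12)).sum())
    return np.array([msp, energy, entropy])

r = latent_responses(np.array([3.0, 0.5, -1.0]))
```

Downstream, such a response vector could feed a simple detector (e.g. a threshold per metric or a small classifier) rather than relying on any single score.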
Out-of-Distribution Detection Using Peer-Class Generated by Large Language Model
Out-of-distribution (OOD) detection is a critical task to ensure the reliability and security of machine learning models deployed in real-world applications.