Beyond Perceptual Distances: Rethinking Disparity Assessment for Out-of-Distribution Detection with Diffusion Models

16 Sep 2024 · Kun Fang, Qinghua Tao, Zuopeng Yang, Xiaolin Huang, Jie Yang

Out-of-Distribution (OoD) detection aims to determine whether a given sample comes from the training distribution of the classifier-under-protection, i.e., In-Distribution (InD), or from OoD. Diffusion Models (DMs) have recently been applied to OoD detection by measuring perceptual distances between a given image and its DM generation. DM-based methods bring fresh insights to the field, yet remain under-explored. In this work, we point out two main limitations of DM-based OoD detection methods: (i) the perceptual metrics on the disparities between the given sample and its generation are devised only at human-perceived levels, ignoring the abstract or high-level patterns that better reflect intrinsic disparities in distribution; (ii) only the raw image contents are used to measure the disparities, while other representations, i.e., the features and probabilities from the classifier-under-protection, are readily accessible yet ignored. To this end, our proposed detection framework goes beyond perceptual distances and looks into the deep representations from the classifier-under-protection, with novel metrics devised correspondingly, leading to more informative disparity assessments between InD and OoD. An anomaly-removal strategy is further integrated to remove abnormal OoD information from the generation, enhancing the distinctiveness of the disparities. Extensive experiments demonstrate that our framework achieves state-of-the-art detection performance among DM-based methods.
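To make the core idea concrete, here is a minimal sketch (not the authors' released code) of scoring a sample by comparing the classifier-under-protection's deep features and output probabilities for an input image against those of its diffusion reconstruction. The helpers `diffusion_reconstruct` and `classifier`, as well as the specific distance choices (cosine distance on features, KL divergence on probabilities), are illustrative assumptions, not the paper's exact metrics.

```python
import torch
import torch.nn.functional as F

def ood_score(x, classifier, diffusion_reconstruct):
    """Hypothetical disparity score between an image batch and its DM reconstruction.

    `classifier` is assumed to return (features, logits) for an image batch;
    `diffusion_reconstruct` is a placeholder for a reconstruction pass
    through a diffusion model (e.g., noise-and-denoise of the input).
    """
    x_hat = diffusion_reconstruct(x)  # DM generation of the input

    with torch.no_grad():
        feat_x, logits_x = classifier(x)      # deep representations of the input
        feat_g, logits_g = classifier(x_hat)  # ...and of its reconstruction

    # Feature-level disparity: cosine distance between deep features.
    feat_disp = 1.0 - F.cosine_similarity(feat_x, feat_g, dim=-1)

    # Probability-level disparity: KL divergence between softmax outputs.
    log_p_x = F.log_softmax(logits_x, dim=-1)
    p_g = F.softmax(logits_g, dim=-1)
    prob_disp = F.kl_div(log_p_x, p_g, reduction="none").sum(dim=-1)

    # Larger combined disparity -> more likely OoD.
    return feat_disp + prob_disp
```

The intuition is that a DM trained on InD data reconstructs InD samples faithfully, so both disparities stay small; for OoD samples, the reconstruction drifts toward InD and the classifier's features and probabilities diverge, inflating the score.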
