Isotropy Maximization Loss and Entropic Score: Accurate, Fast, Efficient, Scalable, and Turnkey Neural Networks Out-of-Distribution Detection Based on The Principle of Maximum Entropy

Current out-of-distribution (OOD) detection approaches require cumbersome procedures that add undesired side effects to the solution. In this paper, we argue that the low OOD detection performance of neural networks stems from the anisotropy of the cross-entropy SoftMax loss and its extreme propensity to produce low-entropy (high-confidence) posterior probability distributions, in direct disagreement with the Principle of Maximum Entropy. Consequently, we propose IsoMax, a loss that is isotropic (distance-based) and produces high-entropy (low-confidence) posterior probability distributions while still relying on cross-entropy minimization. Additionally, we propose a fast Entropic Score for OOD detection. The IsoMax loss works as a seamless drop-in replacement for the SoftMax loss that keeps the overall solution accurate, fast, efficient, scalable, and turnkey. Our experiments confirm that the OOD detection performance of neural networks can be greatly improved without relying on techniques such as adversarial training or validation, data augmentation, ensemble methods, generative approaches, model architectural changes, metric learning, or additional classifiers or regressions. The results also show that our straightforward approach is competitive with state-of-the-art solutions while avoiding the undesired drawbacks of previous methods.
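To make the idea concrete, below is a minimal PyTorch sketch of what a distance-based (isotropic) loss and an entropy-based detection score could look like. It is an illustration under stated assumptions, not the authors' reference implementation: the names `IsoMaxLoss` and `entropic_score`, the zero initialization of the prototypes, and the `entropic_scale` value are assumptions made for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IsoMaxLoss(nn.Module):
    """Sketch of an isotropic (distance-based) loss: logits are negative
    Euclidean distances to learnable class prototypes, so training still
    minimizes plain cross-entropy, but over isotropic logits."""

    def __init__(self, num_features: int, num_classes: int,
                 entropic_scale: float = 10.0):
        super().__init__()
        # One learnable prototype per class (zero initialization is an
        # assumption of this sketch).
        self.prototypes = nn.Parameter(torch.zeros(num_classes, num_features))
        self.entropic_scale = entropic_scale

    def forward(self, features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # Isotropic logits: negative Euclidean distance from each feature
        # vector (B, D) to each class prototype (C, D) -> (B, C).
        logits = -torch.cdist(features, self.prototypes)
        # Cross-entropy minimization over the scaled distance-based logits.
        return F.cross_entropy(self.entropic_scale * logits, targets)

def entropic_score(logits: torch.Tensor) -> torch.Tensor:
    """Negative Shannon entropy of the posterior distribution: very
    negative (high-entropy) scores flag likely OOD inputs."""
    probs = F.softmax(logits, dim=1)
    return (probs * torch.log(probs.clamp_min(1e-12))).sum(dim=1)
```

At test time, detection reduces to thresholding: examples whose Entropic Score falls below a chosen threshold are flagged as OOD, with no extra classifier, regression, or input preprocessing required.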
