Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data

26 Feb 2020 · Yen-Chang Hsu, Yilin Shen, Hongxia Jin, Zsolt Kira

Deep neural networks attain remarkable performance when applied to data drawn from the same distribution as the training set, but performance can degrade significantly otherwise. Detecting whether an example is out-of-distribution (OoD) is therefore crucial for building a system that can reject such samples or alert users...
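As a point of reference for the problem the abstract describes, OoD detection is often framed as computing a scalar confidence score from a classifier's logits and thresholding it. The sketch below shows one classic ingredient of the original ODIN method, a temperature-scaled maximum-softmax score; this is an illustrative baseline, not the full Generalized ODIN procedure (which, per the title, avoids tuning on OoD data), and the `temperature` value here is an arbitrary example.

```python
import numpy as np

def ood_score(logits, temperature=1.0):
    """Temperature-scaled max-softmax confidence score.

    Higher score -> more confident -> more likely in-distribution.
    `temperature` is a hypothetical knob for illustration; ODIN-style
    methods tune or learn such quantities rather than fixing them.
    """
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                      # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()  # softmax
    return p.max()

# A peaked (confident) logit vector scores higher than a flat one,
# so thresholding the score separates the two regimes.
confident = ood_score([10.0, 0.0, 0.0])
uncertain = ood_score([1.0, 0.9, 1.1])
assert confident > uncertain
```

In practice a threshold on this score is chosen on in-distribution validation data, and inputs scoring below it are flagged as OoD.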




