no code implementations • 23 Jun 2022 • Jinhong Lin
Neural networks achieve impressive performance on data drawn from the same distribution as the training set, but they can produce overconfident, incorrect predictions on data they have never seen.
Out-of-Distribution (OOD) Detection
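The abstract above describes the overconfidence problem that OOD detection addresses. As a minimal illustration (not the paper's specific method), the standard maximum-softmax-probability baseline flags inputs whose top-class confidence is low; the threshold value here is an arbitrary assumption for the sketch:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: low values suggest out-of-distribution inputs.
    return softmax(logits).max(axis=-1)

def is_ood(logits, threshold=0.9):
    # Hypothetical threshold; in practice it is tuned on validation data.
    return msp_score(logits) < threshold
```

A sharply peaked logit vector (confident prediction) is kept, while a nearly uniform one (max probability about 1/3 for three classes) is flagged as OOD.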
no code implementations • 4 Dec 2021 • Jinhong Lin, Zhaoyang Li
Knowledge distillation transfers knowledge from a large teacher model to a small student model, substantially improving the student model's performance.
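The transfer described above is commonly implemented with the classic temperature-based distillation loss of Hinton et al.; the following is a minimal sketch of that standard formulation, not necessarily the loss used in this paper, with the temperature value chosen arbitrarily for illustration:

```python
import numpy as np

def softened_probs(logits, T):
    # Softmax over temperature-scaled logits; higher T gives softer targets.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence from the softened teacher distribution to the softened
    # student distribution, scaled by T^2 as in Hinton et al.'s formulation.
    p_t = softened_probs(teacher_logits, T)
    p_s = softened_probs(student_logits, T)
    kl = (p_t * (np.log(p_t) - np.log(p_s))).sum(axis=-1)
    return (T ** 2) * kl.mean()
```

The loss is zero when student and teacher logits match and grows as their softened distributions diverge, so minimizing it pushes the student toward the teacher's output behavior.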