1 code implementation • 2 Jun 2020 • Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
Knowledge distillation transfers the knowledge learned by a teacher network to a student network, so that the student, with fewer parameters and less computation, achieves accuracy close to the teacher's.
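As background, the classic distillation objective (in the style of Hinton et al., not necessarily this paper's exact formulation) combines a soft-target term, matching the student's temperature-softened predictions to the teacher's, with a standard cross-entropy term on the true labels. A minimal NumPy sketch, where the temperature `T` and weight `alpha` are illustrative hyperparameters:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft-target KL term and a hard-label CE term."""
    # Soft-target term: KL(teacher || student) at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1).mean()
    # Hard-target term: cross-entropy with the ground-truth labels
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * (T ** 2) * soft + (1 - alpha) * hard
```

When the student's logits exactly match the teacher's, the soft term vanishes and only the hard-label cross-entropy remains.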