no code implementations • 21 Jun 2021 • Haoran Zhao, Xin Sun, Junyu Dong, Zihe Dong, Qiong Li
Recently, knowledge distillation approaches have been proposed to extract general knowledge from a teacher network to guide a student network.
1 code implementation • 23 Jul 2019 • Haoran Zhao, Xin Sun, Junyu Dong, Changrui Chen, Zihe Dong
Knowledge distillation aims to train a compact student network by transferring knowledge from a larger pre-trained teacher model.
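The standard way to transfer such knowledge is to match the student's predictions to the teacher's temperature-softened output distribution. Below is a minimal, self-contained sketch of that classic softened-softmax distillation loss (in the spirit of Hinton et al.); the function names and the temperature value are illustrative assumptions, not the specific method of the papers listed here.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher T yields a softer
    # distribution that exposes the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients keep a comparable
    # magnitude across temperatures (illustrative sketch).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.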