Search Results for author: Zihe Dong

Found 2 papers, 1 paper with code

Knowledge Distillation via Instance-level Sequence Learning

no code implementations • 21 Jun 2021 • Haoran Zhao, Xin Sun, Junyu Dong, Zihe Dong, Qiong Li

Recently, distillation approaches have been proposed to extract general knowledge from a teacher network to guide a student network.

General Knowledge • Knowledge Distillation

Highlight Every Step: Knowledge Distillation via Collaborative Teaching

1 code implementation • 23 Jul 2019 • Haoran Zhao, Xin Sun, Junyu Dong, Changrui Chen, Zihe Dong

Knowledge distillation aims to train a compact student network by transferring knowledge from a larger pre-trained teacher model.

Knowledge Distillation
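Both papers build on the standard knowledge-distillation objective: a compact student is trained to match temperature-softened teacher outputs alongside the ground-truth labels. Below is a minimal sketch of that baseline loss (after Hinton et al.), not the specific methods these papers propose; the function name and the T and alpha defaults are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of a soft-target term and a hard-label term (illustrative)."""
    # Soft targets: KL divergence between the temperature-softened teacher
    # and student distributions, scaled by T^2 to keep gradient magnitudes
    # comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard labels: ordinary cross-entropy against the ground truth.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

In practice the teacher logits come from a frozen pre-trained model and only the student's parameters receive gradients, which is what the `requires_grad` flags above reflect.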
