Search Results for author: Xinwei Guan

Found 1 paper, 1 paper with code

Channel Distillation: Channel-Wise Attention for Knowledge Distillation

1 code implementation • 2 Jun 2020 • Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu

Knowledge distillation transfers knowledge learned from the data by a teacher network to a student network, so that the student uses fewer parameters and less computation while reaching accuracy close to the teacher's.
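The generic distillation objective described above can be sketched as follows. This is a minimal NumPy illustration of Hinton-style soft-label distillation (KL divergence between temperature-softened teacher and student outputs), not the channel-wise attention method proposed in this paper; the function names are illustrative:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = logits / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened outputs, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# Identical logits give zero loss; any mismatch gives a positive loss.
s = np.array([[2.0, 1.0, 0.1]])
t = np.array([[1.0, 2.0, 0.1]])
print(distillation_loss(s, s))  # → 0.0
print(distillation_loss(s, t) > 0)  # → True
```

In practice this soft-label term is combined with the ordinary cross-entropy loss on the ground-truth labels when training the student.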

Knowledge Distillation
