1 code implementation • CVPR 2021 • Xing Dai, Zeren Jiang, Zhao Wu, Yiping Bao, Zhicheng Wang, Si Liu, Erjin Zhou
In recent years, knowledge distillation has proved to be an effective approach to model compression.
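As background for the distillation setting the abstract refers to, here is a minimal sketch of the classic temperature-softened distillation loss (Hinton et al. style), not the specific method of this paper; all names and the temperature value are illustrative:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence from teacher soft targets to student predictions,
    scaled by T^2 as is conventional so gradients stay comparable
    across temperatures (illustrative helper, not the paper's loss)."""
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student distribution
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

The student is trained to match the teacher's softened output distribution, transferring "dark knowledge" about inter-class similarity that hard labels discard.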