no code implementations • 12 Jul 2022 • Tao Liu, Xi Yang, Chenshu Chen
As a promising approach to model compression, knowledge distillation improves the performance of a compact student model by transferring knowledge from a larger, cumbersome teacher model.
Knowledge Distillation • Model Compression +2
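The entry above only summarizes the idea of distillation, so as a point of reference, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015). This is the generic formulation, not the specific method proposed in this paper; the `temperature` and `alpha` values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,  # assumed, typically tuned per task
                      alpha: float = 0.5) -> torch.Tensor:  # assumed soft/hard weighting
    """Generic soft-target distillation loss; not this paper's method."""
    # KL divergence between the teacher's and student's temperature-softened
    # distributions; the T^2 factor keeps gradient magnitudes comparable
    # across temperature settings.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Ordinary cross-entropy against the ground-truth hard labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In training, the teacher runs in eval mode under `torch.no_grad()` to produce `teacher_logits`, and only the compact student's parameters are updated with this combined loss.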