1 code implementation • 7 Nov 2020 • Pengchao Han, Jihong Park, Shiqiang Wang, Yejun Liu
Knowledge distillation (KD) has enabled remarkable progress in model compression and knowledge transfer.
Tasks: Data-free Knowledge Distillation, Diversity, +2
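Since the abstract only names knowledge distillation in passing, here is a minimal sketch of the generic soft-target KD objective (Hinton et al., 2015) for context; it is not the data-free method proposed in this paper, and the temperature `T` and weight `alpha` are illustrative assumptions.

```python
# Minimal sketch of a standard soft-target KD loss, assuming a classification
# setting; not the paper's data-free distillation method.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: random logits for a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(kd_loss(student_logits, teacher_logits, labels))
```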