1 code implementation • CVPR 2022 • Borui Zhao, Quan Cui, RenJie Song, Yiyu Qiu, Jiajun Liang
To provide a novel viewpoint to study logit distillation, we reformulate the classical KD loss into two parts, i.e., target class knowledge distillation (TCKD) and non-target class knowledge distillation (NCKD).
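A minimal PyTorch sketch of this decomposition is shown below. The function name `decoupled_kd_loss`, the temperature default, and the large-negative-offset trick for masking out the target class are illustrative assumptions, not the paper's official implementation:

```python
import torch
import torch.nn.functional as F

def decoupled_kd_loss(logits_student, logits_teacher, target, temperature=4.0):
    """Split the classical KD loss into TCKD and NCKD (illustrative sketch)."""
    num_classes = logits_student.size(1)
    gt_mask = F.one_hot(target, num_classes).float()  # 1 at the target class

    # --- TCKD: binary KL over the (target, all non-target) probability mass ---
    p_s = F.softmax(logits_student / temperature, dim=1)
    p_t = F.softmax(logits_teacher / temperature, dim=1)
    b_s = torch.stack([(p_s * gt_mask).sum(1), (p_s * (1 - gt_mask)).sum(1)], dim=1)
    b_t = torch.stack([(p_t * gt_mask).sum(1), (p_t * (1 - gt_mask)).sum(1)], dim=1)
    tckd = F.kl_div((b_s + 1e-8).log(), b_t, reduction="batchmean") * temperature ** 2

    # --- NCKD: KL over the non-target classes only ---
    # A large negative offset drives the target class's probability to ~0
    # before the softmax, leaving a distribution over non-target classes.
    log_q_s = F.log_softmax(logits_student / temperature - 1000.0 * gt_mask, dim=1)
    q_t = F.softmax(logits_teacher / temperature - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_q_s, q_t, reduction="batchmean") * temperature ** 2

    return tckd, nckd
```

Returning the two terms separately lets a training loop weight them independently, e.g. `loss = alpha * tckd + beta * nckd` (with `alpha` and `beta` as hypothetical weighting hyperparameters), rather than keeping them coupled as in the classical KD loss.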