2 code implementations • 15 Nov 2022 • Yu Wang, Xin Li, Shengzhao Wen, Fukui Yang, Wanping Zhang, Gang Zhang, Haocheng Feng, Junyu Han, Errui Ding
In this paper, we focus on the compression of DETR with knowledge distillation.
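As context for what "knowledge distillation" refers to here, a minimal sketch of the generic soft-label distillation loss (Hinton-style KL between temperature-softened teacher and student class distributions) is shown below. This is an illustration of the general technique only, not the paper's DETR-specific distillation method; the function names and temperature choice are assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax with max-subtraction for numerical stability."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Generic soft-label distillation loss (illustrative, not the paper's method):
    KL(teacher || student) on softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return float(kl.mean()) * temperature ** 2
```

The loss is zero when student and teacher logits coincide and positive otherwise, so minimizing it pulls the compressed student's predictive distribution toward the teacher's.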