no code implementations • 23 Nov 2023 • Ling Feng, Danyang Li, Tianhao Wu, Xuliang Duan
Specifically, the method proposes to divide the complete student model into fragments and to treat these fragmented student models as lower-grade models.
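The fragmentation idea can be sketched abstractly as follows. This is a minimal illustration, not the paper's implementation: representing the student as a pipeline of layers and forming lower-grade models as prefixes of that pipeline are assumptions made here for clarity.

```python
# Sketch: the "complete" student is a pipeline of layer functions;
# a lower-grade model uses only a prefix of those layers.
# (Prefix-splitting is an illustrative assumption, not the paper's scheme.)

def complete_student():
    # Stand-in layers; a real student would use neural network layers.
    return [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

def fragment(layers, depth):
    """Return a lower-grade model built from the first `depth` layers."""
    def run(x):
        for layer in layers[:depth]:
            x = layer(x)
        return x
    return run

layers = complete_student()
lower_grade = [fragment(layers, d) for d in range(1, len(layers) + 1)]
# The deepest fragment is equivalent to the complete student.
print(lower_grade[0](5))   # first fragment only: 5 + 1 = 6
print(lower_grade[-1](5))  # full pipeline: ((5 + 1) * 2) - 3 = 9
```

In a distillation setting, each shallower fragment could then learn from the deeper ones; that training loop is omitted here.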
Incremental Learning • Knowledge Distillation +1