Search Results for author: Xuliang Duan

Found 1 papers, 0 papers with code

Education distillation: getting student models to learn in schools

no code implementations • 23 Nov 2023 • Ling Feng, Danyang Li, Tianhao Wu, Xuliang Duan

Specifically, the paper proposes splitting the complete student model into fragments and treating each fragment as a lower-grade model.
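The abstract does not spell out the training procedure, so the following is only a minimal sketch of standard temperature-scaled knowledge distillation, with a "fragment" taken as a prefix of the student's layer stack. The function and variable names (`softmax`, `distillation_loss`, `fragment`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) with temperature T, scaled by T^2
    as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

# Hypothetical "lower-grade" fragment: a prefix of the student's layers.
# Each layer here is just a weight matrix applied in sequence.
def forward(layers, x):
    for w in layers:
        x = np.tanh(x @ w)
    return x

rng = np.random.default_rng(0)
student_layers = [rng.standard_normal((4, 4)) for _ in range(3)]
fragment = student_layers[:1]  # a lower-grade model uses fewer layers

x = rng.standard_normal((2, 4))
teacher_logits = rng.standard_normal((2, 4))
frag_logits = forward(fragment, x)
loss = distillation_loss(frag_logits, teacher_logits)
```

A fragment trained this way could later be extended with the remaining layers, matching the intuition of models advancing through "grades"; whether the paper trains fragments jointly or sequentially is not stated in this listing.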

Incremental Learning • Knowledge Distillation • +1
