Search Results for author: Shipeng Fu

Found 1 paper, 0 papers with code

Interactive Knowledge Distillation

no code implementations • 3 Jul 2020 • Shipeng Fu, Zhen Li, Jun Xu, Ming-Ming Cheng, Zitao Liu, Xiaomin Yang

Knowledge distillation is a standard teacher-student learning framework for training a lightweight student network under the guidance of a well-trained, large teacher network.
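As a rough illustration of the teacher-student setup described in the abstract, below is a minimal sketch of the standard knowledge-distillation loss (soft-target matching plus cross-entropy), not the Interactive Knowledge Distillation method proposed in this paper; the function name, temperature, and weighting factor are illustrative assumptions.

```python
# Minimal sketch of a standard knowledge-distillation loss (not this paper's IAKD method).
# The temperature and alpha values below are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Combine cross-entropy on hard labels with a KL term that matches the
    student's softened predictions to the (frozen) teacher's softened predictions."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scale the KL term by T^2 so its gradient magnitude stays comparable.
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * ce
```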

Image Classification • Knowledge Distillation