Search Results for author: Jiangtao Zhang

Found 4 papers, 1 paper with code

Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation

no code implementations • CVPR 2023 • Tianli Zhang, Mengqi Xue, Jiangtao Zhang, Haofei Zhang, Yu Wang, Lechao Cheng, Jie Song, Mingli Song

Most existing online knowledge distillation (OKD) techniques typically require sophisticated modules to produce diverse knowledge for improving students' generalization ability.

Knowledge Distillation
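
For context on the task tag above: the standard distillation objective that OKD methods build on is the temperature-softened KL divergence between teacher and student outputs (Hinton et al., 2015). A minimal PyTorch sketch follows; the function name and default temperature are illustrative choices, not taken from this paper:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            temperature: float = 4.0) -> torch.Tensor:
    """Vanilla knowledge distillation loss: KL divergence between
    temperature-softened teacher and student distributions, scaled
    by T^2 so gradient magnitudes stay comparable across temperatures."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_student, soft_targets,
                    reduction="batchmean") * temperature ** 2
```

In online KD there is no fixed pretrained teacher; peer students (or an ensemble of them) supply the soft targets during joint training, which is where the "diverse knowledge" mentioned in the abstract comes from.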
