Search Results for author: Shikang Yu

Found 1 papers, 1 papers with code

Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint

1 code implementation • CVPR 2023 • Shikang Yu, Jiachen Chen, Hu Han, Shuqiang Jiang

Therefore, we propose mSARC to ensure that the student network imitates not only the logit output but also the spatial activation regions of the teacher network, alleviating the influence of unwanted noise in diverse synthetic images on distillation learning.

Tasks: Data Augmentation, Data-free Knowledge Distillation, +1
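The snippet above only names the idea behind mSARC. As a rough, dependency-free illustration (not the paper's actual method), the sketch below combines a standard logit-matching KL term with a spatial activation-region term built from channel-aggregated attention maps; all function names, the attention-map formulation, and the `alpha` weighting are assumptions introduced here for clarity.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def kl_div(p_teacher, p_student):
    """KL(teacher || student) between two probability lists."""
    return sum(p * math.log(p / q) for p, q in zip(p_teacher, p_student) if p > 0)

def attention_map(feat):
    """Collapse a C x H x W feature (nested lists) into an L2-normalized
    H x W spatial map by summing squared activations over channels."""
    C, H, W = len(feat), len(feat[0]), len(feat[0][0])
    amap = [[sum(feat[c][i][j] ** 2 for c in range(C)) for j in range(W)]
            for i in range(H)]
    norm = math.sqrt(sum(v * v for row in amap for v in row)) or 1.0
    return [[v / norm for v in row] for row in amap]

def distill_loss(t_logits, s_logits, t_feat, s_feat, alpha=1.0):
    """Toy distillation objective: logit KL + weighted squared distance
    between teacher and student spatial activation maps."""
    kd = kl_div(softmax(t_logits), softmax(s_logits))
    ta, sa = attention_map(t_feat), attention_map(s_feat)
    region = sum((a - b) ** 2 for ra, rb in zip(ta, sa) for a, b in zip(ra, rb))
    return kd + alpha * region
```

With identical teacher and student outputs the loss is zero; a student whose activations fire in the wrong spatial regions is penalized even when its logits match, which is the intuition the abstract describes.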
