Search Results for author: Shitao Bei

Found 1 paper, 1 paper with code

Up to 100$\times$ Faster Data-free Knowledge Distillation

2 code implementations • 12 Dec 2021 • Gongfan Fang, Kanya Mo, Xinchao Wang, Jie Song, Shitao Bei, Haofei Zhang, Mingli Song

At the heart of our approach is a novel strategy to reuse the shared common features in training data so as to synthesize different data instances.

Data-free Knowledge Distillation
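
The abstract sentence above points at the idea behind the speedup: features common to all synthesized training instances are learned once and reused, so each new batch of synthetic data needs only a few cheap adaptation steps instead of a full optimization from scratch. Below is a minimal PyTorch sketch of that feature-reuse pattern, not the authors' actual implementation; the toy teacher, the `SharedGenerator` class, the inversion loss, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of feature reuse in data-free synthesis (illustrative only).
# The "teacher" here is an untrained stand-in for a real pretrained network.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedGenerator(nn.Module):
    """Generator whose conv trunk captures features common to all synthesized
    batches; only the latent codes and the cheap head vary per batch."""
    def __init__(self, z_dim=64, img_ch=3):
        super().__init__()
        self.fc = nn.Linear(z_dim, 128 * 4 * 4)
        self.trunk = nn.Sequential(   # shared part: reused across rounds
            nn.BatchNorm2d(128), nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, img_ch, 3, padding=1)  # cheap per-batch part

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 4, 4)
        return torch.tanh(self.head(self.trunk(h)))

teacher = nn.Sequential(  # stand-in for a pretrained, frozen teacher
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 10),
).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

gen = SharedGenerator()
# Common features are updated slowly and carried over between rounds.
slow_opt = torch.optim.Adam(
    [*gen.fc.parameters(), *gen.trunk.parameters()], lr=1e-4)

for round_idx in range(3):          # each round synthesizes one new batch
    gen.head.reset_parameters()     # only the cheap head restarts
    z = torch.randn(16, 64, requires_grad=True)
    fast_opt = torch.optim.Adam([z, *gen.head.parameters()], lr=1e-2)
    targets = torch.randint(0, 10, (16,))
    for _ in range(10):             # few steps suffice because the trunk
        logits = teacher(gen(z))    # already encodes the shared features
        loss = F.cross_entropy(logits, targets)  # toy inversion objective
        slow_opt.zero_grad(); fast_opt.zero_grad()
        loss.backward()
        slow_opt.step(); fast_opt.step()
    print(f"round {round_idx}: inversion loss {loss.item():.3f}")
```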
