Search Results for author: Zhipeng Di

Found 1 paper, 0 papers with code

A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation

no code implementations • 21 Feb 2022 • Dongqi Wang, Shengyu Zhang, Zhipeng Di, Xin Lin, Weihua Zhou, Fei Wu

A common problem in both pruning and distillation is determining the compressed architecture, i.e., the exact number of filters per layer and the layer configuration, in order to preserve most of the original model's capacity.
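To make the problem concrete, here is a minimal sketch of one common way to propose per-layer filter counts: ranking each convolutional filter by its L1 norm and keeping a fixed fraction. This is a generic illustration of the "compressed architecture" question, not the architecture slimming method proposed in the paper; the function name and the global `keep_ratio` budget are assumptions made for the example.

```python
# Hedged sketch: NOT the paper's method. Illustrates choosing how many
# filters to keep per layer via L1-norm importance (a standard baseline).
import torch
import torch.nn as nn

def propose_filter_counts(model: nn.Module, keep_ratio: float = 0.5) -> dict:
    """Return a hypothetical per-layer filter budget.

    `keep_ratio` is a single global budget; the paper instead determines
    the layer configuration itself, which this sketch does not reproduce.
    """
    counts = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            # L1 norm of each output filter's weights -> importance score
            importance = module.weight.detach().abs().sum(dim=(1, 2, 3))
            counts[name] = max(1, int(keep_ratio * importance.numel()))
    return counts

if __name__ == "__main__":
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3), nn.ReLU(),
        nn.Conv2d(16, 32, 3), nn.ReLU(),
    )
    # e.g. {'0': 8, '2': 16} -- half the filters kept in each conv layer
    print(propose_filter_counts(model, keep_ratio=0.5))
```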

Tasks: Knowledge Distillation, Model Compression, +1
