Search Results for author: Zhenyan Hou

Found 1 paper, 0 papers with code

A New Training Framework for Deep Neural Network

no code implementations • 12 Mar 2021 • Zhenyan Hou, Wenxuan Fan

Knowledge distillation is the process of transferring knowledge from a large model to a small model.

Knowledge Distillation
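
The abstract above refers to knowledge distillation in general terms; the paper's specific training framework is not detailed here. As a point of reference, below is a minimal sketch of the standard distillation objective (soft-target KL divergence plus hard-label cross-entropy), written in PyTorch. The function name, temperature `T`, and weight `alpha` are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the generic knowledge-distillation loss
# (soft targets + hard labels). Not the paper's proposed framework.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend soft-target KL divergence with hard-label cross-entropy.

    T     -- temperature used to soften both logit distributions (assumed value)
    alpha -- weight balancing the soft and hard terms (assumed value)
    """
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    # Random tensors stand in for real teacher/student model outputs.
    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student_logits, teacher_logits, labels))
```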
