Search Results for author: Ruicheng Li

Found 1 paper, 0 papers with code

Accelerating Large Scale Knowledge Distillation via Dynamic Importance Sampling

no code implementations • 3 Dec 2018 • Minghan Li, Tanli Zuo, Ruicheng Li, Martha White, Wei-Shi Zheng

Knowledge distillation is an effective technique that transfers knowledge from a large teacher model to a shallow student.

Tasks: Knowledge Distillation, Machine Translation, +2
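Since the listing gives only the opening sentence of the abstract, here is a minimal PyTorch sketch of the classic soft-target distillation loss (Hinton et al.'s formulation) that knowledge distillation methods like this one build on. It is not the paper's dynamic importance sampling scheme; the function name and the temperature and alpha defaults are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    Mixes KL divergence between temperature-softened teacher and
    student distributions with cross-entropy on the hard labels.
    Defaults for temperature and alpha are assumptions, not values
    taken from the paper above.
    """
    # Soften both output distributions with the temperature.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # Scale the KL term by T^2 so its gradients stay comparable in
    # magnitude to the cross-entropy term as the temperature changes.
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2

    # Standard supervised loss on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1.0 - alpha) * ce_term
```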
