30 Nov 2020 • Jeong-Hoe Ku, Jihun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee
This paper provides a selective survey of the knowledge distillation (KD) framework, intended to help researchers and practitioners apply it when developing new optimized models in the deep neural network field.