Search Results for author: Gaurav Pooniwala

Found 1 paper, 0 papers with code

A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models

no code implementations • 30 Nov 2020 • Jeong-Hoe Ku, Jihun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee

This paper aims to provide a selective survey of the knowledge distillation (KD) framework so that researchers and practitioners can take advantage of it for developing new optimized models in the deep neural network field.
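For context, the standard form of knowledge distillation trains a compact student network to match the temperature-softened output distribution of a larger teacher. The sketch below shows the canonical KD objective (a temperature-scaled KL term blended with the usual cross-entropy); it is only an illustrative example of the general framework the survey covers, not code from the paper, and names such as `temperature` and `alpha` are assumed hyperparameters.

```python
# Minimal sketch of the canonical knowledge-distillation loss
# (soft-target KL + hard-label cross-entropy). Illustrative only;
# not taken from the surveyed paper. `temperature` and `alpha`
# are assumed hyperparameters.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Blend a soft-target distillation term with the hard-label loss."""
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between softened teacher and student predictions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * hard
```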

Knowledge Distillation • Model Compression • +1