Search Results for author: SangJeong Lee

Found 3 papers, 0 papers with code

A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models

no code implementations · 30 Nov 2020 · Jeong-Hoe Ku, Jihun Oh, YoungYoon Lee, Gaurav Pooniwala, SangJeong Lee

This paper aims to provide a selective survey of the knowledge distillation (KD) framework so that researchers and practitioners can take advantage of it when developing new optimized models in the deep neural network field. (A sketch of the classic distillation loss follows this entry.)

Knowledge Distillation · Model Compression +1
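
Since the survey above has no code release, here is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015), one of the KD paradigms a survey of this kind typically covers. The temperature T and mixing weight alpha are illustrative assumptions, not values taken from the paper.

# Classic soft-target knowledge distillation loss: blend a softened
# teacher-to-student KL term with the usual hard-label cross-entropy.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Softened distributions; the T*T factor rescales soft-target gradients.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard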

Weight Equalizing Shift Scaler-Coupled Post-training Quantization

no code implementations · 13 Aug 2020 · Jihun Oh, SangJeong Lee, Meejeong Park, Pooni Walagaurav, Kiseok Kwon

As a result, our proposed method achieved a top-1 accuracy of 69.78% ~ 70.96% on MobileNets and showed robust performance across varying network models and tasks, competitive with channel-wise quantization results. (A generic per-channel quantization baseline is sketched after this entry.)

Quantization
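
For context, below is a minimal sketch of the generic per-channel min-max weight quantization baseline that the abstract compares against. This is not the authors' weight-equalizing shift scaler (the paper has no public code); the 8-bit width and symmetric range are illustrative assumptions.

# Standard per-channel symmetric "fake" quantization of a weight tensor,
# with one scale per output channel (dim 0). For illustration only.
import torch

def quantize_per_channel(weight: torch.Tensor, n_bits: int = 8) -> torch.Tensor:
    qmax = 2 ** (n_bits - 1) - 1                  # e.g. 127 for 8 bits
    flat = weight.reshape(weight.shape[0], -1)
    scale = flat.abs().max(dim=1).values / qmax   # per-channel scale
    scale = scale.clamp(min=1e-8)                 # avoid division by zero
    shape = (-1,) + (1,) * (weight.dim() - 1)     # broadcast over channel dim
    q = torch.round(weight / scale.view(shape)).clamp(-qmax, qmax)
    return q * scale.view(shape)                  # dequantized weights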
