Search Results for author: Yucheol Cho

Found 1 paper, 0 papers with code

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning

No code implementations · 23 Nov 2023 · Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim

Knowledge distillation (KD) improves the performance of efficient, lightweight models (i.e., the student model) by transferring knowledge from more complex models (i.e., the teacher model).
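The abstract describes the generic teacher-to-student transfer. As a reference point only, below is a minimal sketch of the standard logit-matching KD loss (Hinton et al.), not the paper's correlation-distance or pruning-based variant; the temperature T and weight alpha are illustrative values, and the function name is assumed.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard knowledge distillation loss (a sketch, not the paper's method):
    a weighted sum of cross-entropy on the hard labels and the KL divergence
    between temperature-softened teacher and student distributions."""
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)

    # KL term is scaled by T^2 to keep gradient magnitudes comparable.
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)

    # Ordinary supervised cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * distill + (1.0 - alpha) * ce
```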

Tasks: Data Augmentation, Knowledge Distillation, +1
