Search Results for author: Yechan Kim

Found 2 papers, 2 papers with code

TSPipe: Learn from Teacher Faster with Pipelines

1 code implementation • ICML 2022 • Hwijoon Lim, Yechan Kim, Sukmin Yun, Jinwoo Shin, Dongsu Han

The teacher-student (TS) framework, which trains a (student) network with the help of an auxiliary, superior (teacher) network, has become a popular training paradigm in many machine learning schemes since the seminal work on knowledge distillation (KD) for model compression and transfer learning.

Tasks: Knowledge Distillation, Self-Supervised Learning, +1
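
For context on the TS framework referenced in the abstract above, here is a minimal PyTorch sketch of the classic knowledge-distillation loss (Hinton et al.), which blends a softened teacher-student KL term with ordinary cross entropy. The temperature and alpha values are illustrative defaults, not settings from TSPipe, and the sketch does not capture the paper's actual contribution (pipelined TS training).

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label cross entropy."""
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between student and teacher, rescaled by T^2 as in Hinton et al.
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Ordinary cross entropy against the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, targets)
    # alpha controls how much the student listens to the teacher vs. the labels.
    return alpha * kd_term + (1.0 - alpha) * ce_term
```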

Imbalanced Image Classification with Complement Cross Entropy

2 code implementations • 4 Sep 2020 • Yechan Kim, Younkwan Lee, Moongu Jeon

Recently, deep learning models have achieved great success in computer vision applications, relying on large-scale class-balanced datasets.

Tasks: Classification, General Classification, +2
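
As a rough illustration of the idea the title suggests, the sketch below combines cross entropy with an entropy term computed over the incorrect (complement) classes, so that probability mass on wrong classes is pushed toward a flat distribution. This is an assumption about the general form only: the gamma weighting, the 1/(K-1) normalization, and the helper names are hypothetical, and the exact complement cross entropy loss is defined in the paper itself.

```python
import torch
import torch.nn.functional as F

def complement_entropy(logits, targets, eps=1e-8):
    """Entropy of the predicted distribution restricted to the incorrect classes."""
    probs = F.softmax(logits, dim=-1)
    num_classes = logits.size(-1)
    true_mask = F.one_hot(targets, num_classes).bool()
    # Probability mass assigned to the ground-truth class, per sample.
    p_true = probs.masked_select(true_mask).unsqueeze(-1)
    # Renormalize the remaining mass over the complement (incorrect) classes.
    p_comp = probs.masked_fill(true_mask, 0.0) / (1.0 - p_true + eps)
    entropy = -(p_comp * torch.log(p_comp + eps)).sum(dim=-1)
    return entropy.mean()

def complement_cross_entropy(logits, targets, gamma=1.0):
    """Cross entropy minus a weighted complement-entropy term (weighting is an assumption)."""
    ce = F.cross_entropy(logits, targets)
    # Rewarding high complement entropy flattens the scores of the wrong classes,
    # which is the intuition behind complement-style objectives.
    return ce - gamma * complement_entropy(logits, targets) / (logits.size(-1) - 1)
```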
