Search Results for author: Gibeom Park

Found 1 paper, 1 paper with code

Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation

1 code implementation · CVPR 2021 · Mingi Ji, Seungjae Shin, Seunghyun Hwang, Gibeom Park, Il-Chul Moon

Knowledge distillation is a method of transferring knowledge from a pretrained, complex teacher model to a student model, so that a smaller network can replace the large teacher network at deployment.
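The classic distillation objective behind this idea can be sketched as a KL divergence between temperature-softened teacher and student predictions. This is a minimal NumPy sketch of standard knowledge distillation (Hinton-style), not the paper's self-distillation method; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) over softened distributions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student matches the teacher exactly the loss is zero; any mismatch yields a positive penalty.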

Tasks: Data Augmentation, Object Detection, +4
