Search Results for author: Dongju Kim

Found 1 paper, 1 paper with code

Relational Knowledge Distillation

3 code implementations CVPR 2019 Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller.
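For context, the paper's distance-wise variant of this idea transfers relations between examples (pairwise distances in the embedding space) rather than individual outputs. The following is a minimal PyTorch sketch of such a distance-wise loss, assuming batch embeddings from teacher and student networks; the function names (`pairwise_distances`, `rkd_distance_loss`) are illustrative and are not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def pairwise_distances(embeddings):
    # Euclidean distance matrix between all pairs in the batch.
    dists = torch.cdist(embeddings, embeddings, p=2)
    # Normalize by the mean non-zero distance so teacher and
    # student distance scales are comparable.
    mean_dist = dists[dists > 0].mean()
    return dists / mean_dist

def rkd_distance_loss(student_emb, teacher_emb):
    """Distance-wise relational loss: match the student's
    pairwise-distance structure to the teacher's (Huber loss)."""
    with torch.no_grad():
        t_d = pairwise_distances(teacher_emb)
    s_d = pairwise_distances(student_emb)
    return F.smooth_l1_loss(s_d, t_d)

# Hypothetical usage with random embeddings:
teacher_emb = torch.randn(32, 128)  # batch of 32, 128-d teacher features
student_emb = torch.randn(32, 64)   # student embedding can be smaller
loss = rkd_distance_loss(student_emb, teacher_emb)
```

Because only the batch-by-batch distance matrices are compared, the student's embedding dimension need not match the teacher's, which is one appeal of relational distillation over matching outputs directly.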

Knowledge Distillation · Metric Learning
