Search Results for author: Ji-Yue Wang

Found 1 paper, 0 papers with code

Extending Label Smoothing Regularization with Self-Knowledge Distillation

no code implementations · 11 Sep 2020 · Ji-Yue Wang, Pei Zhang, Wen-feng Pang, Jie Li

The experimental results confirm that TC helps LsrKD and MrKD boost training, especially on networks where they otherwise fail.
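The listing names label smoothing regularization as the starting point of the paper's method. As a hedged illustration only (this is the standard label-smoothing cross-entropy, not the paper's LsrKD, MrKD, or TC variants, whose details are not given here), a minimal NumPy sketch:

```python
import numpy as np

def label_smoothing_loss(logits, target, eps=0.1):
    """Cross-entropy against a label-smoothed target distribution.

    logits: 1-D array of class scores; target: index of the true class;
    eps: smoothing strength (eps=0.0 recovers plain cross-entropy).
    """
    # numerically stable softmax
    z = logits - logits.max()
    p = np.exp(z) / np.exp(z).sum()
    K = logits.shape[0]
    # smoothed target: spread eps uniformly, keep (1 - eps) on the true class
    q = np.full(K, eps / K)
    q[target] += 1.0 - eps
    return float(-(q * np.log(p)).sum())

# On a confident correct prediction, smoothing raises the loss slightly,
# discouraging over-confident logits.
logits = np.array([4.0, 0.5, 0.1])
hard = label_smoothing_loss(logits, target=0, eps=0.0)
soft = label_smoothing_loss(logits, target=0, eps=0.1)
```

Self-knowledge distillation methods such as those in the paper go further by mixing the model's own soft predictions into the target distribution rather than a fixed uniform term; the sketch above shows only the label-smoothing baseline.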

Self-Knowledge Distillation
