Search Results for author: Chaoran Zhuge

Found 2 papers, 1 papers with code

Channel Distillation: Channel-Wise Attention for Knowledge Distillation

1 code implementation • 2 Jun 2020 • Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu

Knowledge distillation transfers the knowledge learned by the teacher network to the student network, so that the student has fewer parameters and lower computational cost while achieving accuracy close to the teacher's.

Knowledge Distillation
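The abstract above describes distilling a teacher's knowledge into a smaller student via channel-wise attention. As a rough illustration of that general idea, here is a minimal NumPy sketch: channel attention is approximated by global average pooling over spatial dimensions, and the distillation loss is the mean squared error between the teacher's and student's attention vectors. This is an assumption-laden simplification, not the paper's exact formulation.

```python
import numpy as np

def channel_attention(feat):
    # Approximate channel attention via global average pooling:
    # collapse spatial dims, (C, H, W) -> (C,).
    # (Hypothetical simplification; the paper's attention may differ.)
    return feat.mean(axis=(1, 2))

def channel_distillation_loss(teacher_feat, student_feat):
    # MSE between the teacher's and student's channel-attention vectors,
    # encouraging the student to attend to the same channels as the teacher.
    t = channel_attention(teacher_feat)
    s = channel_attention(student_feat)
    return float(np.mean((t - s) ** 2))

# Toy feature maps: 4 channels, 8x8 spatial resolution.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((4, 8, 8))
student = rng.standard_normal((4, 8, 8))
loss = channel_distillation_loss(teacher, student)
```

In a real training loop this term would be added to the student's task loss; identical features yield zero distillation loss.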

Attribute-guided Feature Extraction and Augmentation Robust Learning for Vehicle Re-identification

no code implementations • 13 May 2020 • Chaoran Zhuge, Yujie Peng, Yadong Li, Jiangbo Ai, Junru Chen

Vehicle re-identification is one of the core technologies of intelligent transportation systems and smart cities, but large intra-class diversity and inter-class similarity pose great challenges for existing methods.

Attribute Re-Ranking +1
