Search Results for author: Qisen Yang

Found 3 papers, 1 paper with code

Efficient Knowledge Distillation from Model Checkpoints

1 code implementation • 12 Oct 2022 • Chaofei Wang, Qisen Yang, Rui Huang, Shiji Song, Gao Huang

Knowledge distillation is an effective approach to learning compact models (students) under the supervision of large, strong models (teachers).

Knowledge Distillation
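The abstract describes the general distillation setup; a minimal sketch of the standard temperature-scaled distillation loss (Hinton et al.'s formulation, not the checkpoint-based method this paper proposes) looks like the following. All function names here are illustrative, not from the paper's code.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.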

Fine-Grained Few Shot Learning with Foreground Object Transformation

no code implementations • 13 Sep 2021 • Chaofei Wang, Shiji Song, Qisen Yang, Xiang Li, Gao Huang

As a data augmentation method, FOT can be conveniently applied to any existing few-shot learning algorithm and greatly improves its performance on FG-FSL tasks.

Data Augmentation · Few-Shot Learning +1

CAM-loss: Towards Learning Spatially Discriminative Feature Representations

no code implementations • ICCV 2021 • Chaofei Wang, Jiayu Xiao, Yizeng Han, Qisen Yang, Shiji Song, Gao Huang

The backbone of a traditional CNN classifier is generally regarded as a feature extractor, followed by a linear layer that performs the classification.

Few-Shot Learning · Knowledge Distillation +1
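The abstract's view of a classifier as "feature extractor + linear layer" can be sketched as follows. The backbone here is a deliberately trivial stand-in (per-row mean pooling); a real CNN backbone is a stack of convolutions, and the names are illustrative, not from the paper.

```python
def backbone(image):
    # Stand-in feature extractor: map a 2-D image (list of rows of floats)
    # to a fixed-size feature vector by averaging each row.
    return [sum(row) / len(row) for row in image]

def linear_head(features, weights, bias):
    # Linear classification layer: one score per class, w . f + b.
    return [sum(w * f for w, f in zip(ws, features)) + b
            for ws, b in zip(weights, bias)]

def classify(image, weights, bias):
    # Full pipeline: extract features, score each class, pick the argmax.
    scores = linear_head(backbone(image), weights, bias)
    return max(range(len(scores)), key=scores.__getitem__)
```

CAM-loss, as described in the paper, adds a spatial supervision signal on the backbone's feature maps rather than changing this head structure.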
