Search Results for author: Shiya Luo

Found 2 papers, 1 paper with code

Customizing Synthetic Data for Data-Free Student Learning

1 code implementation • 10 Jul 2023 • Shiya Luo, Defang Chen, Can Wang

Existing works generally synthesize data from the pre-trained teacher model to replace the original training data for student learning.

Data-free Knowledge Distillation
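
The abstract snippet above describes the generic data-free distillation setup: synthetic inputs are generated from the pre-trained teacher and stand in for the original training data. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the paper's actual method or code; the tiny linear models and adversarial generator objective are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative stand-ins: in practice the teacher is a pre-trained network,
# the student is a fresh lightweight model, and the generator maps noise to inputs.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)).eval()
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
generator = nn.Sequential(nn.Linear(100, 3 * 32 * 32), nn.Tanh())

opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)

for step in range(100):
    z = torch.randn(64, 100)

    # Generator step: synthesize samples on which student and teacher disagree,
    # so the synthetic data stays informative for the student.
    fake = generator(z).view(-1, 3, 32, 32)
    loss_g = -F.kl_div(F.log_softmax(student(fake), dim=1),
                       F.softmax(teacher(fake), dim=1),
                       reduction="batchmean")
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Student step: match the teacher's predictions on the synthetic data,
    # replacing the original (unavailable) training set.
    fake = generator(z).view(-1, 3, 32, 32).detach()
    loss_s = F.kl_div(F.log_softmax(student(fake), dim=1),
                      F.softmax(teacher(fake), dim=1),
                      reduction="batchmean")
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()
```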

Knowledge Distillation with Deep Supervision

no code implementations • 16 Feb 2022 • Shiya Luo, Defang Chen, Can Wang

Knowledge distillation aims to enhance the performance of a lightweight student model by exploiting the knowledge from a pre-trained cumbersome teacher model.

Knowledge Distillation • Transfer Learning
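
For reference, the vanilla distillation objective the abstract alludes to, a lightweight student mimicking a cumbersome pre-trained teacher, is sketched below. This is the standard Hinton-style KD loss, assumed here as background; the paper's deep-supervision variant, which additionally supervises intermediate layers, is not shown.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Generic KD loss: temperature-softened KL term plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale so gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Usage with random tensors standing in for real model outputs.
s = torch.randn(8, 10)   # student logits
t = torch.randn(8, 10)   # teacher logits
y = torch.randint(0, 10, (8,))
print(distillation_loss(s, t, y))
```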
