Search Results for author: Simiao Li

Found 2 papers, 0 papers with code

Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution

no code implementations · 3 Apr 2024 · Simiao Li, Yun Zhang, Wei Li, Hanting Chen, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) is a promising yet challenging model compression technique that transfers rich learning representations from a well-performing but cumbersome teacher model to a compact student model.

Tasks: Image Super-Resolution · Knowledge Distillation · +1
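
Since no implementation is released, the following is a minimal, generic sketch of output-level knowledge distillation for super-resolution in PyTorch. It is not the paper's multi-granularity mixture-of-priors method; the teacher and student modules, the L1 losses, and the weighting factor alpha are illustrative assumptions.

    # Illustrative sketch only: generic output distillation for SR,
    # NOT the paper's multi-granularity mixture-of-priors method.
    import torch
    import torch.nn.functional as F

    def distill_step(teacher, student, optimizer, lr_batch, hr_batch, alpha=0.5):
        """One KD step: the student fits both ground truth and the teacher."""
        teacher.eval()
        with torch.no_grad():
            sr_teacher = teacher(lr_batch)           # frozen teacher prediction
        sr_student = student(lr_batch)
        loss_gt = F.l1_loss(sr_student, hr_batch)    # supervised reconstruction
        loss_kd = F.l1_loss(sr_student, sr_teacher)  # imitate the teacher output
        loss = (1 - alpha) * loss_gt + alpha * loss_kd
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Here alpha trades off supervision from the ground-truth high-resolution images against imitation of the teacher; the actual paper combines priors at multiple granularities rather than a single output-level loss.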

Data Upcycling Knowledge Distillation for Image Super-Resolution

no code implementations · 25 Sep 2023 · Yun Zhang, Wei Li, Simiao Li, Hanting Chen, Zhijun Tu, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models.

Tasks: Image Super-Resolution · Knowledge Distillation · +1
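
Public code is likewise unavailable here, so the sketch below shows only one plausible reading of "data upcycling": a frozen teacher turns extra low-resolution inputs into pseudo-labeled training pairs for the student. This is an assumption for illustration, not the authors' actual pipeline, and all names are hypothetical.

    # Illustrative sketch only: one loose interpretation of "data upcycling",
    # NOT the authors' pipeline. The teacher model and batch are placeholders.
    import torch

    @torch.no_grad()
    def upcycle_batch(teacher, extra_lr_batch):
        """Turn unlabeled LR images into (LR, pseudo-HR) training pairs."""
        teacher.eval()
        pseudo_hr = teacher(extra_lr_batch)  # teacher output stands in for HR truth
        return extra_lr_batch, pseudo_hr

Pairs produced this way could be mixed into an ordinary KD loop such as the one sketched above, letting the student train on data the teacher has already "labeled".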
