Search Results for author: Zuhao Ge

Found 2 papers, 0 papers with code

Sampling to Distill: Knowledge Transfer from Open-World Data

no code implementations · 31 Jul 2023 · Yuzheng Wang, Zhaoyu Chen, Jie Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi

We introduce a low-noise representation to alleviate domain shift and build structured relationships among multiple data examples to exploit data knowledge.

Data-free Knowledge Distillation · Transfer Learning
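
As a rough, illustrative reading of the summary above (not the paper's actual method), the toy PyTorch sketch below keeps only open-world samples on which a pretrained teacher is confident, a crude stand-in for a "low-noise" selection criterion, and distills the teacher's softened predictions into a student. The toy models, confidence threshold, and temperature are all assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-ins for a pretrained teacher and a smaller student (assumed shapes).
teacher = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

open_world_batch = torch.randn(256, 32)   # unlabeled open-world data (random here)
temperature = 4.0                         # softening temperature (assumed value)
confidence_threshold = 0.6                # "low-noise" proxy: keep confident samples

for step in range(100):
    with torch.no_grad():
        t_logits = teacher(open_world_batch)
        t_probs = F.softmax(t_logits, dim=1)
        # Keep only samples the teacher is confident about, as a crude
        # substitute for a principled low-noise sampling strategy.
        keep = t_probs.max(dim=1).values > confidence_threshold
    if keep.sum() == 0:
        continue
    x = open_world_batch[keep]
    s_log_probs = F.log_softmax(student(x) / temperature, dim=1)
    t_soft = F.softmax(t_logits[keep] / temperature, dim=1)
    loss = F.kl_div(s_log_probs, t_soft, reduction="batchmean") * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```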

Explicit and Implicit Knowledge Distillation via Unlabeled Data

no code implementations · 17 Feb 2023 · Yuzheng Wang, Zuhao Ge, Zhaoyu Chen, Xian Liu, Chuangjia Ma, Yunquan Sun, Lizhe Qi

Data-free knowledge distillation is a challenging model compression task for scenarios in which the original training dataset is not available.

Data-free Knowledge Distillation
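
For context on the task description above, here is a minimal, hedged sketch of distilling a teacher into a student using only unlabeled data: an "explicit" term matches the teacher's softened class predictions and an "implicit" term matches intermediate features. This split, the feature access, and all hyperparameters are assumptions for illustration, not the method proposed in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Tiny MLP that also exposes its intermediate features."""
    def __init__(self, hidden):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(32, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, 10)

    def forward(self, x):
        feat = self.body(x)
        return self.head(feat), feat

teacher, student = SmallNet(hidden=64), SmallNet(hidden=64)
optimizer = torch.optim.SGD(student.parameters(), lr=0.01, momentum=0.9)
temperature, alpha = 4.0, 0.5          # assumed hyperparameters

unlabeled = torch.randn(128, 32)       # stand-in for an unlabeled batch

for step in range(50):
    with torch.no_grad():
        t_logits, t_feat = teacher(unlabeled)
    s_logits, s_feat = student(unlabeled)

    # "Explicit" knowledge: match the teacher's softened class distribution.
    explicit = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=1),
        F.softmax(t_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # "Implicit" knowledge: match intermediate representations.
    implicit = F.mse_loss(s_feat, t_feat)

    loss = alpha * explicit + (1 - alpha) * implicit
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```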
