Search Results for author: Hailin Zhang

Found 4 papers, 3 papers with code

Knowledge Distillation with the Reused Teacher Classifier

1 code implementation • 26 Mar 2022 • Defang Chen, Jian-Ping Mei, Hailin Zhang, Can Wang, Yan Feng, Chun Chen

Knowledge distillation aims to compress a powerful yet cumbersome teacher model into a lightweight student model without much sacrifice of performance.

Knowledge Distillation
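For reference, the snippet above describes the standard distillation objective. Below is a minimal sketch of that vanilla formulation (a softened-logit KL term mixed with cross-entropy; the temperature `T` and weight `alpha` are illustrative defaults, and this is the generic baseline rather than the paper's reused-classifier design):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Vanilla knowledge-distillation loss: weighted sum of the softened
    teacher-student KL divergence and the usual hard-label cross-entropy.
    T and alpha are illustrative, not values from the paper."""
    # Soften both distributions with temperature T; scale by T^2 so the
    # soft-target gradients stay comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```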

Confidence-Aware Multi-Teacher Knowledge Distillation

1 code implementation • 30 Dec 2021 • Hailin Zhang, Defang Chen, Can Wang

Knowledge distillation was initially introduced to provide additional supervision from a single teacher model when training a student model.

Knowledge Distillation • Transfer Learning
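To give intuition for what "confidence-aware" weighting across several teachers might look like, here is a hedged sketch in which each teacher's soft targets are weighted by the probability it assigns to the ground-truth label. That proxy is an assumption for illustration, not necessarily the weighting scheme used in the paper:

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels, T=4.0):
    """Hypothetical confidence-weighted multi-teacher distillation.
    Each teacher is weighted per sample by its softmax probability on
    the true class (one plausible confidence proxy)."""
    # Confidence of each teacher on the ground-truth label: (num_teachers, batch)
    confidences = torch.stack([
        F.softmax(t, dim=1).gather(1, labels.unsqueeze(1)).squeeze(1)
        for t in teacher_logits_list
    ])
    # Normalize across teachers so the per-sample weights sum to one.
    weights = confidences / confidences.sum(dim=0, keepdim=True)

    log_p_student = F.log_softmax(student_logits / T, dim=1)
    loss = 0.0
    for w, t in zip(weights, teacher_logits_list):
        # Per-sample KL between softened teacher and student distributions.
        kl = F.kl_div(log_p_student, F.softmax(t / T, dim=1),
                      reduction="none").sum(dim=1)
        loss = loss + (w * kl).mean()
    return loss * (T * T)
```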

Coexistence under hierarchical resource exploitation: the role of R*-preemption tradeoff

no code implementations • 22 Aug 2019 • Man Qi, Niv DeMalach, Tao Sun, Hailin Zhang

Thus, we developed an extension of resource competition theory to investigate partial and total preemption (in the latter, the preemptor is unaffected by species with lower preemption rank).
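For context, the classical R* rule that the paper extends can be written as follows (textbook notation with Monod resource uptake; the symbols μ_i, K_i, m_i are the standard maximum growth rate, half-saturation constant, and mortality rate, not necessarily the authors' exact model):

```latex
% Per-capita dynamics of consumer i on a shared resource R:
\[
\frac{dN_i}{dt} = N_i\!\left(\frac{\mu_i R}{K_i + R} - m_i\right),
\qquad
R_i^{*} = \frac{m_i K_i}{\mu_i - m_i},
\]
% where R_i^* is the break-even resource level at which growth balances
% mortality. Under purely exploitative competition the species with the
% lowest R_i^* excludes the others; the paper's tradeoff pits this low-R*
% advantage against a species' preemption rank.
```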
