Search Results for author: An Qin

Found 1 papers, 1 papers with code

Large-scale Knowledge Distillation with Elastic Heterogeneous Computing Resources

1 code implementation · 14 Jul 2022 · Ji Liu, Daxiang Dong, Xi Wang, An Qin, Xingjian Li, Patrick Valduriez, Dejing Dou, Dianhai Yu

Although more layers and more parameters generally improve a model's accuracy, such big models have high computational complexity and large memory requirements, which exceed the capacity of small devices for inference and incur long training times.

Knowledge Distillation
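The knowledge distillation named in the title trains a small student model to match a large teacher's softened output distribution. Below is a minimal pure-Python sketch of the standard temperature-softened distillation loss; the logits, class count, and temperature value are illustrative assumptions, not taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits by temperature before normalizing.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between teacher and student distributions,
    # both softened by the same temperature (a common formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative logits for a hypothetical 3-class problem.
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
print(distillation_loss(teacher, student))
```

The loss is zero when the student exactly matches the teacher and positive otherwise; in practice it is usually combined with the ordinary cross-entropy on ground-truth labels.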
