1 code implementation • COLING 2022 • Xiaoqin Chang, Sophia Yat Mei Lee, Suyang Zhu, Shoushan Li, Guodong Zhou
Knowledge distillation is an effective method for transferring knowledge from a large pre-trained teacher model to a compact student model.
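As a rough illustration of the idea (not this paper's specific method), teacher-to-student transfer is commonly done by matching the student's temperature-softened output distribution to the teacher's via a KL-divergence loss. The function names and the temperature value below are illustrative assumptions:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    (the T^2 factor keeps gradient magnitudes comparable across T)."""
    p = softmax(teacher_logits, temperature)   # soft teacher targets
    q = softmax(student_logits, temperature)   # soft student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))) * temperature ** 2)

# A student that copies the teacher's logits incurs (near-)zero loss;
# a disagreeing student incurs a strictly larger one.
loss_same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
loss_diff = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In practice this soft-target loss is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.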