Search Results for author: Xiongfeng Zheng

Found 1 paper, 0 papers with code

An Empirical Study of Uniform-Architecture Knowledge Distillation in Document Ranking

no code implementations • 8 Feb 2023 • Xubo Qin, Xiyuan Liu, Xiongfeng Zheng, Jie Liu, Yutao Zhu

Specifically, when the student models use a cross-encoder architecture, a pairwise loss over hard labels is critical for training them, whereas distillation objectives on intermediate Transformer layers may hurt performance.
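As a rough illustration of the pairwise hard-label objective the abstract alludes to, here is a minimal PyTorch sketch. The student architecture, margin value, and input representations below are assumptions for illustration only, not details taken from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical cross-encoder student: maps a joint (query, document)
# representation to a single relevance score. The paper's actual
# architecture is not reproduced here.
class CrossEncoderStudent(nn.Module):
    def __init__(self, input_dim=768, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden), nn.ReLU())
        self.score = nn.Linear(hidden, 1)

    def forward(self, pair_repr):
        return self.score(self.encoder(pair_repr)).squeeze(-1)

# Pairwise loss over hard labels: the relevant document should score
# higher than the irrelevant one by at least a margin (value assumed).
pairwise_loss = nn.MarginRankingLoss(margin=1.0)

student = CrossEncoderStudent()
pos_repr = torch.randn(8, 768)  # batch of (query, relevant doc) representations
neg_repr = torch.randn(8, 768)  # batch of (query, irrelevant doc) representations

s_pos = student(pos_repr)
s_neg = student(neg_repr)
target = torch.ones_like(s_pos)  # 1 means the first input should rank higher
loss = pairwise_loss(s_pos, s_neg, target)
loss.backward()
```

This is only a sketch of a standard pairwise margin loss; the paper's exact loss formulation and training setup may differ.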

Document Ranking · Knowledge Distillation
