1 code implementation • 21 Nov 2022 • Zhen Tian, Ting Bai, Zibin Zhang, Zhiyuan Xu, Kangyi Lin, Ji-Rong Wen, Wayne Xin Zhao
Some recent knowledge distillation-based methods transfer knowledge from complex teacher models to shallow student models to accelerate online model inference.
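To make the teacher-to-student transfer concrete, here is a minimal PyTorch sketch of logit-level knowledge distillation. The model sizes, loss weighting, and the `distillation_loss` helper are illustrative assumptions, not the paper's actual architecture or objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical complex teacher and shallow student; sizes are illustrative only.
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 1))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))

def distillation_loss(x, y, alpha=0.5):
    """Mix the supervised loss with a term pulling the student's
    logits toward the frozen teacher's logits (assumed weighting)."""
    with torch.no_grad():                 # teacher is fixed during distillation
        teacher_logit = teacher(x)
    student_logit = student(x)
    hard = F.binary_cross_entropy_with_logits(student_logit, y)  # ground-truth labels
    soft = F.mse_loss(student_logit, teacher_logit)              # mimic teacher outputs
    return alpha * hard + (1 - alpha) * soft

x = torch.randn(8, 32)                    # toy batch: 8 examples, 32 features
y = torch.randint(0, 2, (8, 1)).float()
loss = distillation_loss(x, y)
loss.backward()                           # only the student receives gradients
```

At serving time only the small student is deployed, which is what yields the inference speedup.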
1 code implementation • 28 Oct 2022 • Yanyan Shen, Lifan Zhao, Weiyu Cheng, Zibin Zhang, Wenwen Zhou, Kangyi Lin
Specifically, we employ a shared predictor to infer basis user preferences; the shared predictor acquires global preference knowledge from the interactions of different users.
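A possible reading of this design, sketched below under stated assumptions: one predictor shared across all users emits K "basis" preference scores, and a lightweight per-user gate mixes them, so global knowledge accumulates in the shared weights. The class name, shapes, and gating mechanism are assumptions for illustration, not the paper's exact model.

```python
import torch
import torch.nn as nn

class BasisPreferencePredictor(nn.Module):
    """Illustrative sketch: a predictor shared by all users infers K basis
    preference scores; a per-user gate combines them into one score."""
    def __init__(self, dim=32, num_bases=4):
        super().__init__()
        self.shared = nn.Sequential(                 # shared across users:
            nn.Linear(dim * 2, 64), nn.ReLU(),       # learns global preference
            nn.Linear(64, num_bases))                # knowledge as K basis scores
        self.gate = nn.Linear(dim, num_bases)        # user-conditioned mixing weights

    def forward(self, user_emb, item_emb):
        bases = self.shared(torch.cat([user_emb, item_emb], dim=-1))  # (B, K)
        weights = torch.softmax(self.gate(user_emb), dim=-1)          # (B, K)
        return (weights * bases).sum(-1)                              # (B,) score

model = BasisPreferencePredictor()
score = model(torch.randn(8, 32), torch.randn(8, 32))  # batch of 8 user-item pairs
```

Because the basis scores are produced by weights trained on every user's interactions, users with few interactions still benefit from the globally learned preference structure.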