2 code implementations • 17 Feb 2023 • Jiaxi Tang, Yoel Drori, Daryl Chang, Maheswaran Sathiamoorthy, Justin Gilmer, Li Wei, Xinyang Yi, Lichan Hong, Ed H. Chi
Recommender systems play an important role in many content platforms.
1 code implementation • 11 Aug 2020 • Jiaxi Tang, Hongyi Wen, Ke Wang
Recommender systems play an important role in modern information and e-commerce applications.
no code implementations • 10 Feb 2020 • Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
Knowledge Distillation (KD) is a model-agnostic technique for improving model quality under a fixed capacity budget.
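The standard KD recipe (not specific to this paper) blends hard-label cross-entropy with a soft cross-entropy against the teacher's temperature-smoothed distribution. A minimal sketch; the function name `kd_loss` and the `T`/`alpha` defaults are illustrative assumptions:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Hard term: cross-entropy of the student against the true labels.
    p_student = softmax(student_logits, T=1.0)
    hard = -np.log(p_student[np.arange(len(labels)), labels] + 1e-12).mean()
    # Soft term: cross-entropy against the teacher's smoothed distribution.
    p_teacher = softmax(teacher_logits, T=T)
    log_q = np.log(softmax(student_logits, T=T) + 1e-12)
    soft = -(p_teacher * log_q).sum(axis=-1).mean()
    return alpha * hard + (1 - alpha) * soft
```

A higher temperature `T` spreads the teacher's probability mass, exposing its relative preferences over non-target classes to the student.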
no code implementations • 22 Feb 2019 • Jiaxi Tang, Francois Belletti, Sagar Jain, Minmin Chen, Alex Beutel, Can Xu, Ed H. Chi
Our approach employs a mixture of models, each with a different temporal range.
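One generic way to combine experts that each cover a different temporal range (e.g., recent clicks vs. long-term history) is a softmax-gated mixture over their item scores. This is a simplified sketch of the general idea, not the paper's exact architecture; `blend` and its arguments are hypothetical:

```python
import numpy as np

def blend(expert_scores, gate_logits):
    # expert_scores: (n_experts, n_items) item scores, one row per
    # temporal-range expert; gate_logits: (n_experts,) mixture logits.
    g = np.asarray(gate_logits, dtype=float)
    g = np.exp(g - g.max())
    g = g / g.sum()          # softmax mixture weights
    s = np.asarray(expert_scores, dtype=float)
    return (g[:, None] * s).sum(axis=0)
```

In practice the gate would be learned and could condition on the user's context, so the model can lean on short-range experts for bursty sessions and long-range ones for stable tastes.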
5 code implementations • 19 Sep 2018 • Jiaxi Tang, Ke Wang
Top-$N$ sequential recommendation models each user as a sequence of items interacted with in the past and aims to predict the top-$N$ ranked items that the user will likely interact with in the near future.
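Sequential recommenders in this setting are typically trained on sliding windows over each user's item sequence: the last $L$ items form the input and the next $T$ items are the targets. A small sketch of that preprocessing step, with hypothetical names and defaults:

```python
def sliding_windows(seq, L=3, T=1):
    # Turn one user's chronological item sequence into
    # (input window of L items, next T target items) training pairs.
    pairs = []
    for i in range(len(seq) - L - T + 1):
        pairs.append((seq[i:i + L], seq[i + L:i + L + T]))
    return pairs
```

For example, a user sequence `[1, 2, 3, 4, 5]` with `L=3, T=1` yields the pairs `([1, 2, 3], [4])` and `([2, 3, 4], [5])`.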
1 code implementation • 19 Sep 2018 • Jiaxi Tang, Ke Wang
We propose a KD technique for learning-to-rank problems, called ranking distillation (RD).
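The core idea of ranking distillation is to treat the teacher's top-$K$ ranked items as additional weak positives when training the compact student ranker. A heavily simplified pointwise sketch (the paper's formulation weights teacher items by position; the uniform weighting and the name `rd_loss` here are illustrative assumptions):

```python
import numpy as np

def rd_loss(student_scores, positive_items, teacher_topk, alpha=0.5):
    # student_scores: (n_items,) scores from the small student model.
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    # Hard term: push up the ground-truth positive items.
    hard = -np.log(sig(student_scores[positive_items]) + 1e-12).mean()
    # Distillation term: also push up the teacher's top-K items,
    # down-weighted by alpha (uniformly, for simplicity).
    soft = -np.log(sig(student_scores[teacher_topk]) + 1e-12).mean()
    return hard + alpha * soft
```

The student thus learns from the teacher's ranking behavior without matching its full score distribution, keeping inference cost at the student's size.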