Attention over Self-attention: Intention-aware Re-ranking with Dynamic Transformer Encoders for Recommendation

14 Jan 2022 · Zhuoyi Lin, Sheng Zang, Rundong Wang, Zhu Sun, J. Senthilnath, Chi Xu, Chee-Keong Kwoh

Re-ranking models refine the item recommendation lists produced by an upstream global ranking model and have proven effective in improving recommendation quality. However, most existing re-ranking solutions learn only from implicit feedback with a shared prediction model, thereby ignoring inter-item relationships under diverse user intentions. In this paper, we propose a novel Intention-aware Re-ranking Model with Dynamic Transformer Encoder (RAISE), which performs user-specific prediction for each individual user based on her intentions. Specifically, we first mine latent user intentions from text reviews with an intention discovering module (IDM). By differentiating the importance of review information with a co-attention network, IDM explicitly models the latent user intention for each user-item pair. We then introduce a dynamic transformer encoder (DTE) that captures user-specific inter-item relationships among the item candidates by seamlessly incorporating the latent user intentions learned by IDM. Built on top of existing recommendation engines, RAISE thus yields not only more personalized recommendations but also the corresponding explanations. Empirical studies on four public datasets demonstrate the superiority of RAISE, with relative improvements of up to 13.95%, 9.60%, and 13.03% in Precision@5, MAP@5, and NDCG@5, respectively.
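Since no implementation has been released, the sketch below is an illustrative PyTorch reading of the abstract, not the authors' method: the module names, dimensions, and the FiLM-style conditioning used to make the transformer encoder "dynamic" are all assumptions. It shows one plausible way the two components could fit together: a co-attention IDM that distills an intention vector from user and item review embeddings, and a DTE layer whose self-attention over the candidate list is modulated by that intention.

```python
# Illustrative sketch only. All names (IntentionDiscoveringModule, embed_dim,
# the FiLM conditioning, ...) are hypothetical readings of the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntentionDiscoveringModule(nn.Module):
    """IDM sketch: co-attention between user and item review embeddings,
    producing a latent intention vector for a user-item pair."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.affinity = nn.Parameter(torch.empty(embed_dim, embed_dim))
        nn.init.xavier_uniform_(self.affinity)
        self.proj = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, user_reviews, item_reviews):
        # user_reviews: (B, Ru, D); item_reviews: (B, Ri, D)
        # Affinity between every pair of user/item reviews.
        aff = torch.tanh(user_reviews @ self.affinity @ item_reviews.transpose(1, 2))
        user_attn = F.softmax(aff.max(dim=2).values, dim=1)   # (B, Ru)
        item_attn = F.softmax(aff.max(dim=1).values, dim=1)   # (B, Ri)
        user_vec = (user_attn.unsqueeze(-1) * user_reviews).sum(dim=1)  # (B, D)
        item_vec = (item_attn.unsqueeze(-1) * item_reviews).sum(dim=1)  # (B, D)
        # Latent intention for this user-item pair.
        return torch.tanh(self.proj(torch.cat([user_vec, item_vec], dim=-1)))


class DynamicTransformerEncoderLayer(nn.Module):
    """DTE sketch: self-attention over the candidate list whose inputs are
    modulated (FiLM-style scale/shift) by the intention vector, so the
    learned inter-item relationships become user-specific."""

    def __init__(self, embed_dim: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        self.film = nn.Linear(embed_dim, 2 * embed_dim)  # intention -> (scale, shift)
        self.norm1 = nn.LayerNorm(embed_dim)
        self.norm2 = nn.LayerNorm(embed_dim)
        self.ffn = nn.Sequential(
            nn.Linear(embed_dim, 4 * embed_dim), nn.ReLU(),
            nn.Linear(4 * embed_dim, embed_dim),
        )

    def forward(self, items, intention):
        # items: (B, L, D) candidate-item embeddings; intention: (B, D)
        scale, shift = self.film(intention).chunk(2, dim=-1)
        conditioned = items * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1)
        attn_out, _ = self.attn(conditioned, conditioned, items)
        items = self.norm1(items + attn_out)
        return self.norm2(items + self.ffn(items))


if __name__ == "__main__":
    idm = IntentionDiscoveringModule(embed_dim=64)
    dte = DynamicTransformerEncoderLayer(embed_dim=64)
    intent = idm(torch.randn(2, 5, 64), torch.randn(2, 7, 64))  # 5 user / 7 item reviews
    refined = dte(torch.randn(2, 10, 64), intent)               # 10 candidate items
    print(refined.shape)  # torch.Size([2, 10, 64])
```

For brevity the demo applies one intention vector to the whole candidate list; the paper models an intention per user-item pair, so conditioning each candidate on its own intention would be a more faithful refinement.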
