Introducing Self-Attention to Target Attentive Graph Neural Networks

4 Jul 2021  ·  Sai Mitheran, Abhinav Java, Surya Kant Sahu, Arshad Shaikh

Session-based recommendation systems suggest relevant items to users by modeling user behavior and preferences from short-term anonymous sessions. Existing methods leverage Graph Neural Networks (GNNs) that propagate and aggregate information from neighboring nodes, i.e., local message passing. Such graph-based architectures have representational limits: a single sub-graph is prone to overfitting sequential dependencies instead of accounting for complex transitions between items across different sessions. We propose a new technique that combines a Transformer with a target attentive GNN. This allows richer representations to be learned, which translates into empirical performance gains over a vanilla target attentive GNN. Our experimental results and ablation studies show that the proposed method is competitive with existing methods on real-world benchmark datasets, improving on graph-based hypotheses. Code is available at https://github.com/The-Learning-Machines/SBR
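The core idea above, contextualizing GNN-produced item embeddings with self-attention before a target-attentive readout, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation; the single-head attention, the weight shapes, and the function names are assumptions for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    # H: (n_items, d) session item embeddings, e.g. the output of
    # local GNN message passing; self-attention lets every item attend
    # to every other item, capturing transitions beyond the sub-graph.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return A @ V

def target_attention(H, target):
    # target-attentive readout: weight session items by their relevance
    # to a candidate target item, then pool into a session embedding.
    weights = softmax(H @ target)
    return weights @ H

rng = np.random.default_rng(0)
d = 8
H = rng.normal(size=(5, d))                      # 5 items in a session
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Z = self_attention(H, Wq, Wk, Wv)                # contextualized items, (5, d)
s = target_attention(Z, rng.normal(size=d))      # session embedding, (d,)
print(Z.shape, s.shape)
```

In the full model, the session embedding would be scored against all candidate item embeddings to rank recommendations; multi-head attention and feed-forward layers (as in a standard Transformer block) replace the single-head sketch here.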


Datasets

Task                           Dataset        Model    Metric  Value  Global Rank
Session-Based Recommendations  Diginetica     TAGNN++  MRR@20  17.93  #8
Session-Based Recommendations  Diginetica     TAGNN++  Hit@20  51.86  #7
Session-Based Recommendations  yoochoose1/64  TAGNN++  MRR@20  31.57  #5
Session-Based Recommendations  yoochoose1/64  TAGNN++  HR@20   71.91  #3

Methods