Explainable Recommendation
31 papers with code • 2 benchmarks • 2 datasets
Most implemented papers
KGAT: Knowledge Graph Attention Network for Recommendation
To provide more accurate, diverse, and explainable recommendations, it is essential to go beyond modeling user-item interactions and take side information into account.
Learning Heterogeneous Knowledge Base Embeddings for Explainable Recommendation
Specifically, we propose a knowledge-base representation learning framework that embeds heterogeneous entities for recommendation; based on the embedded knowledge base, a soft matching algorithm generates personalized explanations for the recommended items.
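The soft-matching idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's model: entity names, dimensions, and the `soft_match` helper are all invented here, and the score is plain cosine similarity between randomly initialized embeddings standing in for learned ones.

```python
import numpy as np

# Hypothetical illustration: entities (brands, categories, features) live in a
# shared embedding space; "soft matching" scores how closely an item's
# attribute entities align with entities from the user's history, and the
# best-matching entity is used to phrase the explanation.
rng = np.random.default_rng(0)
entity_emb = {name: rng.normal(size=8) for name in
              ["brand:acme", "category:camera", "feature:zoom", "feature:battery"]}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def soft_match(user_entities, item_entities):
    """Score each item entity by its best similarity to any user entity."""
    scores = {ie: max(cosine(entity_emb[ie], entity_emb[ue]) for ue in user_entities)
              for ie in item_entities}
    return max(scores, key=scores.get)

# Which of the item's entities best matches what this user cares about?
best = soft_match(["feature:zoom"], ["brand:acme", "feature:battery"])
```

Because the matching is soft (continuous similarity rather than exact overlap), an explanation can be produced even when the user's history and the item share no entity verbatim.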
Counterfactual Explainable Recommendation
Technically, for each item recommended to each user, CountER formulates a joint optimization problem that generates minimal changes to the item's aspects so as to create a counterfactual item on which the recommendation decision is reversed.
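A toy sketch of the counterfactual idea, not the paper's solver: here a greedy one-aspect perturbation of a linear scorer stands in for CountER's joint optimization, and the preference vector, aspect values, and threshold are all made up for illustration.

```python
import numpy as np

user_pref = np.array([0.9, 0.1, 0.5])     # hypothetical aspect preferences
item_aspects = np.array([0.8, 0.3, 0.7])  # hypothetical aspect qualities
threshold = 0.9                           # score above this => recommended

def score(aspects):
    return float(user_pref @ aspects)

assert score(item_aspects) > threshold    # the item is currently recommended

# Greedy counterfactual: reduce the single most influential aspect just
# enough to reverse the decision, keeping the change (the explanation) small.
influence = user_pref * item_aspects
target = int(np.argmax(influence))
delta = np.zeros_like(item_aspects)
delta[target] = -(score(item_aspects) - threshold) / user_pref[target] - 1e-6

flipped = score(item_aspects + delta) <= threshold
# delta now reads as: "had aspect `target` been this much worse,
# the item would not have been recommended."
```

The size of `delta` is exactly what makes the explanation credible: the smaller the change needed to flip the decision, the more that aspect explains the recommendation.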
Tower Bridge Net (TB-Net): Bidirectional Knowledge Graph Aware Embedding Propagation for Explainable Recommender Systems
Recently, neural network-based models have been widely used in recommender systems (RS).
Explainable Recommendation via Multi-Task Learning in Opinionated Text Data
Explaining automatically generated recommendations allows users to make more informed and accurate decisions about which results to use, thereby improving their satisfaction.
Jointly Learning Explainable Rules for Recommendation with Knowledge Graph
The framework encourages the two modules to complement each other in generating effective and explainable recommendations: 1) inductive rules, mined from item-centric knowledge graphs, summarize common multi-hop relational patterns for inferring item associations and provide human-readable explanations for model predictions; 2) the recommendation module can be augmented by the induced rules and thus generalizes better when dealing with the cold-start issue.
Reinforcement Knowledge Graph Reasoning for Explainable Recommendation
To this end, we propose a method called Policy-Guided Path Reasoning (PGPR), which couples recommendation and interpretability by providing actual paths in a knowledge graph.
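The path-as-explanation idea can be shown on a toy knowledge graph. To keep this self-contained, a plain breadth-first search stands in for PGPR's learned policy, and the graph, entity names, and `find_path` helper are invented for illustration; the point is only that an explicit user-to-item path doubles as the explanation.

```python
from collections import deque

# Toy knowledge graph: edges as (head, relation, tail).
edges = [
    ("user:alice", "purchased", "item:camera"),
    ("item:camera", "produced_by", "brand:acme"),
    ("brand:acme", "produces", "item:lens"),
]
adj = {}
for h, r, t in edges:
    adj.setdefault(h, []).append((r, t))

def find_path(start, goal, max_hops=3):
    """Return one relation path from start to goal, or None."""
    queue = deque([(start, [])])
    while queue:
        node, path = queue.popleft()
        if node == goal:
            return path
        if len(path) < max_hops:
            for r, t in adj.get(node, []):
                queue.append((t, path + [(node, r, t)]))
    return None

path = find_path("user:alice", "item:lens")
# path reads: alice purchased a camera, produced by acme, which produces the lens
```

In PGPR the policy learns which outgoing edge to follow at each hop instead of exhaustively searching, but the returned artifact is the same kind of object: a concrete path that justifies recommending the end item.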
Hybrid Deep Embedding for Recommendations with Dynamic Aspect-Level Explanations
Particularly, as the aspect preferences of users and the aspect qualities of items are learned automatically, HDE is able to capture the impact of aspects that are not mentioned in a user's or an item's reviews.
Synthesizing Aspect-Driven Recommendation Explanations from Reviews
Explanations help to make sense of recommendations, increasing the likelihood of adoption.
Path-Based Reasoning over Heterogeneous Networks for Recommendation via Bidirectional Modeling
Despite their effectiveness, these models often face the following limitations: (1) most prior path-based reasoning models consider only the influence of predecessor nodes on subsequent nodes when modeling a sequence, ignoring the reciprocity between nodes in a path; (2) the weights of nodes within the same path instance are usually assumed to be constant, whereas varying them would allow more flexible and expressive modeling; (3) user-item interactions are noisy, yet they are often exploited indiscriminately.