The dominant paradigm for relation prediction in knowledge graphs involves learning and operating on latent representations (i.e., embeddings) of entities and relations.
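This embedding paradigm can be illustrated with a minimal sketch, assuming a TransE-style scoring function (where a true triple (h, r, t) should satisfy h + r ≈ t in vector space); the entity and relation names here are hypothetical toy data, not from any benchmark:

```python
import numpy as np

# Toy embeddings for a handful of hypothetical entities and one relation.
rng = np.random.default_rng(0)
dim = 8
entities = {e: rng.normal(size=dim) for e in ["paris", "france", "berlin", "germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def score(h, r, t):
    """TransE-style score: negative L2 distance of h + r from t (higher is better)."""
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

# Link prediction as ranking: answer the query (paris, capital_of, ?)
# by scoring every candidate tail entity.
candidates = ["france", "germany", "berlin"]
ranked = sorted(candidates, key=lambda t: score("paris", "capital_of", t), reverse=True)
print(ranked)
```

In practice the embeddings are trained so that observed triples score higher than corrupted ones; here they are random, so the ranking is only illustrative of the mechanism.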
Pre-trained language representation models (PLMs) struggle to capture factual knowledge from text.
Despite the importance of inductive link prediction, most previous work has focused on transductive link prediction and cannot handle previously unseen entities.
Ranked #5 on Link Prediction on WN18RR
However, the extent to which these representations learned for link prediction generalize to other tasks is unclear.
Ranked #1 on Inductive knowledge graph completion on WN18RR-ind
Logical rules are a popular knowledge representation language in many domains, representing background knowledge and encoding information that can be derived from given facts in a compact form.
Ranked #1 on Inductive logic programming on RuDaS
Many systems have been developed in recent years to mine logical rules from large-scale Knowledge Graphs (KGs), on the grounds that representing regularities as rules enables both the interpretable inference of new facts, and the explanation of known facts.
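The inference-and-explanation role of mined rules can be sketched with one round of forward chaining over a toy KG, assuming a single hypothetical Horn rule capital_of(X, Y) ∧ located_in(Y, Z) → located_in(X, Z); the facts and relation names are illustrative, not from any of the systems above:

```python
# A tiny KG as a set of (head, relation, tail) triples.
facts = {
    ("paris", "capital_of", "france"),
    ("france", "located_in", "europe"),
}

def apply_rule(facts):
    """Apply capital_of(X, Y) & located_in(Y, Z) -> located_in(X, Z) once,
    returning only the newly derived facts."""
    derived = set()
    for (x, r1, y) in facts:
        for (y2, r2, z) in facts:
            if r1 == "capital_of" and r2 == "located_in" and y == y2:
                derived.add((x, "located_in", z))
    return derived - facts

print(apply_rule(facts))  # derives ("paris", "located_in", "europe")
```

The derived triple is interpretable by construction: the rule plus the two supporting facts constitute an explanation of the new fact, which is exactly the benefit the rule-mining systems above claim over purely latent models.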