We present a novel extension to embedding-based knowledge graph completion models which enables them to perform open-world link prediction, i.e., to predict facts for entities unseen during training based on their textual descriptions.
In this paper, we focus on measures that leverage structural properties of the knowledge hierarchy graph to assess the temporal changes.
The recent proliferation of knowledge graphs (KGs), coupled with incomplete or partial information in the form of missing relations (links) between entities, has spurred extensive research on knowledge base completion (also known as relation prediction).
In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings.
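The core operation behind such hypercomplex embeddings is the quaternion Hamilton product, which composes a subject embedding with a relation embedding before comparing the result to the object. The following is a minimal illustrative sketch (it is not the paper's implementation; the quaternion values are toy examples):

```python
import numpy as np

def hamilton(q, p):
    # Hamilton product of quaternions q = (a, b, c, d) and p = (e, f, g, h).
    a, b, c, d = q
    e, f, g, h = p
    return np.array([
        a*e - b*f - c*g - d*h,
        a*f + b*e + c*h - d*g,
        a*g - b*h + c*e + d*f,
        a*h + b*g - c*f + d*e,
    ])

# Rotate the subject embedding by the relation quaternion, then score
# against the object embedding with an inner product (toy values).
subj = np.array([1.0, 0.0, 0.0, 0.0])   # identity quaternion
rel = np.array([0.0, 1.0, 0.0, 0.0])    # the quaternion unit i
obj = np.array([0.0, 1.0, 0.0, 0.0])

rotated = hamilton(subj, rel)           # identity * i = i
score = float(rotated @ obj)            # 1.0: the triple is scored as plausible
```

Because the Hamilton product is non-commutative, the relation acts as a rotation with more degrees of freedom than a complex-valued phase, which is the extra expressiveness the sentence above refers to.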
In this paper, we jointly learn the model of recommendation and knowledge graph completion.
This limitation is expected to become more stringent as existing knowledge graphs, which are already huge, keep steadily growing in scale.
Knowledge graph (KG) completion aims to fill in the missing facts of a KG, where a fact is represented as a triple of the form $(\mathit{subject}, \mathit{relation}, \mathit{object})$.
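A common embedding-based approach scores candidate triples and ranks possible completions for a query such as $(\mathit{subject}, \mathit{relation}, ?)$. The sketch below uses a TransE-style translational score on randomly initialized toy embeddings; the KG, entity names, and scoring choice are illustrative assumptions, not any specific paper's model:

```python
import numpy as np

# Hypothetical toy KG: facts stored as (subject, relation, object) triples.
triples = [("Paris", "capital_of", "France"),
           ("Berlin", "capital_of", "Germany")]

rng = np.random.default_rng(0)
entities = sorted({e for s, _, o in triples for e in (s, o)})
relations = sorted({r for _, r, _ in triples})
dim = 8
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(s, r, o):
    # TransE-style plausibility: a lower ||s + r - o|| means a more
    # plausible triple, so we negate the norm to get a score.
    return -np.linalg.norm(ent_emb[s] + rel_emb[r] - ent_emb[o])

# Completion query (Paris, capital_of, ?): rank all candidate objects.
ranked = sorted(entities, key=lambda o: score("Paris", "capital_of", o),
                reverse=True)
```

In practice the embeddings are trained so that observed triples score higher than corrupted ones; here they are random, so the ranking only illustrates the mechanics of the query.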
In line with previous work on static knowledge graphs, we propose to address this problem by learning latent entity and relation type representations.