We release an open toolkit for knowledge embedding (OpenKE), which provides a unified framework and various fundamental models to embed knowledge graphs into a continuous low-dimensional space.
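To make "embedding a knowledge graph into a continuous low-dimensional space" concrete, here is a minimal plain-Python sketch of the TransE scoring idea, one of the fundamental models such toolkits provide. This is illustrative only and does not reflect OpenKE's actual API; the function name and toy vectors are hypothetical.

```python
def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    A score near 0 means the triple (head, relation, tail) is plausible,
    because the relation acts as a translation from head to tail."""
    return -sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)) ** 0.5

# Toy 3-dimensional embeddings (hypothetical values).
h = [0.1, 0.2, 0.3]
r = [0.4, 0.1, -0.2]
t = [0.5, 0.3, 0.1]        # here t == h + r, so the triple scores near 0
good = transe_score(h, r, t)
bad = transe_score(h, r, [9.0, 9.0, 9.0])  # implausible tail scores much lower
```

In a real toolkit the embeddings are learned by gradient descent so that observed triples score higher than corrupted ones.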
In statistical relational learning, knowledge graph completion deals with automatically understanding the structure of large knowledge graphs (labeled directed graphs) and predicting missing relationships (labeled edges).
SOTA for Knowledge Graphs on FB15k
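Leaderboard results on FB15k are typically reported with ranking metrics: for each test triple, the true entity is ranked against all candidates, and mean reciprocal rank (MRR) and Hits@k are computed. A minimal sketch of that evaluation step, assuming the 1-based ranks have already been obtained:

```python
def mrr_and_hits(ranks, k=10):
    """Mean reciprocal rank and Hits@k over a list of 1-based ranks of the
    true entity among all candidates (the standard FB15k protocol)."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits

# Hypothetical ranks for four test triples.
mrr, hits10 = mrr_and_hits([1, 3, 25, 2])
```

The "filtered" setting used on FB15k additionally removes other known-true triples from the candidate list before ranking.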
We demonstrate the effectiveness of R-GCNs as a stand-alone model for entity classification.
SOTA for Node Classification on AIFB
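The core of an R-GCN layer is that each node aggregates messages from its neighbors using a weight specific to the edge's relation type, plus a self-loop term. The sketch below is heavily simplified (scalar features and scalar per-relation weights instead of matrices, in-degree normalization only); it is not the paper's exact formulation.

```python
def rgcn_layer(x, edges, w_rel, w_self):
    """One simplified R-GCN update with ReLU:
    h_i' = relu(w_self * x_i + sum over edges (j, rel, i) of
                w_rel[rel] * x_j / indegree(i)).
    x: scalar feature per node; edges: (src, relation, dst) triples."""
    out = [w_self * xi for xi in x]
    indeg = [0] * len(x)
    for j, rel, i in edges:
        indeg[i] += 1
    for j, rel, i in edges:
        out[i] += w_rel[rel] * x[j] / indeg[i]
    return [max(0.0, v) for v in out]

# Two nodes, one typed edge 0 -> 1 (hypothetical toy graph).
updated = rgcn_layer([1.0, 2.0], [(0, "r0", 1)], {"r0": 0.5}, 1.0)
```

Entity classification then applies a softmax classifier to the final-layer node representations.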
Neural language representation models such as BERT, pre-trained on large-scale corpora, can capture rich semantic patterns from plain text and be fine-tuned to consistently improve performance on various NLP tasks.
SOTA for Relation Extraction on FewRel
Given a learned knowledge graph (KG), our approach takes as input semantic embeddings for each node (each representing a visual category).
Recently, a variety of methods have been developed for this problem, which generally try to learn effective representations of users and items and then match items to users according to their representations.
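The "match items to users according to their representations" step usually reduces to scoring each item embedding against the user embedding (e.g., by inner product) and returning the top-k. A minimal sketch with hypothetical toy vectors:

```python
def recommend(user_vec, item_vecs, k=2):
    """Score every item by inner product with the user embedding and
    return the indices of the k highest-scoring items."""
    scores = [(sum(u * v for u, v in zip(user_vec, iv)), idx)
              for idx, iv in enumerate(item_vecs)]
    return [idx for _, idx in sorted(scores, reverse=True)[:k]]

# A user who "points at" the first dimension prefers items aligned with it.
top = recommend([1.0, 0.0], [[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]])
```

Production systems replace the exhaustive scoring loop with approximate nearest-neighbor search, but the representation-matching logic is the same.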
We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links.
#4 best model for Link Prediction on WN18
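One of the simplest representation-learning scorers for link prediction is the bilinear DistMult model, which scores a triple with an element-wise product. A minimal sketch (illustrative, not the paper's implementation):

```python
def distmult_score(h, r, t):
    """DistMult bilinear score: sum_i h_i * r_i * t_i.
    Note the score is symmetric in h and t, so DistMult cannot
    distinguish a relation from its inverse."""
    return sum(hi * ri * ti for hi, ri, ti in zip(h, r, t))

s = distmult_score([1, 2], [3, 4], [5, 6])  # 1*3*5 + 2*4*6
```

Link prediction then ranks candidate tails t by this score for a given (h, r) query.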
In this work, we introduce ConvE, a multi-layer convolutional network model for link prediction, and report state-of-the-art results for several established datasets.
SOTA for Link Prediction on WN18
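ConvE's key idea is to reshape the head and relation embeddings into 2D "images", stack them, and run a 2D convolution over the result before matching against the tail embedding. The sketch below captures only that shape manipulation with a single filter and no projection layer, batch norm, dropout, or nonlinearities, all of which the real model includes; all names and values are hypothetical.

```python
def conv2d_valid(img, kernel):
    """Plain 2D valid convolution (cross-correlation) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    H, W = len(img), len(img[0])
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(W - kw + 1)] for i in range(H - kh + 1)]

def conve_score(h, r, t, kernel, shape=(2, 2)):
    """ConvE-style score sketch: reshape h and r into 2D grids, stack them
    vertically, convolve, flatten, then take an inner product with t."""
    rows, cols = shape
    to2d = lambda v: [v[i * cols:(i + 1) * cols] for i in range(rows)]
    stacked = to2d(h) + to2d(r)              # (2*rows) x cols "image"
    feat = [v for row in conv2d_valid(stacked, kernel) for v in row]
    return sum(f * ti for f, ti in zip(feat, t))

score = conve_score([1, 0, 0, 0], [0, 0, 0, 0], [2, 3, 4], [[1, 0], [0, 0]])
```

The 2D reshaping lets the convolution capture interactions between head and relation dimensions that a purely element-wise scorer like DistMult cannot.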