Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion

30 Apr 2020 · Bo Wang, Tao Shen, Guodong Long, Tianyi Zhou, Yi Chang

Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks, but they are usually incomplete, motivating automatic completion. Prevalent graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings and capturing their triple-level relationships with spatial distance...
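To make the TransE idea mentioned above concrete, here is a minimal sketch of its scoring function in NumPy: a triple (head, relation, tail) is scored by the negative distance ||h + r - t||, so plausible triples have heads whose embedding, translated by the relation vector, lands near the tail. All names and the toy random embeddings below are illustrative, not from the paper; in practice the embeddings are learned with a margin-based ranking loss.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy embeddings for a few entities and one relation. These are random
# placeholders; TransE would learn them from observed triples.
entity_emb = {
    name: rng.normal(size=dim)
    for name in ["Paris", "France", "Berlin", "Germany"]
}
relation_emb = {"capital_of": rng.normal(size=dim)}

def transe_score(head: str, relation: str, tail: str) -> float:
    """TransE plausibility score: -||h + r - t|| (L2 norm).

    Higher (closer to zero) means the triple is judged more plausible,
    since the translated head h + r should lie near the tail t.
    """
    h = entity_emb[head]
    r = relation_emb[relation]
    t = entity_emb[tail]
    return float(-np.linalg.norm(h + r - t))
```

Link prediction with such a model amounts to ranking all candidate tails t by `transe_score(h, r, t)` for a query (h, r, ?), which is exactly the evaluation behind the MRR/Hits@k numbers reported below.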


Results from the Paper


 Ranked #1 on Link Prediction on WN18RR (using extra training data)
| Task | Dataset | Model | Metric | Value | Global Rank | Uses Extra Training Data |
|------|---------|-------|--------|-------|-------------|--------------------------|
| Link Prediction | FB15k-237 | StAR(Self-Adp) | MRR | 0.365 | #9 | |
| Link Prediction | FB15k-237 | StAR(Self-Adp) | Hits@10 | 0.562 | #6 | |
| Link Prediction | FB15k-237 | StAR(Self-Adp) | Hits@3 | 0.404 | #4 | |
| Link Prediction | FB15k-237 | StAR(Self-Adp) | Hits@1 | 0.266 | #9 | |
| Link Prediction | FB15k-237 | StAR(Self-Adp) | MR | 117 | #2 | |
| Link Prediction | WN18RR | StAR | MRR | 0.401 | #37 | Yes |
| Link Prediction | WN18RR | StAR | Hits@10 | 0.709 | #1 | Yes |
| Link Prediction | WN18RR | StAR | Hits@3 | 0.491 | #15 | Yes |
| Link Prediction | WN18RR | StAR | Hits@1 | 0.243 | #32 | Yes |
| Link Prediction | WN18RR | StAR | MR | 51 | #1 | Yes |

Methods used in the Paper


| Method | Type |
|--------|------|
| TransE | Graph Embeddings |
| Residual Connection | Skip Connections |
| BPE | Subword Segmentation |
| Dense Connections | Feedforward Networks |
| Label Smoothing | Regularization |
| ReLU | Activation Functions |
| Adam | Stochastic Optimization |
| Softmax | Output Functions |
| Dropout | Regularization |
| Multi-Head Attention | Attention Modules |
| Layer Normalization | Normalization |
| Scaled Dot-Product Attention | Attention Mechanisms |
| Transformer | Transformers |