Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

EMNLP 2018 · Yuhao Zhang, Peng Qi, Christopher D. Manning

Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
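
The abstract's two core ideas can be made concrete with short sketches. First, the graph convolution over a dependency tree: each layer updates a token's representation by averaging transformed representations of its dependency neighbors plus itself. The PyTorch code below is a minimal sketch of such a layer; the class name, tensor shapes, and exact bias placement are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """A single graph convolution over a dependency tree:
    h_i = ReLU((sum_j A~_ij W h_j) / d_i + b), with A~ = A + I."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, h, adj):
        # h:   (batch, seq_len, dim) token representations
        # adj: (batch, seq_len, seq_len) 0/1 dependency adjacency matrix
        eye = torch.eye(adj.size(1), device=adj.device)
        adj = adj + eye                      # self-loops so each token keeps its own features
        deg = adj.sum(dim=2, keepdim=True)   # degree d_i, used to normalize the sum
        out = adj.bmm(self.W(h)) / deg + self.b
        return torch.relu(out)
```

Second, the path-centric pruning: keep only tokens within tree distance K of the shortest dependency path between the subject and object entities. The sketch below assumes `heads` is a 1-based head-pointer array with 0 marking the root, and it measures distance to the path in the full tree rather than restricting to the LCA subtree as the paper describes; the function name is hypothetical.

```python
from collections import deque

def prune_tokens(heads, subj, obj, k=1):
    """Hypothetical helper: return indices of tokens whose tree distance to
    the shortest dependency path between subj and obj is at most k."""
    n = len(heads)
    nbrs = [[] for _ in range(n)]
    for i, h in enumerate(heads):        # build the undirected tree
        if h > 0:
            nbrs[i].append(h - 1)
            nbrs[h - 1].append(i)

    def dists(src):                      # BFS distances from src
        d = [None] * n
        d[src] = 0
        q = deque([src])
        while q:
            u = q.popleft()
            for v in nbrs[u]:
                if d[v] is None:
                    d[v] = d[u] + 1
                    q.append(v)
        return d

    ds, do = dists(subj), dists(obj)
    # In a tree, ds[i] + do[i] = path_len + 2 * dist(i, path),
    # so the test below keeps exactly the tokens within distance k of the path.
    return [i for i in range(n) if ds[i] + do[i] <= ds[obj] + 2 * k]
```

The surviving tokens would then define the adjacency matrix fed to the GCN layers above.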


Datasets

TACRED, Re-TACRED

Results from the Paper


| Task                    | Dataset   | Model           | Metric | Value | Global Rank |
|-------------------------|-----------|-----------------|--------|-------|-------------|
| Relation Extraction     | Re-TACRED | C-GCN           | F1     | 80.3  | #6          |
| Relation Classification | TACRED    | C-GCN           | F1     | 66.4  | #5          |
| Relation Extraction     | TACRED    | GCN + PA-LSTM   | F1     | 67.1  | #32         |
| Relation Extraction     | TACRED    | GCN             | F1     | 64.0  | #37         |
| Relation Extraction     | TACRED    | C-GCN + PA-LSTM | F1     | 68.2  | #25         |
| Relation Extraction     | TACRED    | C-GCN           | F1     | 66.4  | #34         |

Methods

GCN, C-GCN, PA-LSTM