End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures

ACL 2016 · Makoto Miwa, Mohit Bansal

We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 12.1% and 5.7% relative error reductions in F1-score on ACE2005 and ACE2004, respectively. We also show that our LSTM-RNN based model compares favorably to the state-of-the-art CNN based model (in F1-score) on nominal relation classification (SemEval-2010 Task 8). Finally, we present an extensive ablation analysis of several model components.
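The stacked architecture the abstract describes — a tree-structured LSTM composing over dependency structure on top of a sequential biLSTM — builds on the Child-Sum Tree-LSTM cell. Below is a minimal NumPy sketch of that bottom-up composition, not the paper's implementation: the class name, toy dimensions, and the random vectors standing in for the sequence layer's bidirectional hidden states are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # toy hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """Child-Sum Tree-LSTM cell: gates are computed from the node's
    input plus the sum of its children's hidden states, with one
    forget gate per child (illustrative sketch, not the paper's code)."""
    def __init__(self, in_dim, hid_dim):
        s = 0.1
        self.W = {g: rng.normal(0, s, (hid_dim, in_dim)) for g in "iofu"}
        self.U = {g: rng.normal(0, s, (hid_dim, hid_dim)) for g in "iofu"}
        self.b = {g: np.zeros(hid_dim) for g in "iofu"}

    def __call__(self, x, children):
        # children: list of (h, c) pairs from already-composed subtrees
        hid_dim = len(self.b["i"])
        h_sum = sum((h for h, _ in children), np.zeros(hid_dim))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum + self.b["u"])
        c = i * u
        # Each child gets its own forget gate on its cell state.
        for h_k, c_k in children:
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
            c = c + f_k * c_k
        return o * np.tanh(c), c

# Random stand-ins for the bidirectional sequence layer's outputs
# (forward and backward states concatenated, hence 2 * D).
seq = [rng.normal(0, 1, 2 * D) for _ in range(3)]

cell = ChildSumTreeLSTMCell(in_dim=2 * D, hid_dim=D)
leaf_a = cell(seq[0], [])          # leaves have no children
leaf_b = cell(seq[2], [])
root_h, root_c = cell(seq[1], [leaf_a, leaf_b])  # head composes its dependents
```

Running the cell bottom-up over the shortest dependency path between two candidate entities yields a fixed-size vector (`root_h` here) that the relation classifier can consume alongside the sequence-layer entity representations.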


Results from the Paper

 Ranked #1 on Relation Extraction on ACE 2005 (Sentence Encoder metric)

Task                | Dataset  | Model  | Metric           | Value  | Global Rank
--------------------|----------|--------|------------------|--------|------------
Relation Extraction | ACE 2004 | SPTree | NER Micro F1     | 81.8   | #7
Relation Extraction | ACE 2004 | SPTree | RE+ Micro F1     | 48.4   | #6
Relation Extraction | ACE 2004 | SPTree | Cross Sentence   | No     | #1
Relation Extraction | ACE 2005 | SPTree | NER Micro F1     | 83.4   | #18
Relation Extraction | ACE 2005 | SPTree | RE+ Micro F1     | 55.6   | #13
Relation Extraction | ACE 2005 | SPTree | Sentence Encoder | biLSTM | #1
Relation Extraction | ACE 2005 | SPTree | Cross Sentence   | No     | #1

Results from Other Papers

Task                | Dataset   | Model  | Metric | Value | Rank
--------------------|-----------|--------|--------|-------|-----
Relation Extraction | NYT11-HRL | SPTree | F1     | 53.1  | #8

