Going out on a limb: Joint Extraction of Entity Mentions and Relations without Dependency Trees

ACL 2017  ·  Arzoo Katiyar, Claire Cardie

We present a novel attention-based recurrent neural network for joint extraction of entity mentions and relations. We show that attention, together with a long short-term memory (LSTM) network, can extract semantic relations between entity mentions without access to dependency trees. Experiments on the Automatic Content Extraction (ACE) corpora show that our model significantly outperforms the feature-based joint model of Li and Ji (2014). We also compare our model with the end-to-end tree-based LSTM model (SPTree) of Miwa and Bansal (2016) and show that our model performs within 1% on entity mentions and 2% on relations. Our fine-grained analysis further shows that our model performs significantly better on Agent-Artifact relations, while SPTree performs better on Physical and Part-Whole relations.
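To make the core idea concrete, here is a minimal sketch of dot-product attention over recurrent hidden states, the mechanism the abstract credits with capturing relations between mentions without a dependency tree. This is an illustrative toy, not the paper's exact pointer-network formulation: the hidden vectors are random stand-ins for biLSTM outputs, and `attention_context` is a hypothetical helper name.

```python
import math
import random

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attention_context(query, keys):
    """Dot-product attention: weight each earlier hidden state by its
    similarity to the current state, then return the weighted sum.
    In the joint-extraction setting, high weights point at earlier
    mentions the current token is likely related to."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(query)
    context = [sum(w * key[d] for w, key in zip(weights, keys))
               for d in range(dim)]
    return weights, context

# Toy stand-ins for biLSTM hidden states over a 5-token sentence.
random.seed(0)
hidden = [[random.uniform(-1.0, 1.0) for _ in range(4)] for _ in range(5)]

# Attend from the last token back over all previous tokens.
weights, context = attention_context(hidden[-1], hidden[:-1])
print("attention weights:", [round(w, 3) for w in weights])
print("context vector:   ", [round(c, 3) for c in context])
```

In the full model, the context vector (or the attention distribution itself) would feed a classifier over relation types, so the same pass that tags entity mentions also scores candidate relations.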


Results from the Paper


| Task                | Dataset  | Model     | Metric Name      | Metric Value | Global Rank |
|---------------------|----------|-----------|------------------|--------------|-------------|
| Relation Extraction | ACE 2004 | Attention | NER Micro F1     | 79.6         | # 11        |
| Relation Extraction | ACE 2004 | Attention | RE+ Micro F1     | 45.7         | # 9         |
| Relation Extraction | ACE 2004 | Attention | Cross Sentence   | No           | # 1         |
| Relation Extraction | ACE 2005 | Attention | RE Micro F1      | 55.9         | # 11        |
| Relation Extraction | ACE 2005 | Attention | NER Micro F1     | 82.6         | # 19        |
| Relation Extraction | ACE 2005 | Attention | RE+ Micro F1     | 53.6         | # 14        |
| Relation Extraction | ACE 2005 | Attention | Sentence Encoder | biLSTM       | # 1         |
| Relation Extraction | ACE 2005 | Attention | Cross Sentence   | No           | # 1         |