A Neural Transition-based Model for Nested Mention Recognition

EMNLP 2018  ·  Bailin Wang, Wei Lu, Yu Wang, Hongxia Jin

It is common for entity mentions to contain other mentions recursively. This paper introduces a scalable transition-based method for modeling the nested structure of mentions. We first map a sentence with nested mentions to a designated forest in which each mention corresponds to a constituent. Our shift-reduce system then learns to construct this forest bottom-up through an action sequence whose maximal length is guaranteed to be three times the sentence length. A Stack-LSTM is employed to represent the states of the system efficiently and effectively in a continuous space, and the system is further augmented with a character-based component to capture letter-level patterns. Our model achieves state-of-the-art results on the ACE datasets, demonstrating its effectiveness in detecting nested mentions.
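To make the forest-construction idea concrete, below is a minimal sketch of a shift-reduce oracle for nested mentions. It assumes a simplified action set (SHIFT, UNARY-X, REDUCE-X) with "*"-prefixed intermediate labels for binarized merges; the action names, the `run_oracle` helper, and the example entity types are illustrative assumptions, not the paper's exact transition inventory, and the Stack-LSTM that scores actions is omitted.

```python
# Illustrative sketch only: a simplified shift-reduce oracle for nested
# mentions, not the paper's exact transition system or neural scorer.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                   # entity type, "*X" for intermediate, "TOK" for a raw token
    span: tuple                  # (start, end) token indices, end exclusive
    children: list = field(default_factory=list)

def run_oracle(tokens, actions):
    """Replay an action sequence and collect the mentions it creates.

    "SHIFT"       : move the next token from the buffer onto the stack
    ("UNARY", X)  : wrap the stack top in a new constituent labeled X
    ("REDUCE", X) : merge the top two stack items into a constituent X

    Labels prefixed with "*" are binarization artifacts, not mentions.
    """
    stack, buf, mentions = [], 0, []
    for action in actions:
        if action == "SHIFT":
            stack.append(Node("TOK", (buf, buf + 1)))
            buf += 1
        elif action[0] == "UNARY":
            child = stack.pop()
            stack.append(Node(action[1], child.span, [child]))
        else:                                        # ("REDUCE", X)
            right, left = stack.pop(), stack.pop()
            stack.append(Node(action[1], (left.span[0], right.span[1]), [left, right]))
        top = stack[-1]
        if top.label != "TOK" and not top.label.startswith("*"):
            mentions.append((top.label, top.span))
    return mentions

# Toy example: "UN" (ORG) nested inside "UN Secretary General" (PER).
tokens = ["UN", "Secretary", "General"]
actions = ["SHIFT", ("UNARY", "ORG"),                # build ORG over "UN"
           "SHIFT", ("REDUCE", "*PER"),              # intermediate binarized merge
           "SHIFT", ("REDUCE", "PER")]               # complete the outer PER
print(run_oracle(tokens, actions))                   # [('ORG', (0, 1)), ('PER', (0, 3))]
```

Intuitively, each of the n tokens is shifted exactly once and can participate in only a bounded number of UNARY or REDUCE steps, which is consistent with the 3n bound on action-sequence length that the abstract cites.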


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Nested Mention Recognition | ACE 2004 | Neural transition-based model | F1 | 73.1 | #7 |
| Nested Named Entity Recognition | ACE 2004 | Neural transition-based model | F1 | 73.3 | #23 |
| Named Entity Recognition (NER) | ACE 2004 | Neural transition-based model | F1 | 73.3 | #9 |
| Nested Named Entity Recognition | ACE 2005 | Neural transition-based model | F1 | 73.0 | #23 |
| Nested Mention Recognition | ACE 2005 | Neural transition-based model | F1 | 73.0 | #9 |
| Named Entity Recognition (NER) | ACE 2005 | Neural transition-based model | F1 | 73.0 | #19 |
| Named Entity Recognition (NER) | GENIA | Neural transition-based model | F1 | 73.9 | #13 |
| Nested Named Entity Recognition | GENIA | Neural transition-based model | F1 | 73.9 | #25 |

Results from Other Papers


| Task | Dataset | Model | Metric | Value | Rank |
|---|---|---|---|---|---|
| Nested Named Entity Recognition | NNE | Neural Transition-based Model | Micro F1 | 73.6 | #6 |

Methods


No methods listed for this paper.