Better Feature Integration for Named Entity Recognition

NAACL 2021 · Lu Xu, Zhanming Jie, Wei Lu, Lidong Bing

It has been shown that named entity recognition (NER) can benefit from incorporating the long-distance structured information captured by dependency trees. We believe this is because the two types of features, the contextual information captured by linear sequences and the structured information captured by dependency trees, complement each other. However, existing approaches have largely focused on stacking an LSTM with graph neural networks such as graph convolutional networks (GCNs) to build improved NER models, where the exact interaction mechanism between the two types of features is not very clear and the performance gain does not appear to be significant. In this work, we propose a simple and robust solution that incorporates both types of features with our Synergized-LSTM (Syn-LSTM), which makes explicit how the two types of features interact. We conduct extensive experiments on several standard datasets across four languages. The results demonstrate that the proposed model achieves better performance than previous approaches while requiring fewer parameters. Further analysis shows that our model can capture longer dependencies than strong baselines.
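To make the feature-integration idea concrete, here is a minimal NumPy sketch of a recurrent cell that receives two inputs per token: a contextual input `x` (e.g. a word representation from the linear sequence) and a graph-derived input `g` (e.g. a GCN encoding of the token's dependency neighbourhood), with an extra gate controlling how much of the graph signal enters the cell state. This is an illustrative sketch under assumed dimensions and gate equations, not the paper's exact Syn-LSTM formulation; all names (`syn_lstm_step`, the weight list, the toy sizes) are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def syn_lstm_step(x, g, h, c, params):
    """One step of a gated cell that merges a contextual input x with a
    graph-derived input g. A sketch of the integration idea only: an
    additional gate m decides how much of the graph signal is written
    into the cell state alongside the standard LSTM gates."""
    (Wf, Wi, Wo, Wc, Wm, Wn,
     Uf, Ui, Uo, Uc, Um, Un,
     bf, bi, bo, bc, bm, bn) = params
    f = sigmoid(Wf @ x + Uf @ h + bf)        # forget gate
    i = sigmoid(Wi @ x + Ui @ h + bi)        # input gate (contextual path)
    m = sigmoid(Wm @ g + Um @ h + bm)        # extra gate (graph path)
    o = sigmoid(Wo @ x + Uo @ h + bo)        # output gate
    c_ctx = np.tanh(Wc @ x + Uc @ h + bc)    # candidate from the linear context
    c_graph = np.tanh(Wn @ g + Un @ h + bn)  # candidate from the dependency tree
    c_new = f * c + i * c_ctx + m * c_graph  # both feature types update the cell
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy dimensions (hypothetical): hidden 4, word input 3, graph input 5.
rng = np.random.default_rng(0)
d_h, d_x, d_g = 4, 3, 5
shapes = [(d_h, d_x)] * 4 + [(d_h, d_g)] * 2 + [(d_h, d_h)] * 6 + [(d_h,)] * 6
params = [rng.standard_normal(s) * 0.1 for s in shapes]

h, c = np.zeros(d_h), np.zeros(d_h)
h, c = syn_lstm_step(rng.standard_normal(d_x), rng.standard_normal(d_g),
                     h, c, params)
```

The design point the abstract argues for is visible here: rather than stacking a GCN on top of an LSTM, the graph features flow into the cell through their own gate, so the model can learn per-token how much structured information to admit.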


Datasets


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Named Entity Recognition (NER) | OntoNotes v5 (English) | Syn-LSTM + BERT (w/o doc-context) | F1 | 90.85 | #8 |
| Named Entity Recognition (NER) | OntoNotes v5 (English) | Syn-LSTM (w/o doc-context) | F1 | 89.04 | #16 |
