Hero-Gang Neural Model For Named Entity Recognition

Named entity recognition (NER) is a fundamental and important task in NLP, aiming to identify named entities (NEs) in free text. Recently, because the multi-head attention mechanism in the Transformer can effectively capture long-range contextual information, Transformer-based models have become the mainstream approach and have achieved strong performance on this task. Unfortunately, although these models capture effective global context, they remain limited in extracting local features and position information, which are critical for NER. In this paper, to address this limitation, we propose a novel Hero-Gang Neural structure (HGN), consisting of a Hero module and a Gang module, which leverages both global and local information to promote NER. Specifically, the Hero module is a Transformer-based encoder that retains the advantages of the self-attention mechanism, while the Gang module uses a multi-window recurrent module to extract local features and position information under the guidance of the Hero module. Afterward, the proposed multi-window attention effectively combines global information with multiple local features to predict entity labels. Experimental results on several benchmark datasets demonstrate the effectiveness of our proposed model.
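The abstract describes a two-branch design: global token representations from a Transformer encoder (Hero) are combined with several window-sized local views (Gang) through a multi-window attention. The sketch below illustrates that fusion idea in NumPy; the function names, the mean-pooled stand-in for the recurrent local module, and the dot-product scoring are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_features(H, window):
    # Sliding-window mean centered on each token: a simple stand-in for the
    # recurrent Gang module that extracts window-local features.
    T, d = H.shape
    pad = window // 2
    Hp = np.pad(H, ((pad, pad), (0, 0)))
    return np.stack([Hp[t:t + window].mean(axis=0) for t in range(T)])

def multi_window_attention(H, windows=(3, 5, 7)):
    # H: (T, d) global representations from the Hero (Transformer) encoder.
    # Build one local view per window size, then let each token attend over
    # the candidate views (global + locals) with scaled dot-product scores.
    T, d = H.shape
    views = [H] + [local_features(H, w) for w in windows]  # each (T, d)
    V = np.stack(views, axis=1)                            # (T, K, d)
    scores = np.einsum('td,tkd->tk', H, V) / np.sqrt(d)    # (T, K)
    alpha = softmax(scores, axis=-1)                       # per-token weights
    return np.einsum('tk,tkd->td', alpha, V)               # fused (T, d)

rng = np.random.default_rng(0)
H = rng.standard_normal((6, 8))   # toy "encoder output": 6 tokens, dim 8
fused = multi_window_attention(H)
print(fused.shape)  # (6, 8)
```

In the actual model, the fused per-token vectors would feed a label classifier (e.g. a softmax or CRF layer) to predict entity tags; here the fusion step alone is shown.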

Published at NAACL 2022 (PDF and abstract available).
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Named Entity Recognition (NER) | BC2GM | HGN | F1 | 85.65 | #5 |
| Named Entity Recognition (NER) | BC5CDR-chemical | HGN | F1 | 94.59 | #3 |
| Named Entity Recognition (NER) | BC5CDR-disease | HGN | F1 | 87.86 | #2 |
| Named Entity Recognition (NER) | OntoNotes 5.0 | HGN | Average F1 | 90.92 | #1 |
| Named Entity Recognition (NER) | Ontonotes v5 (English) | HGN | F1 | 90.92 | #7 |
| Named Entity Recognition (NER) | WNUT 2016 | HGN | F1 | 59.50 | #1 |
| Named Entity Recognition (NER) | WNUT 2017 | HGN | F1 | 57.41 | #5 |