WCL-BBCD: A Contrastive Learning and Knowledge Graph Approach to Named Entity Recognition

14 Mar 2022  ·  Renjie Zhou, Qiang Hu, Jian Wan, Jilin Zhang, Qiang Liu, Tianxiang Hu, Jianjun Li

Named Entity Recognition is one of the core tasks of information extraction. Word ambiguity and word abbreviation are important causes of the low recognition rate of named entities. In this paper, we propose WCL-BBCD (Word Contrastive Learning with BERT-BiLSTM-CRF-DBpedia), a novel named entity recognition model that incorporates the idea of contrastive learning. The model first trains on sentence pairs in the text, calculates the similarity between sentence pairs, and fine-tunes the BERT model used for named entity recognition according to that similarity, so as to alleviate word ambiguity. Then, the fine-tuned BERT is combined with a BiLSTM-CRF layer to perform the named entity recognition task. Finally, the recognition results are corrected using prior knowledge such as knowledge graphs, so as to alleviate the low recognition rate caused by word abbreviations. Experiments on the CoNLL-2003 English dataset and the OntoNotes V5 English dataset show that our model outperforms other similar models.
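
The tagging stage described above (a fine-tuned BERT encoder feeding a BiLSTM-CRF) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes the Hugging Face Transformers library for the BERT encoder and the third-party pytorch-crf package for the CRF layer, and it omits the contrastive fine-tuning and DBpedia-based correction steps; the tag set size and hidden dimensions are placeholders.

```python
# Minimal sketch of a BERT -> BiLSTM -> CRF sequence tagger (illustrative only).
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf


class BertBiLstmCrf(nn.Module):
    def __init__(self, num_tags: int, lstm_hidden: int = 256,
                 bert_name: str = "bert-base-cased"):
        super().__init__()
        # BERT encoder; per the abstract, this would first be fine-tuned with a
        # contrastive objective over sentence pairs before NER training.
        self.bert = BertModel.from_pretrained(bert_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # Contextual token embeddings from BERT.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # BiLSTM re-encodes the sequence; a linear layer emits per-tag scores.
        lstm_out, _ = self.lstm(hidden)
        emissions = self.classifier(lstm_out)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tags under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the most likely tag sequence.
        return self.crf.decode(emissions, mask=mask)
```

The decoded tag sequences would then be post-processed against a knowledge graph such as DBpedia to correct entities missed because of abbreviations, as the abstract describes.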
