WCL-BBCD: A Contrastive Learning and Knowledge Graph Approach to Named Entity Recognition

14 Mar 2022  ·  Renjie Zhou, Qiang Hu, Jian Wan, Jilin Zhang, Qiang Liu, Tianxiang Hu, Jianjun Li

Named Entity Recognition is one of the core tasks of information extraction. Word ambiguity and word abbreviations are major causes of low recognition rates for named entities. In this paper, we propose WCL-BBCD (Word Contrastive Learning with BERT-BiLSTM-CRF-DBpedia), a novel named entity recognition model that incorporates the idea of contrastive learning. The model first trains on sentence pairs from the text, computes the cosine similarity between words in each pair, and uses this similarity to fine-tune the BERT model employed for named entity recognition, thereby alleviating word ambiguity. The fine-tuned BERT model is then combined with a BiLSTM-CRF model to perform named entity recognition. Finally, the recognition results are corrected using prior knowledge such as knowledge graphs, alleviating the low recognition rate caused by word abbreviations. Experimental results show that our model outperforms comparable methods on the CoNLL-2003 English dataset and the OntoNotes V5 English dataset.
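
The sketch below illustrates the kind of word-level cosine-similarity signal the abstract describes: the same surface form is embedded with BERT in two different sentence contexts and the similarity of the two contextual vectors is compared. It is a minimal illustration, not the authors' code; it assumes Hugging Face transformers and PyTorch, and the pairing strategy, pooling, and any contrastive loss hyperparameters are illustrative choices not specified in the abstract.

```python
# Minimal sketch (not the authors' implementation): cosine similarity between
# contextual word embeddings from BERT, the quantity the paper uses to drive
# contrastive fine-tuning. Model name, pooling, and example sentences are
# assumptions for illustration only.
import torch
import torch.nn.functional as F
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = BertModel.from_pretrained("bert-base-cased")
model.eval()

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Mean-pool the subword vectors covering the first occurrence of `word`."""
    enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0]                 # (seq_len, 2) char spans
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]         # (seq_len, hidden_dim)
    start = sentence.index(word)
    end = start + len(word)
    # Select subword tokens whose character span falls inside the target word;
    # special tokens have the empty span (0, 0) and are excluded.
    mask = (offsets[:, 0] >= start) & (offsets[:, 1] <= end) & (offsets[:, 1] > offsets[:, 0])
    return hidden[mask].mean(dim=0)

# Same surface form in two contexts (company vs. fruit). A low similarity
# indicates different senses, which a contrastive objective would push the
# encoder to separate, mitigating word ambiguity for downstream NER.
a = word_embedding("Apple released a new phone today.", "Apple")
b = word_embedding("She ate an apple after lunch.", "apple")
print(f"cosine similarity: {F.cosine_similarity(a, b, dim=0).item():.3f}")
```

In the full model, a similarity-based fine-tuning step like this would precede feeding BERT's token representations into the BiLSTM-CRF tagger, with DBpedia used afterwards to correct entities hidden behind abbreviations.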
