NER

560 papers with code • 6 benchmarks • 24 datasets

Named entity recognition (NER) involves identifying key spans of information in text and classifying them into a set of predefined categories, such as names of people, places, and organizations. It is closely related to other token-level labeling tasks such as part-of-speech (PoS) tagging.
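In practice, NER systems are often trained as token classifiers that emit BIO tags (B- begins an entity, I- continues it, O is outside any entity), and the tagged tokens are then decoded into entity spans. A minimal sketch of that decoding step (the BIO scheme is standard; the function name and span format are illustrative, not from any of the papers below):

```python
def bio_to_spans(tokens, tags):
    """Convert per-token BIO tags into (entity_text, label, start, end) spans."""
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags + ["O"]):  # trailing "O" sentinel flushes the last entity
        # Close the current entity when it ends: a new B-, an O, or an I- with a different label.
        if start is not None and (not tag.startswith("I-") or tag[2:] != label):
            spans.append((" ".join(tokens[start:i]), label, start, i))
            start, label = None, None
        if tag.startswith("B-"):
            start, label = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, label = i, tag[2:]  # tolerate an I- tag with no preceding B-
    return spans

# Example: two entities, one multi-token.
print(bio_to_spans(["Barack", "Obama", "visited", "Paris"],
                   ["B-PER", "I-PER", "O", "B-LOC"]))
# → [('Barack Obama', 'PER', 0, 2), ('Paris', 'LOC', 3, 4)]
```

Evaluation on benchmarks like OntoNotes or WNUT is typically done at this span level, not per token, which is why decoding consistency matters.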


Most implemented papers

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

PaddlePaddle/PaddleNLP 31 Aug 2019

Pre-trained language models have achieved great success in various natural language understanding (NLU) tasks thanks to their capacity to capture deep contextualized information in text by pre-training on large-scale corpora.

CrossNER: Evaluating Cross-Domain Named Entity Recognition

zliucr/CrossNER 8 Dec 2020

Cross-domain named entity recognition (NER) models can cope with the scarcity of NER samples in target domains.

Fast and Accurate Entity Recognition with Iterated Dilated Convolutions

iesl/dilated-cnn-ner EMNLP 2017

Today when many practitioners run basic NLP on the entire web and large-volume traffic, faster methods are paramount to saving time and energy costs.

Why Attention? Analyze BiLSTM Deficiency and Its Remedies in the Case of NER

jacobvsdanniel/cross-ner 29 Aug 2019

We test the practical impacts of the deficiency on real-world NER datasets, OntoNotes 5.0 and WNUT 2017, with clear and consistent improvements over the baseline, up to 8.7% on some of the multi-token entity mentions.

Empower Sequence Labeling with Task-Aware Neural Language Model

LiyuanLucasLiu/LM-LSTM-CRF 13 Sep 2017

In this study, we develop a novel neural framework to extract abundant knowledge hidden in raw texts to empower the sequence labeling task.

Chinese NER Using Lattice LSTM

jiesutd/LatticeLSTM ACL 2018

We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon.
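The lattice is built by scanning the character sequence for every substring that appears in a word lexicon; each match contributes a word node spanning those characters. A minimal sketch of that matching step (the function name, `max_len` cap, and lexicon are illustrative assumptions, not the paper's implementation):

```python
def lattice_words(chars, lexicon, max_len=4):
    """Return (start, end, word) for every lexicon word found in the character sequence."""
    matches = []
    for i in range(len(chars)):
        # Try every candidate word starting at position i, up to max_len characters long.
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                matches.append((i, j, word))
    return matches

# Classic segmentation-ambiguity example: 南京市长江大桥 (Nanjing Yangtze River Bridge).
lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}
print(lattice_words(list("南京市长江大桥"), lexicon))
# → [(0, 2, '南京'), (0, 3, '南京市'), (2, 4, '市长'), (3, 5, '长江'), (3, 7, '长江大桥'), (5, 7, '大桥')]
```

Note that conflicting matches like 市长 ("mayor") and 长江 ("Yangtze") both enter the lattice; the LSTM's gating, rather than a hard segmentation, decides which word paths to trust.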

CLUENER2020: Fine-grained Named Entity Recognition Dataset and Benchmark for Chinese

CLUEbenchmark/CLUENER2020 13 Jan 2020

In this paper, we introduce the NER dataset from CLUE organization (CLUENER2020), a well-defined fine-grained dataset for named entity recognition in Chinese.

AraBERT: Transformer-based Model for Arabic Language Understanding

aub-mind/araBERT LREC 2020

Recently, with the surge of transformer-based models, language-specific BERT models have proven to be very effective at language understanding, provided they are pre-trained on a very large corpus.

Parallel sequence tagging for concept recognition

OntoGene/craft-st 16 Mar 2020

In all 20 annotation sets of the concept-annotation task, our system outperforms the pipeline system reported as a baseline in the CRAFT shared task 2019.

Beheshti-NER: Persian Named Entity Recognition Using BERT

sEhsanTaher/Beheshti-NER NSURL 2019

In this paper, we use the pre-trained deep bidirectional network, BERT, to make a model for named entity recognition in Persian.