Named Entity Recognition

639 papers with code • 62 benchmarks • 94 datasets

Named entity recognition (NER) is the task of tagging entities in text with their corresponding type. Approaches typically use BIO notation, which differentiates the beginning (B) and the inside (I) of entities. O is used for non-entity tokens.

Example:

Mark     Watney   visited   Mars
B-PER    I-PER    O         B-LOC

(Image credit: Zalando)
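
A common post-processing step when working with BIO tags is decoding them back into typed entity spans. A minimal sketch in plain Python (the helper name bio_to_spans is illustrative, not from any particular library):

```python
def bio_to_spans(tokens, tags):
    """Decode BIO tags into (entity_type, start, end) spans (end exclusive)."""
    spans, start, ent_type = [], None, None
    for i, tag in enumerate(tags):
        # B- always opens a span; an I- with a mismatched (or no) open type
        # is treated leniently as opening a new span.
        if tag.startswith("B-") or (tag.startswith("I-") and ent_type != tag[2:]):
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = i, tag[2:]
        elif tag == "O":
            if start is not None:
                spans.append((ent_type, start, i))
            start, ent_type = None, None
    if start is not None:
        spans.append((ent_type, start, len(tags)))
    return spans

tokens = ["Mark", "Watney", "visited", "Mars"]
tags = ["B-PER", "I-PER", "O", "B-LOC"]
print(bio_to_spans(tokens, tags))  # [('PER', 0, 2), ('LOC', 3, 4)]
```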

Most implemented papers

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

google-research/bert NAACL 2019

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
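For NER, BERT is typically fine-tuned as a token classifier that predicts one label per token. A minimal sketch using the Hugging Face transformers pipeline; the checkpoint dslim/bert-base-NER is a community fine-tune chosen for illustration, not part of the paper:

```python
# pip install transformers torch
from transformers import pipeline

# Checkpoint is an assumption: any BERT model fine-tuned for token
# classification on an NER corpus works the same way.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

for entity in ner("Mark Watney visited Mars"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```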

Neural Architectures for Named Entity Recognition

glample/tagger NAACL 2016

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available.
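The architecture the paper proposes instead is a bidirectional LSTM encoder with a CRF output layer over tag transitions. A stripped-down sketch of that family, omitting the paper's character-level component and using the third-party pytorch-crf package (an assumption, not the authors' code):

```python
# pip install torch pytorch-crf
import torch
import torch.nn as nn
from torchcrf import CRF

class BiLSTMCRF(nn.Module):
    """Simplified BiLSTM-CRF tagger (no character-level features)."""

    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            bidirectional=True, batch_first=True)
        self.proj = nn.Linear(hidden_dim, num_tags)  # per-token emission scores
        self.crf = CRF(num_tags, batch_first=True)   # learns tag-transition scores

    def forward(self, token_ids, tags=None, mask=None):
        emissions = self.proj(self.lstm(self.embed(token_ids))[0])
        if tags is not None:
            return -self.crf(emissions, tags, mask=mask)  # training loss (NLL)
        return self.crf.decode(emissions, mask=mask)      # Viterbi best paths

model = BiLSTMCRF(vocab_size=10000, num_tags=5)
tokens = torch.randint(1, 10000, (2, 6))  # batch of 2 sentences, 6 tokens each
print(model(tokens))                      # predicted tag ids per sentence
```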

Deep contextualized word representations

flairNLP/flair NAACL 2018

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).
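The linked implementation, flair, applies contextualized embeddings in the same spirit to sequence labeling. A minimal usage sketch, assuming flair's pretrained "ner" model id is available for download:

```python
# pip install flair
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("ner")  # downloads a pretrained English NER model
sentence = Sentence("Mark Watney visited Mars")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)  # span text plus predicted label and confidence
```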

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

guillaumegenthial/sequence_tagging ACL 2016

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

dmis-lab/biobert 25 Jan 2019

Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows.

Named Entity Recognition with Bidirectional LSTM-CNNs

zalandoresearch/flair TACL 2016

Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.

ERNIE: Enhanced Representation through Knowledge Integration

PaddlePaddle/PaddleNLP 19 Apr 2019

We present a novel language representation model enhanced by knowledge called ERNIE (Enhanced Representation through kNowledge IntEgration).

LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention

studio-ousia/luke EMNLP 2020

In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer.
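transformers ships LUKE classes for entity-aware tasks. A rough sketch of entity span classification with the CoNLL-2003 fine-tuned checkpoint; the candidate character spans are supplied by hand here, whereas a full system would enumerate all spans up to some length:

```python
# pip install transformers torch
from transformers import LukeTokenizer, LukeForEntitySpanClassification

name = "studio-ousia/luke-large-finetuned-conll-2003"
tokenizer = LukeTokenizer.from_pretrained(name)
model = LukeForEntitySpanClassification.from_pretrained(name)

text = "Mark Watney visited Mars"
entity_spans = [(0, 11), (20, 24)]  # character spans: "Mark Watney", "Mars"

inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
logits = model(**inputs).logits     # one label distribution per candidate span
for (start, end), idx in zip(entity_spans, logits.argmax(-1).squeeze(0).tolist()):
    print(text[start:end], model.config.id2label[idx])
```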

A Unified MRC Framework for Named Entity Recognition

ShannonAI/mrc-for-flat-nested-ner ACL 2020

Instead of treating the task of NER as a sequence labeling problem, we propose to formulate it as a machine reading comprehension (MRC) task.
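To make the reformulation concrete: each entity type is turned into a natural-language query, and spans answering that query are extracted from the passage. A toy sketch; the queries and the extract_spans callable are illustrative stand-ins, not the paper's exact prompts or model:

```python
# Illustrative only: queries and extractor are stand-ins for the paper's setup.
TYPE_QUERIES = {
    "PER": "Find person entities in the text, including names of people.",
    "LOC": "Find location entities in the text, such as planets and cities.",
}

def mrc_ner(text, extract_spans):
    """Run one reading-comprehension query per entity type.

    extract_spans(question, context) is a hypothetical span extractor
    (e.g. a BERT-style QA model) returning (start, end) character offsets.
    """
    entities = []
    for ent_type, question in TYPE_QUERIES.items():
        for start, end in extract_spans(question, text):
            entities.append((ent_type, text[start:end]))
    return entities
```

One consequence of this framing, noted in the paper, is that nested entities stop being a problem: overlapping spans can be returned by different type queries independently.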

CamemBERT: a Tasty French Language Model

huggingface/transformers ACL 2020

We show that the use of web crawled data is preferable to the use of Wikipedia data.