Named Entity Recognition

918 papers with code • 6 benchmarks • 13 datasets

Named entity recognition (NER) is the task of locating named entities in unstructured text and classifying them into predefined categories such as person, organization, and location.

Libraries

Use these libraries to find Named Entity Recognition models and implementations.
See all 6 libraries.

Most implemented papers

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

google-research/bert NAACL 2019

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
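A minimal sketch of running a BERT checkpoint for NER through Hugging Face transformers. The model id `dslim/bert-base-NER` is a community fine-tune, not an artifact of the paper; any NER-tuned checkpoint can be substituted.

```python
# Token classification with a BERT-based NER checkpoint via transformers.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",        # community fine-tune (assumption)
    aggregation_strategy="simple",      # merge word pieces into entity spans
)

for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```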

Neural Architectures for Named Entity Recognition

glample/tagger NAACL 2016

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available.
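A sketch of the paper's LSTM-CRF architecture in PyTorch, using the third-party `pytorch-crf` package for the CRF layer (the original glample/tagger code is Theano-based; hyperparameters here are illustrative).

```python
# BiLSTM-CRF tagger sketch: per-token emission scores from a BiLSTM,
# tag-transition scores and Viterbi decoding from a CRF layer.
import torch
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden // 2, bidirectional=True, batch_first=True)
        self.proj = nn.Linear(hidden, num_tags)      # per-token tag scores
        self.crf = CRF(num_tags, batch_first=True)   # learned transition scores

    def loss(self, tokens, tags, mask):
        # tokens, tags: (batch, seq_len); mask: (batch, seq_len) bool
        emissions, _ = self.lstm(self.embed(tokens))
        return -self.crf(self.proj(emissions), tags, mask=mask)  # negative log-likelihood

    def decode(self, tokens, mask):
        emissions, _ = self.lstm(self.embed(tokens))
        return self.crf.decode(self.proj(emissions), mask=mask)  # Viterbi tag paths
```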

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

guillaumegenthial/sequence_tagging ACL 2016

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.
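A sketch of the paper's character-level CNN component: a 1-D convolution over character embeddings, max-pooled into a fixed-size vector that is concatenated with the word embedding before the BiLSTM-CRF. Dimensions are illustrative.

```python
# Character-level CNN feature extractor for one word.
import torch
import torch.nn as nn

class CharCNN(nn.Module):
    def __init__(self, num_chars, char_dim=30, num_filters=30, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(num_chars, char_dim)
        self.conv = nn.Conv1d(char_dim, num_filters, kernel, padding=kernel // 2)

    def forward(self, char_ids):                    # (batch, word_len)
        x = self.embed(char_ids).transpose(1, 2)    # (batch, char_dim, word_len)
        x = torch.relu(self.conv(x))                # (batch, filters, word_len)
        return x.max(dim=2).values                  # max-pool over positions
```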

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

dmis-lab/biobert 25 Jan 2019

Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows.
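A sketch of loading BioBERT weights through transformers, assuming the authors' Hub upload `dmis-lab/biobert-base-cased-v1.1`; the token-classification head below is freshly initialized and still needs fine-tuning on a biomedical NER dataset.

```python
# Load BioBERT as a backbone for biomedical NER fine-tuning.
from transformers import AutoModelForTokenClassification, AutoTokenizer

name = "dmis-lab/biobert-base-cased-v1.1"  # authors' Hub checkpoint (assumption)
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(
    name,
    num_labels=3,  # e.g. B/I/O tags for a single biomedical entity type
)
```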

ERNIE: Enhanced Representation through Knowledge Integration

PaddlePaddle/PaddleNLP 19 Apr 2019

We present a novel language representation model enhanced by knowledge called ERNIE (Enhanced Representation through kNowledge IntEgration).
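An illustrative sketch of ERNIE's entity-level masking: instead of masking single tokens at random (BERT-style), whole entity spans are masked together. Span boundaries here are hand-supplied; ERNIE derives them from entity and phrase knowledge.

```python
# Entity-level masking sketch: mask whole entity spans, not single tokens.
import random

def entity_level_mask(tokens, entity_spans, mask_prob=0.15, mask_token="[MASK]"):
    masked = list(tokens)
    for start, end in entity_spans:        # each (start, end) covers one entity
        if random.random() < mask_prob:
            for i in range(start, end):
                masked[i] = mask_token     # mask the entire span at once
    return masked

tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]
print(entity_level_mask(tokens, entity_spans=[(0, 2)], mask_prob=1.0))
```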

Named Entity Recognition with Bidirectional LSTM-CNNs

flairNLP/flair TACL 2016

Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.
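Minimal usage of the flair implementation listed above: load a pretrained English tagger and read off entity spans. `"ner"` resolves to flair's default English 4-class model; the exact span accessors assume a recent flair release.

```python
# Tagging a sentence with flair's pretrained English NER model.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("ner")
sentence = Sentence("George Washington went to Washington.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value)
```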

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

PaddlePaddle/PaddleNLP 31 Aug 2019

Pre-trained language models have achieved great success in various natural language understanding (NLU) tasks due to their capacity to capture deep contextualized information in text by pre-training on large-scale corpora.
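A sketch of NEZHA's functional relative position encoding: fixed sinusoidal embeddings indexed by the clipped distance between positions, used inside attention rather than added to input embeddings. Dimensions and the clipping distance are illustrative.

```python
# Sinusoidal relative position encoding sketch (NumPy).
import numpy as np

def relative_position_encoding(seq_len, dim, max_dist=16):
    # Table of sinusoidal embeddings for distances in [-max_dist, max_dist].
    positions = np.arange(-max_dist, max_dist + 1)[:, None]    # (2k+1, 1)
    freqs = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))      # (dim/2,)
    table = np.zeros((len(positions), dim))
    table[:, 0::2] = np.sin(positions * freqs)
    table[:, 1::2] = np.cos(positions * freqs)
    # Look up a (seq_len, seq_len, dim) tensor of pairwise encodings.
    i, j = np.indices((seq_len, seq_len))
    dist = np.clip(j - i, -max_dist, max_dist) + max_dist
    return table[dist]

print(relative_position_encoding(seq_len=4, dim=8).shape)  # (4, 4, 8)
```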

LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention

studio-ousia/luke EMNLP 2020

In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer.
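A sketch of LUKE's entity-aware span classification via transformers, assuming the authors' CoNLL-2003 fine-tune on the Hub; candidate entity spans are given as character offsets.

```python
# Classify candidate entity spans with LUKE.
from transformers import LukeForEntitySpanClassification, LukeTokenizer

name = "studio-ousia/luke-large-finetuned-conll-2003"
tokenizer = LukeTokenizer.from_pretrained(name)
model = LukeForEntitySpanClassification.from_pretrained(name)

text = "Beyoncé lives in Los Angeles."
spans = [(0, 7), (17, 28)]  # character spans of candidate entities
inputs = tokenizer(text, entity_spans=spans, return_tensors="pt")
preds = model(**inputs).logits.argmax(-1)[0].tolist()
for (start, end), pred in zip(spans, preds):
    print(text[start:end], model.config.id2label[pred])
```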

A Unified MRC Framework for Named Entity Recognition

ShannonAI/mrc-for-flat-nested-ner ACL 2020

Instead of treating the task of NER as a sequence labeling problem, we propose to formulate it as a machine reading comprehension (MRC) task.
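An illustrative approximation of the MRC reformulation: one extractive-QA query per entity type. This uses a generic SQuAD-style pipeline rather than the paper's trained model, and it returns only the top span per query, whereas the paper's model extracts all spans.

```python
# MRC-style NER sketch: ask one question per entity type, extract a span.
from transformers import pipeline

qa = pipeline("question-answering")  # generic SQuAD-tuned default model
text = "Barack Obama was born in Hawaii and worked in Washington."
queries = {
    "PER": "Which person is mentioned in the text?",
    "LOC": "Which location is mentioned in the text?",
}
for entity_type, question in queries.items():
    answer = qa(question=question, context=text)
    print(entity_type, answer["answer"], round(answer["score"], 3))
```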

CamemBERT: a Tasty French Language Model

huggingface/transformers ACL 2020

We show that the use of web-crawled data is preferable to the use of Wikipedia data.
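A sketch of French NER with a CamemBERT-based model. The checkpoint id below is a community fine-tune on the Hugging Face Hub, not an artifact of the paper.

```python
# French NER with a CamemBERT-based community checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Jean-Baptiste/camembert-ner",  # community fine-tune (assumption)
    aggregation_strategy="simple",
)
print(ner("Emmanuel Macron a visité Marseille hier."))
```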