About

Chinese named entity recognition is a subtask of information extraction that seeks to locate and classify named entities mentioned in unstructured Chinese text into pre-defined categories such as person names, organizations, locations, medical codes, time expressions, quantities, monetary values, and percentages (Source: Adapted from Wikipedia).

Benchmarks


Datasets

Greatest papers with code

TENER: Adapting Transformer Encoder for Named Entity Recognition

10 Nov 2019 HIT-SCIR/ltp

Bidirectional long short-term memory networks (BiLSTM) have been widely used as encoders in models solving the named entity recognition (NER) task; TENER adapts a Transformer encoder for this role instead (see the sketch after this entry).

CHINESE NAMED ENTITY RECOGNITION
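
As a rough illustration of the encoder swap TENER proposes (a Transformer encoder over character embeddings in place of a BiLSTM), here is a minimal PyTorch sketch of a character-level Transformer tagger. It uses the stock nn.TransformerEncoder rather than TENER's adapted relative-position, unscaled attention, and all class names and hyperparameters are illustrative, not taken from the paper's code.

```python
import torch
import torch.nn as nn

class CharTransformerTagger(nn.Module):
    """Character-level Transformer encoder + per-character tag classifier.

    A generic stand-in for the BiLSTM encoder commonly used in Chinese NER;
    TENER itself modifies the attention mechanism, which the stock
    nn.TransformerEncoder below does not do.
    """

    def __init__(self, vocab_size, num_tags, d_model=128, nhead=8, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_tags)

    def forward(self, char_ids, pad_mask):
        # char_ids: (batch, seq_len) integer ids; pad_mask: True where padded.
        x = self.embed(char_ids)
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        return self.classifier(h)          # (batch, seq_len, num_tags) tag logits

# Toy usage: a batch of two "sentences" of random character ids.
model = CharTransformerTagger(vocab_size=5000, num_tags=9)
ids = torch.randint(1, 5000, (2, 10))
mask = torch.zeros(2, 10, dtype=torch.bool)
print(model(ids, mask).shape)              # torch.Size([2, 10, 9])
```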

Chinese NER Using Lattice LSTM

ACL 2018 jiesutd/LatticeLSTM

We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon (see the lattice-matching sketch after this entry).

CHINESE NAMED ENTITY RECOGNITION
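
The lattice in Lattice LSTM is built by matching every lexicon word against the character sequence; each matched span then becomes an extra path alongside the character chain. Below is a minimal, framework-free sketch of that matching step only (the lexicon and sentence are made up for illustration; the repo uses a trie for efficiency):

```python
def build_lattice(chars, lexicon):
    """Return (start, end, word) spans for every lexicon word found in the
    character sequence; the lattice LSTM cells consume these spans in
    addition to the per-character inputs."""
    spans = []
    for start in range(len(chars)):
        for end in range(start + 1, len(chars) + 1):
            word = "".join(chars[start:end])
            if word in lexicon:
                spans.append((start, end - 1, word))  # inclusive indices
    return spans

# Toy example: "南京市长江大桥" (Nanjing Yangtze River Bridge).
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
chars = list("南京市长江大桥")
for start, end, word in build_lattice(chars, lexicon):
    print(start, end, word)
```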

CLUENER2020: Fine-grained Named Entity Recognition Dataset and Benchmark for Chinese

13 Jan 2020 CLUEbenchmark/CLUENER2020

In this paper, we introduce the NER dataset from the CLUE organization (CLUENER2020), a well-defined, fine-grained dataset for named entity recognition in Chinese (a data-loading sketch follows this entry).

CHINESE NAMED ENTITY RECOGNITION
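
CLUENER2020 is distributed as JSON lines; in the released files each record carries a `text` field and a `label` field mapping category → entity string → list of [start, end] character spans with an inclusive end index. Assuming that layout, a small sketch of turning one record into per-character BIO tags might look like this (the sample record below is hand-written in the assumed format, not copied from the dataset):

```python
import json

def record_to_bio(record):
    """Convert one CLUENER2020-style record into per-character BIO tags.
    Assumes label spans are [start, end] character offsets, end inclusive."""
    text = record["text"]
    tags = ["O"] * len(text)
    for category, mentions in record.get("label", {}).items():
        for spans in mentions.values():
            for start, end in spans:
                tags[start] = f"B-{category}"
                for i in range(start + 1, end + 1):
                    tags[i] = f"I-{category}"
    return list(text), tags

# Toy record in the assumed format.
line = '{"text": "浙商银行企业信贷部叶老桂博士", "label": {"company": {"浙商银行": [[0, 3]]}, "name": {"叶老桂": [[9, 11]]}}}'
chars, tags = record_to_bio(json.loads(line))
print(list(zip(chars, tags)))
```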

ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations

2 Nov 2019 sinovation/ZEN

Moreover, it is shown that reasonable performance can be obtained when ZEN is trained on a small corpus, which is important for applying pre-training techniques to scenarios with limited data (an n-gram extraction sketch follows this entry).

CHINESE NAMED ENTITY RECOGNITION · CHINESE WORD SEGMENTATION · DOCUMENT CLASSIFICATION · NATURAL LANGUAGE INFERENCE · PART-OF-SPEECH TAGGING · SENTENCE PAIR MODELING · SENTIMENT ANALYSIS
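
ZEN's central ingredient is an n-gram lexicon extracted from the training corpus, whose matched n-grams are encoded alongside the characters during pre-training. As a rough, simplified illustration of the lexicon-building step only (frequency thresholds and the corpus below are made up; this is not ZEN's actual extraction pipeline), one might do:

```python
from collections import Counter

def extract_ngram_lexicon(sentences, max_n=4, min_freq=2):
    """Count character n-grams (2..max_n) and keep the frequent ones.
    A simplified stand-in for the n-gram lexicon ZEN matches during encoding."""
    counts = Counter()
    for sent in sentences:
        for n in range(2, max_n + 1):
            for i in range(len(sent) - n + 1):
                counts[sent[i:i + n]] += 1
    return {gram for gram, freq in counts.items() if freq >= min_freq}

# Tiny made-up corpus just to show the mechanics.
corpus = ["南京市长江大桥建成", "长江大桥位于南京市"]
print(sorted(extract_ngram_lexicon(corpus)))
```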

FLAT: Chinese NER Using Flat-Lattice Transformer

ACL 2020 LeeSureman/Flat-Lattice-Transformer

Recently, the character-word lattice structure has proved effective for Chinese named entity recognition (NER) by incorporating word information (see the flattening sketch after this entry).

CHINESE NAMED ENTITY RECOGNITION
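
FLAT flattens the character-word lattice into a single sequence of spans, each carrying head and tail position indices: characters get head == tail, while matched lexicon words span several characters, and position encodings built from these indices let a Transformer see the lattice structure. A minimal sketch of that flattening step, reusing a brute-force lexicon match (toy lexicon, illustrative names):

```python
def flatten_lattice(chars, lexicon):
    """Build the flat-lattice input: a list of (token, head, tail) triples.
    Characters come first with head == tail; lexicon matches are appended
    as word tokens spanning [head, tail]."""
    tokens = [(c, i, i) for i, c in enumerate(chars)]
    for start in range(len(chars)):
        for end in range(start + 1, len(chars)):
            word = "".join(chars[start:end + 1])
            if word in lexicon:
                tokens.append((word, start, end))
    return tokens

# Toy example: "重庆人和药店" with an ambiguous segmentation.
lexicon = {"重庆", "重庆人", "人和药店", "药店"}
for token, head, tail in flatten_lattice(list("重庆人和药店"), lexicon):
    print(token, head, tail)
```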

Simplify the Usage of Lexicon in Chinese NER

ACL 2020 v-mipeng/LexiconAugmentedNER

This method avoids designing a complicated sequence modeling architecture; for any neural NER model, it requires only a subtle adjustment of the character representation layer to introduce lexicon information (see the B/M/E/S sketch after this entry).

CHINESE NAMED ENTITY RECOGNITION
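
In the SoftLexicon method this paper proposes, each character collects the matched lexicon words into four sets: words that Begin at it, have it in the Middle, End at it, or exactly match it as a Single character; pooled embeddings of these sets are then concatenated to the character embedding. A small sketch of building those B/M/E/S sets (toy lexicon, no embeddings; not the repo's implementation):

```python
def soft_lexicon_sets(chars, lexicon, max_len=4):
    """For each character, collect matched lexicon words into the four
    SoftLexicon categories: B (word begins here), M (character is inside),
    E (word ends here), S (single-character word)."""
    sets = [{"B": set(), "M": set(), "E": set(), "S": set()} for _ in chars]
    for start in range(len(chars)):
        for end in range(start, min(start + max_len, len(chars))):
            word = "".join(chars[start:end + 1])
            if word not in lexicon:
                continue
            if start == end:
                sets[start]["S"].add(word)
            else:
                sets[start]["B"].add(word)
                sets[end]["E"].add(word)
                for i in range(start + 1, end):
                    sets[i]["M"].add(word)
    return sets

chars = list("南京市长江大桥")
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥", "桥"}
for ch, s in zip(chars, soft_lexicon_sets(chars, lexicon)):
    print(ch, {k: sorted(v) for k, v in s.items()})
```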