Chinese named entity recognition is a subtask of information extraction that seeks to locate and classify named entities mentioned in unstructured Chinese text into pre-defined categories such as person names, organizations, locations, medical codes, time expressions, quantities, monetary values, and percentages (Source: adapted from Wikipedia).
Recently, pre-trained models have achieved state-of-the-art results in various language understanding tasks, which indicates that pre-training on large-scale corpora may play a crucial role in natural language processing.
Ranked #1 on Open-Domain Question Answering on DuReader
Tasks: Chinese Named Entity Recognition, Chinese Reading Comprehension, Chinese Sentence Pair Classification, Chinese Sentiment Analysis, Linguistic Acceptability, Multi-Task Learning, Natural Language Inference, Open-Domain Question Answering, Semantic Textual Similarity, Sentiment Analysis
We present a novel language representation model enhanced by knowledge called ERNIE (Enhanced Representation through kNowledge IntEgration).
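One of ERNIE's knowledge-integration strategies is to mask entities and phrases as whole units during pre-training, rather than masking individual tokens. The sketch below illustrates only that entity-level masking idea on a toy token list; the function name, span format, and masking probability are illustrative assumptions, not ERNIE's actual implementation.

```python
import random

def entity_level_mask(tokens, entity_spans, mask_token="[MASK]", p=0.15, rng=None):
    """Mask whole entity spans as units instead of single tokens.

    tokens:       list of characters/tokens.
    entity_spans: list of (start, end) half-open index pairs marking entities.
    p:            probability of masking each entity span.

    Illustrative sketch of entity-level masking; ERNIE combines this with
    phrase-level and basic token-level masking during pre-training.
    """
    rng = rng or random.Random(0)
    masked = list(tokens)
    for start, end in entity_spans:
        if rng.random() < p:
            # Replace every token in the span, so the model must predict
            # the entity from context rather than from its own pieces.
            for i in range(start, end):
                masked[i] = mask_token
    return masked
```

For example, with `p=1.0` the span `(0, 3)` over the characters of "哈尔滨很冷" ("Harbin is cold") masks the full city name as one unit.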
Ranked #2 on Chinese Sentence Pair Classification on LCQMC Dev
Tasks: Chinese Named Entity Recognition, Chinese Sentence Pair Classification, Chinese Sentiment Analysis, Natural Language Inference, Question Answering, Semantic Similarity, Semantic Textual Similarity, Sentiment Analysis
Bidirectional long short-term memory networks (BiLSTMs) have been widely used as encoders in models for the named entity recognition (NER) task.
Ranked #8 on Chinese Named Entity Recognition on Resume NER
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon.
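The lattice construction step can be sketched as a lexicon lookup: for each character position, enumerate every dictionary word that matches a contiguous span starting there. A lattice LSTM then adds a word cell for each match alongside the character chain. This is a toy, string-level sketch with a made-up lexicon, not the paper's model code.

```python
def lattice_matches(chars, lexicon, max_len=4):
    """Enumerate all lexicon words matching contiguous character spans.

    Returns (start, end, word) triples with half-open indices; each triple
    would become one word cell in the character-word lattice.
    """
    lexicon = set(lexicon)
    matches = []
    for i in range(len(chars)):
        # Only multi-character words (length >= 2) add lattice edges
        # beyond the character chain itself.
        for j in range(i + 2, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                matches.append((i, j, word))
    return matches
```

On the classic example "南京市长江大桥" (Nanjing Yangtze River Bridge), the lattice contains both segmentations, e.g. 南京市/长江大桥 and the spurious 市长 ("mayor"), which the model learns to weigh against each other.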
Ranked #9 on Chinese Named Entity Recognition on Weibo NER
In this paper, we introduce the NER dataset from CLUE organization (CLUENER2020), a well-defined fine-grained dataset for named entity recognition in Chinese.
Moreover, it is shown that reasonable performance can be obtained when ZEN is trained on a small corpus, which is important for applying pre-training techniques to scenarios with limited data.
Ranked #1 on Chinese Part-of-Speech Tagging on CTB5
Recently, the character-word lattice structure has proven effective for Chinese named entity recognition (NER) by incorporating word information.
Ranked #2 on Chinese Named Entity Recognition on MSRA
However, due to the lack of rich pictographic evidence in glyphs and the weak generalization ability of standard computer vision models on character data, an effective way to utilize the glyph information remains to be found.
Ranked #1 on Chinese Sentence Pair Classification on XNLI (Accuracy metric)
Tasks: Chinese Dependency Parsing, Chinese Named Entity Recognition, Chinese Part-of-Speech Tagging, Chinese Semantic Role Labeling, Chinese Sentence Pair Classification, Chinese Word Segmentation, Classification, Dependency Parsing, Document Classification, Image Classification, Language Modelling, Machine Translation, Multi-Task Learning, Part-of-Speech Tagging, Semantic Role Labeling, Semantic Textual Similarity, Sentence Classification, Sentiment Analysis
This method avoids designing a complicated sequence modeling architecture: for any neural NER model, it requires only a subtle adjustment of the character representation layer to introduce the lexicon information.
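The adjustment described above can be sketched as follows: for each character, the matched lexicon words are grouped into four positional sets (B/M/E/S, i.e. the character begins, is inside, ends, or alone forms the word), and these set features are then fused into the character representation. This is a string-level sketch under a toy lexicon; the actual method pools weighted word embeddings rather than word strings.

```python
def soft_lexicon_features(chars, lexicon, max_len=4):
    """Collect, per character, the matched lexicon words into the four
    positional sets B (begin), M (middle), E (end), S (single).

    In the full method these sets are condensed into vectors (via
    frequency-weighted pooling of word embeddings) and concatenated to
    each character embedding before the sequence encoder.
    """
    lexicon = set(lexicon)
    feats = [{"B": set(), "M": set(), "E": set(), "S": set()} for _ in chars]
    for i in range(len(chars)):
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word not in lexicon:
                continue
            if j - i == 1:
                feats[i]["S"].add(word)       # character alone is a word
            else:
                feats[i]["B"].add(word)       # character begins the word
                feats[j - 1]["E"].add(word)   # character ends the word
                for k in range(i + 1, j - 1):
                    feats[k]["M"].add(word)   # character is inside the word
    return feats
```

For instance, on "中山西路" with a lexicon containing 中山, 山西, 西路, and 中山西路, the character 山 simultaneously ends 中山, begins 山西, and sits inside 中山西路; keeping all three roles is what lets the model retain every possible segmentation.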
Ranked #6 on Chinese Named Entity Recognition on Resume NER
However, existing methods for Chinese NER either do not exploit word boundary information from Chinese word segmentation (CWS) or cannot filter out CWS-specific information.
Ranked #1 on Chinese Named Entity Recognition on SighanNER