Low Resource Named Entity Recognition

8 papers with code • 3 benchmarks • 4 datasets

Low-resource named entity recognition is the task of leveraging data and models from a language with ample resources (e.g., English) to solve named entity recognition in another, typically lower-resource, language.
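
In the simplest zero-shot variant, a multilingual encoder is fine-tuned for NER on the high-resource language and then applied unchanged to target-language text. A minimal sketch with Hugging Face transformers; the checkpoint name below is an assumption, and any multilingual NER model fine-tuned on high-resource data would serve the same role:

```python
from transformers import pipeline

# Assumption: a public multilingual encoder (XLM-R) fine-tuned for NER on
# high-resource languages only; swap in any comparable checkpoint.
ner = pipeline(
    "token-classification",
    model="Davlan/xlm-roberta-base-ner-hrl",
    aggregation_strategy="simple",
)

# Zero-shot transfer: the model saw no annotated data for the target
# language, but the shared multilingual representations still let it tag it.
print(ner("Sipho uhlala eGoli usebenzela i-Eskom."))  # isiZulu example
```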

Most implemented papers

Towards Robust Named Entity Recognition for Historic German

stefan-it/historic-ner WS 2019

Recent advances in language modeling using deep neural networks have shown that these models learn representations that vary with network depth, from morphology to semantic relationships such as co-reference.
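
That layer-depth observation can be inspected directly by reading out each layer's hidden states. The paper itself builds on character-level Flair language models; a BERT-style encoder is used below purely to illustrate layer-wise representations, and the checkpoint name is illustrative:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumption: any BERT-style encoder works; this German checkpoint is illustrative.
tok = AutoTokenizer.from_pretrained("bert-base-german-cased")
model = AutoModel.from_pretrained("bert-base-german-cased", output_hidden_states=True)

enc = tok("Friedrich Schiller wurde in Marbach geboren.", return_tensors="pt")
with torch.no_grad():
    out = model(**enc)

# out.hidden_states is a tuple: the embedding layer plus one tensor per
# transformer layer. Lower layers tend to capture morphology and syntax,
# upper layers more semantic relations; a tagger can pick or combine layers.
for i, h in enumerate(out.hidden_states):
    print(f"layer {i}: {tuple(h.shape)}")
```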

Massively Multilingual Transfer for NER

afshinrahimi/mmner ACL 2019

In cross-lingual transfer, NLP models over one or more source languages are applied to a low-resource target language.
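
A simple baseline in this multi-source setting is to ensemble the per-token label distributions of several source-language taggers; the paper studies more refined schemes that weight source models by estimated reliability. A toy sketch with made-up numbers:

```python
import numpy as np

# Per-token label distributions from three hypothetical source-language
# taggers, shape (num_tokens, num_labels); values are made up for illustration.
labels = ["O", "B-PER", "B-LOC"]
preds = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]],   # tagger trained on source language A
    [[0.6, 0.1, 0.3], [0.2, 0.7, 0.1]],   # source language B
    [[0.5, 0.3, 0.2], [0.3, 0.5, 0.2]],   # source language C
])

# Uniform ensemble: average the distributions, then take the argmax per token.
avg = preds.mean(axis=0)
print([labels[i] for i in avg.argmax(axis=1)])  # -> ['O', 'B-PER']
```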

Feature-Dependent Confusion Matrices for Low-Resource NER Labeling with Noisy Labels

uds-lsv/noise-matrix-ner IJCNLP 2019

In low-resource settings, the performance of supervised labeling models can be improved with automatically annotated or distantly supervised data, which is cheap to create but often noisy.
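
A common way to model such label noise is a confusion-matrix layer between the classifier's clean predictions and the observed noisy labels; the paper's refinement is to condition that matrix on token features. A minimal PyTorch sketch of the global, feature-independent variant:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyChannel(nn.Module):
    """Generic noise-adaptation layer: p(noisy) = p(clean) @ C, where C is a
    learned row-stochastic confusion matrix. The paper makes C depend on
    token features; this sketch uses a single global matrix."""

    def __init__(self, num_labels: int):
        super().__init__()
        # Initialise near the identity so training starts from "labels are clean".
        self.logits = nn.Parameter(torch.eye(num_labels) * 5.0)

    def forward(self, clean_probs: torch.Tensor) -> torch.Tensor:
        confusion = F.softmax(self.logits, dim=-1)  # rows sum to 1
        return clean_probs @ confusion              # distribution over noisy labels

# Usage: train on distantly supervised labels through the channel,
# evaluate the underlying clean predictions on gold data.
channel = NoisyChannel(num_labels=5)
clean_probs = F.softmax(torch.randn(8, 5), dim=-1)  # stand-in model outputs
noisy_probs = channel(clean_probs)
loss = F.nll_loss(noisy_probs.log(), torch.randint(0, 5, (8,)))
```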

Zero-Resource Cross-Lingual Named Entity Recognition

ntunlp/Zero-Shot-Cross-Lingual-NER 22 Nov 2019

Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features.

Soft Gazetteers for Low-Resource Named Entity Recognition

neulab/soft-gazetteers ACL 2020

Designing features such as entity gazetteers for low-resource languages is challenging, because exhaustive gazetteers do not exist in these languages.
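
The contrast the paper draws is between hard, exact-match gazetteer features and soft features scored by cross-lingual entity-linking candidates. A toy sketch, with the linker scores as hypothetical stand-ins for a real candidate-retrieval system:

```python
# Hard gazetteer feature: 1 if the span exactly matches a gazetteer entry.
gazetteer = {"lagos": "LOC", "buhari": "PER"}

def hard_feature(span: str) -> float:
    return 1.0 if span.lower() in gazetteer else 0.0

# Soft gazetteer feature (the paper's idea, simplified): score each span by
# the confidence of cross-lingual entity-linking candidates, so aliases and
# unseen surface forms still receive signal.
def soft_feature(span: str, linker_scores: dict[str, float]) -> float:
    return max(linker_scores.values(), default=0.0)

print(hard_feature("Lagos"))                         # 1.0, exact match
print(hard_feature("Eko"))                           # 0.0, alias misses the gazetteer
print(soft_feature("Eko", {"Q8673 (Lagos)": 0.83}))  # 0.83, linker recovers it
```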

A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition

houking-can/RDANER 2 Jan 2021

Building reliable named entity recognition (NER) systems with limited annotated data has recently attracted much attention.

ANEA: Distant Supervision for Low-Resource Named Entity Recognition

uds-lsv/anea 25 Feb 2021

Distant supervision allows obtaining labeled training corpora for low-resource settings where only limited hand-annotated data exists.
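
The core mechanism is projecting an entity lexicon onto raw text to obtain (noisy) BIO tags; ANEA additionally extracts and refines such lexicons automatically, which this sketch does not cover. The lexicon entries below are illustrative:

```python
# Minimal distant-supervision sketch: match a lexicon of known entities
# against raw tokens to produce automatically annotated training data.
lexicon = {("addis", "ababa"): "LOC", ("abiy", "ahmed"): "PER"}

def distant_label(tokens: list[str]) -> list[str]:
    tags = ["O"] * len(tokens)
    lowered = [t.lower() for t in tokens]
    for start in range(len(tokens)):
        for entry, etype in lexicon.items():
            end = start + len(entry)
            if tuple(lowered[start:end]) == entry:
                tags[start] = f"B-{etype}"
                for i in range(start + 1, end):
                    tags[i] = f"I-{etype}"
    return tags

print(distant_label("Abiy Ahmed visited Addis Ababa .".split()))
# ['B-PER', 'I-PER', 'O', 'B-LOC', 'I-LOC', 'O']
```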

A Comparative Study of Pre-trained Encoders for Low-Resource Named Entity Recognition

dfki-nlp/fewie RepL4NLP (ACL) 2022

Pre-trained language models (PLMs) are effective components of few-shot named entity recognition (NER) approaches when augmented with continued pre-training on task-specific out-of-domain data or with fine-tuning on in-domain data.
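
A minimal fine-tuning step for token classification, of the kind such encoder comparisons run per checkpoint; the checkpoint, label set, and single training example below are all assumptions:

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Assumption: encoder and label set are illustrative; the study compares
# many such encoders under identical few-shot conditions.
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]
tok = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)

# One in-domain example; a few-shot run would loop over a handful of these.
enc = tok("Ada Lovelace lived in London".split(),
          is_split_into_words=True, return_tensors="pt")
word_labels = [1, 2, 0, 0, 3]  # B-PER I-PER O O B-LOC, one label per word
token_labels = [-100 if i is None else word_labels[i]
                for i in enc.word_ids()]  # -100 masks special tokens;
                                          # subword pieces inherit the word label

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**enc, labels=torch.tensor([token_labels])).loss
loss.backward()
optimizer.step()
print(float(loss))
```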