Low Resource Named Entity Recognition

14 papers with code • 3 benchmarks • 4 datasets

Low-resource named entity recognition is the task of leveraging data and models from a language with ample resources (e.g., English) to solve named entity recognition in another, typically lower-resource, language.
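One common instantiation of this transfer is annotation projection: tag the high-resource side of a parallel sentence pair, then copy the tags to the target side through a word alignment. A minimal sketch in plain Python, with a hand-written alignment standing in for the output of a real alignment tool such as fast_align (the tokens and alignment are invented for illustration):

```python
# toy annotation projection: copy NER tags from a tagged source sentence to its
# target-language translation through a word alignment (hand-written here, an
# assumption; real pipelines obtain it from an alignment tool)
src_tags = ["B-PER", "O", "B-LOC"]      # tags for e.g. ["Obama", "visited", "Paris"]
alignment = {0: 0, 1: 2, 2: 1}          # source token index -> target token index

def project(src_tags, alignment, tgt_len):
    tgt_tags = ["O"] * tgt_len          # unaligned target tokens default to O
    for s, t in alignment.items():
        tgt_tags[t] = src_tags[s]
    return tgt_tags

projected = project(src_tags, alignment, tgt_len=3)  # ["B-PER", "B-LOC", "O"]
```

The projected tags then serve as (noisy) training data for a tagger in the target language.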

Most implemented papers

Towards Robust Named Entity Recognition for Historic German

stefan-it/historic-ner WS 2019

Recent advances in language modeling using deep neural networks have shown that these models learn representations that vary with network depth, ranging from morphology to semantic relationships such as co-reference.

Massively Multilingual Transfer for NER

afshinrahimi/mmner ACL 2019

In cross-lingual transfer, NLP models over one or more source languages are applied to a low-resource target language.

Feature-Dependent Confusion Matrices for Low-Resource NER Labeling with Noisy Labels

uds-lsv/noise-matrix-ner IJCNLP 2019

In low-resource settings, the performance of supervised labeling models can be improved with automatically annotated or distantly supervised data, which is cheap to create but often noisy.
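The core idea can be sketched without any neural model: estimate a confusion matrix P(noisy label | true label) from a small subset annotated with both gold and automatic labels, then use it to reweight the noisy training signal. A toy, feature-independent version in plain Python (the label sequences are made up for illustration; the paper's contribution is making this matrix depend on token features):

```python
labels = ["O", "PER", "LOC"]
# toy subset carrying both gold and distantly supervised labels
gold    = ["PER", "O", "LOC", "O", "PER", "LOC", "O", "O"]
distant = ["PER", "O", "O",   "O", "LOC", "LOC", "O", "PER"]

idx = {l: i for i, l in enumerate(labels)}
counts = [[0] * len(labels) for _ in labels]
for g, d in zip(gold, distant):
    counts[idx[g]][idx[d]] += 1

# row-normalise: confusion[i][j] estimates P(noisy label j | true label i)
confusion = [[c / sum(row) for c in row] for row in counts]
```

Each row of `confusion` sums to one; for instance, gold `PER` tokens here are kept as `PER` half of the time and corrupted to `LOC` the other half.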

Zero-Resource Cross-Lingual Named Entity Recognition

ntunlp/Zero-Shot-Cross-Lingual-NER 22 Nov 2019

Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features.

Soft Gazetteers for Low-Resource Named Entity Recognition

neulab/soft-gazetteers ACL 2020

However, designing such features for low-resource languages is challenging, because exhaustive entity gazetteers do not exist in these languages.
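The soft-gazetteer idea replaces hard dictionary lookups with continuous membership scores, derived in the paper from cross-lingual entity-linking candidates. A minimal sketch, where the gazetteer entries and scores are invented for illustration:

```python
# hypothetical soft gazetteer: continuous type scores instead of exact matches
# (scores invented here; the paper derives them from entity-linking candidates)
gazetteer = {"paris": {"LOC": 0.9, "PER": 0.1}, "obama": {"PER": 1.0}}
ENTITY_TYPES = ("PER", "LOC")

def soft_gazetteer_features(tokens):
    """One feature vector per token: a score per entity type, 0.0 if unknown."""
    feats = []
    for tok in tokens:
        scores = gazetteer.get(tok.lower(), {})
        feats.append([scores.get(t, 0.0) for t in ENTITY_TYPES])
    return feats

feats = soft_gazetteer_features(["Obama", "visited", "Paris"])
# feats -> [[1.0, 0.0], [0.0, 0.0], [0.1, 0.9]]
```

These vectors are then concatenated to the token representations fed into the NER model, so partial or ambiguous gazetteer evidence still contributes signal.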

A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition

houking-can/RDANER 2 Jan 2021

Building reliable named entity recognition (NER) systems with limited annotated data has recently attracted much attention.

ANEA: Distant Supervision for Low-Resource Named Entity Recognition

uds-lsv/anea 25 Feb 2021

Distant supervision allows obtaining labeled training corpora for low-resource settings where only limited hand-annotated data exists.
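In its simplest form, distant supervision for NER matches spans of raw text against an entity lexicon and emits BIO tags automatically. A toy sketch (the lexicon entries are hypothetical; ANEA additionally automates extracting and refining such lexicons):

```python
# hypothetical distant-supervision lexicon: surface form -> entity type
entity_list = {"berlin": "LOC", "angela merkel": "PER"}

def distant_label(tokens, max_span=3):
    """Greedy longest-match labelling of token spans against the lexicon."""
    labels = ["O"] * len(tokens)
    n = len(tokens)
    for i in range(n):
        for j in range(min(n, i + max_span), i, -1):  # try longest span first
            span = " ".join(tokens[i:j]).lower()
            if span in entity_list and all(l == "O" for l in labels[i:j]):
                etype = entity_list[span]
                labels[i] = "B-" + etype
                for k in range(i + 1, j):
                    labels[k] = "I-" + etype
                break
    return labels

tags = distant_label(["Angela", "Merkel", "visited", "Berlin"])
# tags -> ["B-PER", "I-PER", "O", "B-LOC"]
```

The resulting labels are cheap but noisy (e.g. ambiguous surface forms get a single type), which is exactly the noise that methods like the confusion-matrix approach above aim to model.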

Memorisation versus Generalisation in Pre-trained Language Models

Michael-Tanzer/BERT-mem-lowres ACL 2022

State-of-the-art pre-trained language models have been shown to memorise facts and perform well with limited amounts of training data.

Data Augmentation for Low-Resource Named Entity Recognition Using Backtranslation

RussianNLP/TAPE ICON 2021

State-of-the-art natural language processing systems rely on sizable training datasets to achieve high performance.
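Backtranslation augments training data by translating a sentence into a pivot language and back, producing paraphrases; for NER the entity mentions are typically held fixed so their labels remain valid. A toy sketch where a pair of word tables stands in for two real MT systems (an assumption purely for illustration):

```python
# toy round-trip "translation" tables standing in for two real MT systems
# (an assumption for illustration; real pipelines use trained translators)
to_fr = {"the": "le", "spoke": "a parlé"}
to_en = {"le": "the", "a parlé": "talked"}  # round trip yields a paraphrase

def backtranslate(tokens, entity_mentions):
    """Round-trip context words; keep entity mentions fixed so tags stay valid."""
    out = []
    for tok in tokens:
        if tok in entity_mentions:
            out.append(tok)          # entity surface form (and its label) preserved
        else:
            pivot = to_fr.get(tok.lower(), tok.lower())
            out.append(to_en.get(pivot, pivot))
    return out

augmented = backtranslate(["Obama", "spoke"], {"Obama"})  # ["Obama", "talked"]
```

The augmented sentence reuses the original label sequence, since only non-entity context words were paraphrased.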

Low-Resource Named Entity Recognition Based on Multi-hop Dependency Trigger

wjx-git/deptriggerner CCL 2022

This paper presents a simple and effective approach to low-resource named entity recognition (NER) based on multi-hop dependency triggers.