Cross-Lingual NER

11 papers with code • 8 benchmarks • 4 datasets

Cross-lingual NER trains a named entity recognition model on one or more source languages with annotated data (most often English) and applies it to a target language with little or no labeled data, typically via multilingual representations, translation-based annotation projection, or model transfer.

Most implemented papers

ByT5: Towards a token-free future with pre-trained byte-to-byte models

google-research/byt5 28 May 2021

In this paper, we show that a standard Transformer architecture can be used with minimal modifications to process byte sequences.
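As an illustration of the byte-level input format, here is a minimal sketch assuming the HuggingFace transformers library and the publicly released google/byt5-small checkpoint (not code from the paper):

```python
# Minimal sketch of byte-level tokenization with ByT5, assuming the HuggingFace
# `transformers` library and the public google/byt5-small checkpoint.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/byt5-small")
model = T5ForConditionalGeneration.from_pretrained("google/byt5-small")

# ByT5 has no learned subword vocabulary: each UTF-8 byte maps directly to an
# id (plus a small offset for special tokens), so any language or script works.
text = "Víctor lives in São Paulo."
inputs = tokenizer(text, return_tensors="pt")
print(inputs["input_ids"].shape)  # one id per UTF-8 byte, plus the </s> marker
```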

Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT

shijie-wu/crosslingual-nlp IJCNLP 2019

Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks.
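A minimal sketch of the zero-shot transfer setup the paper studies, assuming the HuggingFace transformers library and a CoNLL-style label set (both are illustrative assumptions, not the authors' code):

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Illustrative CoNLL-style tagset; the paper evaluates on standard NER benchmarks.
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=len(labels)
)

# After fine-tuning on English NER data (not shown), the same weights are
# applied unchanged to target-language text, e.g. Spanish:
encoded = tokenizer("Gabriel García Márquez nació en Aracataca.", return_tensors="pt")
logits = model(**encoded).logits  # shape: (1, seq_len, num_labels)
```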

Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework

thespectrewithin/joint-align ICLR 2020

Learning multilingual representations of text has proven a successful method for many cross-lingual transfer learning tasks.
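For context, the "alignment" side of the comparison is often implemented as an orthogonal (Procrustes) mapping between monolingual embedding spaces; the sketch below illustrates that idea with NumPy on placeholder vectors, not the paper's exact setup:

```python
import numpy as np

def procrustes_align(src_vecs: np.ndarray, tgt_vecs: np.ndarray) -> np.ndarray:
    """Return the orthogonal W minimizing ||src_vecs @ W - tgt_vecs||_F."""
    u, _, vt = np.linalg.svd(src_vecs.T @ tgt_vecs)
    return u @ vt

# 5,000 dictionary pairs of 300-d embeddings (random placeholders here; in
# practice these come from a bilingual dictionary over pretrained embeddings).
src = np.random.randn(5000, 300)
tgt = np.random.randn(5000, 300)
W = procrustes_align(src, tgt)
aligned_src = src @ W  # source embeddings mapped into the target space
```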

Multi-Source Cross-Lingual Model Transfer: Learning What to Share

microsoft/Multilingual-Model-Transfer ACL 2019

In this work, we focus on the multilingual transfer setting where training data in multiple source languages is leveraged to further boost target language performance.

Entity Projection via Machine Translation for Cross-Lingual NER

alankarj/cross_lingual_ner IJCNLP 2019

Although over 100 languages are supported by strong off-the-shelf machine translation systems, only a subset of them possess large annotated corpora for named entity recognition.
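The projection step can be pictured with a small sketch: given entity spans on the source side and word alignments from a translation system, each span is carried over to the translated sentence. The alignments below are hand-written placeholders, not output from the paper's pipeline:

```python
def project_entities(entities, alignments):
    """entities: list of (start, end, label) token spans on the source side
    (end exclusive); alignments: set of (src_idx, tgt_idx) word-alignment pairs."""
    projected = []
    for start, end, label in entities:
        tgt_positions = sorted(t for s, t in alignments if start <= s < end)
        if tgt_positions:
            projected.append((tgt_positions[0], tgt_positions[-1] + 1, label))
    return projected

# "Angela Merkel visited Paris"  ->  "Angela Merkel besuchte Paris"
entities = [(0, 2, "PER"), (3, 4, "LOC")]
alignments = {(0, 0), (1, 1), (2, 2), (3, 3)}
print(project_entities(entities, alignments))  # [(0, 2, 'PER'), (3, 4, 'LOC')]
```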

Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources

microsoft/vert-papers 14 Nov 2019

For languages with no annotated resources, transferring knowledge from rich-resource languages is an effective solution for named entity recognition (NER).

Zero-Resource Cross-Lingual Named Entity Recognition

ntunlp/Zero-Shot-Cross-Lingual-NER 22 Nov 2019

Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features.

Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language

microsoft/vert-papers ACL 2020

Existing cross-lingual NER methods, however, either are not applicable when labeled source-language data is unavailable, or do not leverage the information contained in unlabeled target-language data.
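The core of the teacher-student recipe is a distillation loss on unlabeled target-language text: a teacher fine-tuned on the source language produces soft token-level label distributions, and the student is trained to match them. A minimal PyTorch sketch with random placeholder logits (not the authors' implementation):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=1.0):
    """KL divergence between teacher and student per-token label distributions."""
    t = temperature
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)

# Placeholder logits: batch of 2 target-language sentences, 6 tokens, 9 NER labels.
student_logits = torch.randn(2, 6, 9, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(2, 6, 9)  # from the source-trained teacher

loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```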

UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data

microsoft/vert-papers 15 Jul 2020

Prior works in cross-lingual named entity recognition (NER) with no/little labeled data fall into two primary categories: model transfer based and data transfer based methods.

Semi-Supervised Disentangled Framework for Transferable Named Entity Recognition

DMIRLAB-Group/SSD 22 Dec 2020

In the proposed framework, domain-specific information is encoded into domain-specific latent variables with the help of a domain predictor.
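As a rough illustration of that component, the sketch below wires a latent code to a domain classifier in PyTorch; the dimensions, labels, and loss weighting are assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DomainPredictor(nn.Module):
    """Classifies which domain (e.g. source vs. target) a latent code comes from."""
    def __init__(self, latent_dim=128, num_domains=2):
        super().__init__()
        self.classifier = nn.Linear(latent_dim, num_domains)

    def forward(self, domain_latent):
        return self.classifier(domain_latent)

predictor = DomainPredictor()
domain_latent = torch.randn(4, 128)          # domain-specific latent variables
domain_labels = torch.tensor([0, 0, 1, 1])   # 0 = source domain, 1 = target domain

# Training against this loss pushes domain information into the latent variables.
loss = F.cross_entropy(predictor(domain_latent), domain_labels)
```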