Cross-Lingual Transfer

291 papers with code • 1 benchmark • 16 datasets

Cross-lingual transfer refers to transfer learning that uses the data and models of a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
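
A minimal sketch of this setup, assuming the Hugging Face Transformers and PyTorch libraries: fine-tune a multilingual encoder (XLM-R here) on English labels only, then run it unchanged on another language. The toy data and hyperparameters are illustrative and not taken from any paper below.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2
)

# Supervised data exists only in the high-resource language (English).
train_texts = ["The movie was wonderful.", "A dull, lifeless film."]
train_labels = torch.tensor([1, 0])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few toy epochs on the toy batch
    batch = tokenizer(train_texts, padding=True, return_tensors="pt")
    loss = model(**batch, labels=train_labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot evaluation on a lower-resource target language (Spanish here):
# the shared multilingual representations carry the task across languages.
model.eval()
test = tokenizer("Una película maravillosa.", return_tensors="pt")
with torch.no_grad():
    print(model(**test).logits.argmax(-1))  # ideally 1 (positive)
```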

Most implemented papers

Multilingual Evidence Retrieval and Fact Verification to Combat Global Disinformation: The Power of Polyglotism

D-Roberts/multilingual_nli_ECIR2021 16 Dec 2020

This article investigates multilingual evidence retrieval and fact verification as a step toward combating global disinformation; to the best of our knowledge, this is the first effort of its kind.
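
The verification step can be sketched with a multilingual NLI model, so the retrieved evidence and the claim need not share a language. The public XNLI checkpoint named below is an illustrative assumption, not necessarily the model used in the paper.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "joeddav/xlm-roberta-large-xnli"  # illustrative public checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).eval()

evidence = "Der Eiffelturm wurde 1889 fertiggestellt."  # German evidence
claim = "The Eiffel Tower was completed in 1889."       # English claim

inputs = tokenizer(evidence, claim, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1).squeeze()

# id2label maps class indices to NLI labels for this checkpoint.
for label, p in zip(model.config.id2label.values(), probs):
    print(f"{label}: {p:.2f}")  # entailment should dominate -> SUPPORTED
```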

UNKs Everywhere: Adapting Multilingual Language Models to New Scripts

adapter-hub/unks_everywhere EMNLP 2021

The ultimate challenge is dealing with under-resourced languages not covered at all by the models and written in scripts unseen during pretraining.
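
One ingredient of such adaptation can be sketched as follows, assuming Transformers: add new (sub)word units for the unseen script to the tokenizer and grow the embedding matrix so they can be trained. The paper's full method additionally learns dedicated embeddings and adapters (see the repo above); the new tokens below are hypothetical.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Hypothetical tokens for a script the tokenizer otherwise maps to [UNK]
# (here, a few Lisu characters).
new_tokens = ["ꓐ", "ꓑ", "ꓒ"]
added = tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))
print(f"added {added} tokens; new vocab size = {len(tokenizer)}")

# The new embedding rows are randomly initialized; they would then be
# trained with masked language modeling on target-language text.
```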

MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning

microsoft/MetaXL NAACL 2021

Extensive experiments on real-world low-resource languages - without access to large-scale monolingual corpora or large amounts of labeled data - for tasks like cross-lingual sentiment analysis and named entity recognition show the effectiveness of our approach.

A cost-benefit analysis of cross-lingual transfer methods

unicamp-dl/cross-lingual-analysis 14 May 2021

An effective method for cross-lingual transfer is to fine-tune a bilingual or multilingual model on a supervised dataset in one language and evaluate it on another language in a zero-shot manner.
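
A sketch of that evaluation protocol, assuming the Transformers and Datasets libraries and a public XNLI checkpoint as an illustrative stand-in for the paper's models: measure accuracy on several XNLI languages with no target-language training.

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "joeddav/xlm-roberta-large-xnli"  # illustrative public checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).eval()

# XNLI dataset labels: 0 = entailment, 1 = neutral, 2 = contradiction.
# Map them onto this checkpoint's own indices (label names assumed lowercase).
to_model_id = {0: model.config.label2id["entailment"],
               1: model.config.label2id["neutral"],
               2: model.config.label2id["contradiction"]}

for lang in ["en", "es", "sw"]:  # high- to lower-resource targets
    ds = load_dataset("xnli", lang, split="test[:100]")  # small slice
    correct = 0
    for ex in ds:
        inputs = tokenizer(ex["premise"], ex["hypothesis"],
                           truncation=True, return_tensors="pt")
        with torch.no_grad():
            pred = model(**inputs).logits.argmax(-1).item()
        correct += int(pred == to_model_id[ex["label"]])
    print(f"{lang}: zero-shot accuracy = {correct / len(ds):.2f}")
```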

Simple and Effective Zero-shot Cross-lingual Phoneme Recognition

facebookresearch/fairseq 23 Sep 2021

Recent progress in self-training, self-supervised pretraining, and unsupervised learning has enabled well-performing speech recognition systems without any labeled data.
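
A sketch of inference with the multilingual phoneme-recognition checkpoint released alongside this paper on the Hugging Face hub (the checkpoint name and the 16 kHz input file are assumptions to verify): the model emits language-independent phoneme tokens, so it can transcribe languages unseen during fine-tuning.

```python
import torch
import soundfile as sf
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

name = "facebook/wav2vec2-xlsr-53-espeak-cv-ft"
processor = Wav2Vec2Processor.from_pretrained(name)
model = Wav2Vec2ForCTC.from_pretrained(name).eval()

# Hypothetical input file; assumed to be 16 kHz mono audio.
speech, rate = sf.read("utterance.wav")
inputs = processor(speech, sampling_rate=rate, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids))  # IPA-like phoneme sequence
```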

Call Larisa Ivanovna: Code-Switching Fools Multilingual NLU Models

PragmaticsLab/CodeSwitchingAdversarial 29 Sep 2021

This is in line with the common understanding of how multilingual models transfer knowledge between languages.

K-Wav2vec 2.0: Automatic Speech Recognition based on Joint Decoding of Graphemes and Syllables

joungheekim/k-wav2vec 11 Oct 2021

Wav2vec 2.0 is an end-to-end self-supervised learning framework for speech representations that has been successful in automatic speech recognition (ASR), but most work on the topic has been developed for a single language: English.

mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models

studio-ousia/luke ACL 2022

We train a multilingual language model covering 24 languages with entity representations and show that the model consistently outperforms word-based pretrained models on various cross-lingual transfer tasks.
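
A sketch of querying the released checkpoint's entity representations through Transformers; the example sentence and character spans are illustrative.

```python
import torch
from transformers import MLukeTokenizer, LukeModel

tokenizer = MLukeTokenizer.from_pretrained("studio-ousia/mluke-base")
model = LukeModel.from_pretrained("studio-ousia/mluke-base").eval()

text = "Beyoncé lives in Los Angeles."
entity_spans = [(0, 7), (17, 28)]  # character spans of the two entities
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextualized vector per marked entity, alongside the word outputs.
print(outputs.entity_last_hidden_state.shape)  # (1, 2, hidden_size)
```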

When is BERT Multilingual? Isolating Crucial Ingredients for Cross-lingual Transfer

princeton-nlp/MultilingualAnalysis NAACL 2022

While recent work on multilingual language models has demonstrated their capacity for cross-lingual zero-shot transfer on downstream tasks, there is a lack of consensus in the community as to what shared properties between languages enable such transfer.