Zero-Shot Cross-Lingual Transfer

38 papers with code • 2 benchmarks • 4 datasets

Zero-shot cross-lingual transfer trains a model on labeled data in one or more source languages (typically English) and evaluates it directly on target languages for which no task-specific training data is available, relying on shared multilingual representations learned from pretraining or parallel data.


Most implemented papers

Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond

facebookresearch/LASER • TACL 2019

We introduce an architecture to learn joint multilingual sentence representations for 93 languages, belonging to more than 30 different families and written in 28 different scripts.
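
As a quick sketch of what the shared LASER space gives you: the community `laserembeddings` package (an assumption; the official facebookresearch/LASER repo ships its own embedding scripts) embeds sentences from any supported language into one 1024-dimensional space, so translations land close together.

```python
# Minimal sketch using the `laserembeddings` package. Assumes
# `pip install laserembeddings` and a prior
# `python -m laserembeddings download-models` to fetch model files.
import numpy as np
from laserembeddings import Laser

laser = Laser()

# The same sentence in three languages, embedded into one shared space.
embs = np.vstack([
    laser.embed_sentences(["The cat sleeps."], lang="en"),
    laser.embed_sentences(["Le chat dort."], lang="fr"),
    laser.embed_sentences(["Die Katze schläft."], lang="de"),
])

# Cosine similarities: translations should score close to each other.
embs /= np.linalg.norm(embs, axis=1, keepdims=True)
print(embs[0] @ embs[1], embs[0] @ embs[2])
```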

Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT

shijie-wu/crosslingual-nlp • IJCNLP 2019

Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks.
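
The finding here is that multilingual BERT fine-tuned on English labels alone transfers surprisingly well to other languages. A minimal sketch of that recipe, assuming Hugging Face `transformers` (not the paper's own codebase) and placeholder data:

```python
# Fine-tune multilingual BERT on English labels, then evaluate directly
# on a target language with no target-language training data.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=3
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def step(texts, labels):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    loss = model(**batch, labels=torch.tensor(labels)).loss
    loss.backward(); optimizer.step(); optimizer.zero_grad()

# 1) Train on English examples only (toy placeholder batch).
step(["A man is eating.", "Nobody is eating."], [0, 2])

# 2) Zero-shot: predict on a language the model never saw labels for.
model.eval()
with torch.no_grad():
    batch = tokenizer(["Un homme mange."], return_tensors="pt")
    print(model(**batch).logits.argmax(-1))
```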

XLM-E: Cross-lingual Language Model Pre-training via ELECTRA

microsoft/unilm • ACL 2022

In this paper, we introduce ELECTRA-style tasks to cross-lingual language model pre-training.
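
A toy sketch of the ELECTRA-style replaced-token-detection objective, with tiny stand-in networks rather than the paper's transformers; only the loss structure is meant to be faithful:

```python
# Replaced-token detection: a small generator proposes tokens at masked
# positions, and the discriminator tags every token as original or
# replaced. Toy vocab/width; modules are illustrative stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

V, D = 1000, 64
generator = nn.Sequential(nn.Embedding(V, D), nn.Linear(D, V))
discriminator = nn.Sequential(nn.Embedding(V, D), nn.Linear(D, 1))

tokens = torch.randint(0, V, (2, 16))   # (batch, seq)
mask = torch.rand(2, 16) < 0.15         # positions to corrupt

# Generator samples replacements at masked positions.
with torch.no_grad():
    samples = torch.distributions.Categorical(logits=generator(tokens)).sample()
corrupted = torch.where(mask, samples, tokens)

# A sampled token equal to the original counts as "original", as in ELECTRA.
labels = (corrupted != tokens).float()
logits = discriminator(corrupted).squeeze(-1)
loss = F.binary_cross_entropy_with_logits(logits, labels)
print(loss.item())
```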

Simple and Effective Zero-shot Cross-lingual Phoneme Recognition

facebookresearch/fairseq • 23 Sep 2021

Recent progress in self-training, self-supervised pretraining, and unsupervised learning has enabled well-performing speech recognition systems without any labeled data.
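
The released model transcribes speech from unseen languages into a shared articulatory phoneme space. A sketch via `transformers`, assuming the `facebook/wav2vec2-lv-60-espeak-cv-ft` checkpoint name on the Hugging Face hub (verify before relying on it):

```python
# Zero-shot phoneme transcription with a wav2vec 2.0 phoneme model.
# The checkpoint name is an assumption based on the model released
# alongside this line of work.
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

name = "facebook/wav2vec2-lv-60-espeak-cv-ft"
processor = AutoProcessor.from_pretrained(name)
model = Wav2Vec2ForCTC.from_pretrained(name)

# `audio` must be a 16 kHz mono waveform; one second of silence here.
audio = torch.zeros(16000)
inputs = processor(audio.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding to a sequence of IPA phoneme symbols.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids))
```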

Visually Grounded Reasoning across Languages and Cultures

e-bug/volta • EMNLP 2021

The design of widespread vision-and-language datasets and pre-trained encoders directly adopts, or draws inspiration from, the concepts and images of ImageNet.

Adversarial Propagation and Zero-Shot Cross-Lingual Transfer of Word Vector Specialization

cambridgeltl/adversarial-postspec • EMNLP 2018

Our adversarial post-specialization method propagates the external lexical knowledge to the full distributional space.
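
A compact sketch of that propagation, under simplified losses: a mapper G is trained on words that already have specialized vectors, with an adversarial discriminator D plus a reconstruction term, and is then applied to every word in the vocabulary. Dimensions and data are toy placeholders.

```python
# Adversarial mapping from the distributional space to the specialized
# space, then propagation to words outside the specialization lexicon.
import torch
import torch.nn as nn

d = 300
G = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))      # mapper
D = nn.Sequential(nn.Linear(d, d // 2), nn.ReLU(), nn.Linear(d // 2, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

x = torch.randn(32, d)   # distributional vectors of words in the lexicon
y = torch.randn(32, d)   # their specialized counterparts (toy data)

# Discriminator step: real specialized vectors vs. mapped ones.
d_loss = bce(D(y), torch.ones(32, 1)) + bce(D(G(x).detach()), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Mapper step: fool D while staying close to the specialized targets.
g_loss = bce(D(G(x)), torch.ones(32, 1)) + (G(x) - y).pow(2).mean()
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Propagation: apply G to vectors of words absent from the lexicon.
specialized_unseen = G(torch.randn(5, d))
```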

Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing

WangYuxuan93/CLBT • IJCNLP 2019

In this approach, a linear transformation is learned from contextual word alignments to align the contextualized embeddings independently trained in different languages.
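
At its core this is a supervised mapping problem over aligned word pairs. A sketch with random stand-in data, solving the orthogonal-Procrustes variant in closed form (the paper also considers non-orthogonal transforms trained by gradient descent):

```python
# Learn a linear map W that brings source-language contextual embeddings
# into the target-language space, given word-aligned pairs. Random toy
# data stands in for real contextual vectors.
import numpy as np

rng = np.random.default_rng(0)
src = rng.normal(size=(5000, 768))   # source-side contextual vectors
tgt = rng.normal(size=(5000, 768))   # aligned target-side vectors

# W = argmin ||src @ W - tgt||_F subject to W orthogonal (Procrustes):
# with src.T @ tgt = U S V^T, the solution is W = U V^T.
u, _, vt = np.linalg.svd(src.T @ tgt)
W = u @ vt

mapped = src @ W  # source embeddings expressed in the target space
```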

Cross-Lingual Natural Language Generation via Pre-Training

CZWin32768/xnlg • 23 Sep 2019

In this work we focus on transferring supervision signals of natural language generation (NLG) tasks between multiple languages.

Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages

cambridgeltl/parameter-factorization • 30 Jan 2020

In this work, we propose a Bayesian generative model for the space of neural parameters.
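
A toy, non-Bayesian sketch of the factorization idea: parameters for a (task, language) pair are generated from separate task and language latent vectors, so an unseen combination can be composed from factors learned on other pairs. The paper instead places priors on these latents and performs Bayesian inference; everything below is illustrative.

```python
# Compose (task, language)-specific parameters from factored latents.
import torch
import torch.nn as nn

n_tasks, n_langs, z_dim, n_params = 3, 10, 16, 1000

task_z = nn.Embedding(n_tasks, z_dim)       # task latent factors
lang_z = nn.Embedding(n_langs, z_dim)       # language latent factors
generate = nn.Linear(2 * z_dim, n_params)   # latents -> flat weight vector

def params_for(task_id, lang_id):
    z = torch.cat([task_z.weight[task_id], lang_z.weight[lang_id]])
    return generate(z)

# Zero-shot: parameters for a pair never seen together during training,
# e.g. task 0 in language 7, composed from independently learned factors.
theta = params_for(0, 7)
```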

Zero-Shot Cross-Lingual Transfer with Meta Learning

copenlu/X-MAML • EMNLP 2020

We show that this challenging setup can be approached using meta-learning, where, in addition to training a source language model, another model learns to select which training instances are the most beneficial to the first.
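
A generic first-order MAML skeleton gives the flavor of the meta-learning loop: adapt on a batch from an auxiliary language, then update the initial parameters from the loss on a held-out language. This is not the paper's exact X-MAML algorithm, which additionally learns to weight training instances; the model and data below are toy stand-ins.

```python
# First-order MAML-style inner/outer loop with a toy linear model.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(10, 2)                 # stand-in for the task model
loss_fn = nn.CrossEntropyLoss()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
inner_lr = 0.1

def batch():                             # placeholder data loader
    return torch.randn(8, 10), torch.randint(0, 2, (8,))

x_aux, y_aux = batch()                   # support batch, auxiliary language
x_qry, y_qry = batch()                   # query batch, another language

# Inner step: one gradient step on the auxiliary language.
grads = torch.autograd.grad(loss_fn(model(x_aux), y_aux), model.parameters())
w, b = [p - inner_lr * g for p, g in zip(model.parameters(), grads)]

# Outer step: evaluate the adapted weights on the query batch and update
# the original (pre-adaptation) parameters.
meta_loss = loss_fn(F.linear(x_qry, w, b), y_qry)
meta_opt.zero_grad(); meta_loss.backward(); meta_opt.step()
```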