Cross-Lingual Transfer

289 papers with code • 1 benchmark • 16 datasets

Cross-lingual transfer refers to transfer learning that uses the data and models available for a resource-rich language (e.g., English) to solve tasks in another, typically lower-resource, language.
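
In the common zero-shot setting, a multilingual encoder such as mBERT or XLM-R is fine-tuned on labeled source-language data and then applied unchanged to the target language. A minimal sketch using Hugging Face Transformers (`english_train` and `swahili_test` are hypothetical dataset objects, and the hyperparameters are illustrative):

```python
# Minimal sketch of zero-shot cross-lingual transfer: fine-tune a
# multilingual encoder on English labels, then evaluate it directly on
# a target language it never saw labeled data for.
from transformers import (AutoModelForSequenceClassification,
                          AutoTokenizer, Trainer, TrainingArguments)

model_name = "xlm-roberta-base"  # any multilingual encoder works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

# english_train / swahili_test are hypothetical datasets.Dataset objects
# with "text" and "label" columns (e.g., a sentiment corpus).
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ckpt", num_train_epochs=3),
    train_dataset=english_train.map(tokenize, batched=True),
    eval_dataset=swahili_test.map(tokenize, batched=True),
)
trainer.train()            # supervision comes from English only
print(trainer.evaluate())  # zero-shot performance on the target language
```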

Most implemented papers

MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer

cambridgeltl/xcopa EMNLP 2020

The main goal behind state-of-the-art pre-trained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer.
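
MAD-X keeps the pretrained transformer frozen and trains small bottleneck adapters, one per language and one per task, which are stacked and swapped at inference time for zero-shot transfer (the released framework builds on AdapterHub). A minimal PyTorch sketch of the bottleneck module alone, with an illustrative reduction factor:

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Bottleneck adapter of the kind MAD-X inserts into each transformer
    layer: down-project, nonlinearity, up-project, residual connection."""
    def __init__(self, hidden_size: int, reduction: int = 16):
        super().__init__()
        self.down = nn.Linear(hidden_size, hidden_size // reduction)
        self.up = nn.Linear(hidden_size // reduction, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual keeps the frozen backbone's representation intact;
        # only the small adapter weights are trained per language/task.
        return hidden_states + self.up(self.act(self.down(hidden_states)))
```

Because only the adapters are trained, adding a new language costs a small fraction of the parameters of a full model copy.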

Word Alignment by Fine-tuning Embeddings on Parallel Corpora

neulab/awesome-align EACL 2021

In addition, we show that we can train multilingual word aligners that achieve robust performance across different language pairs.
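
The extraction step behind such aligners treats alignment as similarity search over contextual embeddings: build a source-target similarity matrix, normalize it in both directions, and keep the pairs both directions agree on. A rough PyTorch sketch (the probability threshold and the intersection-style heuristic are illustrative rather than the paper's exact procedure):

```python
import torch

def extract_alignments(src_emb, tgt_emb, threshold=1e-3):
    """Sketch of similarity-based word alignment in the spirit of
    awesome-align.

    src_emb: (m, d) contextual embeddings of source tokens
    tgt_emb: (n, d) contextual embeddings of target tokens
    """
    sim = src_emb @ tgt_emb.T                      # (m, n) dot-product similarity
    src2tgt = torch.softmax(sim, dim=-1)           # each source token picks targets
    tgt2src = torch.softmax(sim, dim=0)            # each target token picks sources
    joint = src2tgt * tgt2src                      # high only when both agree
    return (joint > threshold).nonzero().tolist()  # [(i, j), ...] aligned pairs
```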

Visually Grounded Reasoning across Languages and Cultures

e-bug/volta EMNLP 2021

The design of widespread vision-and-language datasets and pre-trained encoders directly adopts, or draws inspiration from, the concepts and images of ImageNet.

Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification

ccsasuke/adan TACL 2018

To tackle the sentiment classification problem in low-resource languages without adequate annotated data, we propose an Adversarial Deep Averaging Network (ADAN) to transfer the knowledge learned from labeled data on a resource-rich source language to low-resource languages where only unlabeled data exists.
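
One standard way to implement such an adversarial objective is the gradient-reversal trick from domain-adversarial training (DANN); ADAN's own training alternates updates rather than reversing gradients, but the goal is the same: features that predict sentiment well while giving the language discriminator nothing to work with. A sketch in PyTorch, with illustrative module names in the comments:

```python
import torch
from torch.autograd import Function

class GradReverse(Function):
    """Gradient reversal: identity in the forward pass, negated gradient
    in the backward pass, so the feature extractor learns to *fool* the
    language discriminator and yields language-invariant features."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)

# Sketch of an ADAN-style forward pass (names are illustrative):
# features = feature_extractor(avg_embeddings)       # Deep Averaging Network
# sentiment = classifier(features)                   # trained on source labels
# lang_pred = discriminator(grad_reverse(features))  # adversarial branch
```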

Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT

shijie-wu/crosslingual-nlp IJCNLP 2019

Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks.

Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework

thespectrewithin/joint-align ICLR 2020

Learning multilingual representations of text has proven a successful method for many cross-lingual transfer learning tasks.
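
The "alignment" family of methods in that comparison is usually instantiated as orthogonal Procrustes: given a seed dictionary that pairs source and target word vectors row by row, solve for the orthogonal map that rotates one monolingual space onto the other. A compact NumPy sketch:

```python
import numpy as np

def procrustes_align(X_src, X_tgt):
    """Orthogonal Procrustes: find the orthogonal W minimizing
    ||X_src @ W - X_tgt||_F, where row i of X_src and X_tgt holds the
    embeddings of a translation pair from a seed dictionary."""
    u, _, vt = np.linalg.svd(X_src.T @ X_tgt)
    return u @ vt

# Usage sketch: W = procrustes_align(es_vecs, en_vecs) maps Spanish
# vectors into the English space via es_vecs @ W, after which
# nearest-neighbour search gives translations or transferred features.
```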

A Common Semantic Space for Monolingual and Cross-Lingual Meta-Embeddings

ikergarcia1996/MVM-Embeddings 17 Jan 2020

This paper presents a new technique for creating monolingual and cross-lingual meta-embeddings.

Cross-lingual Emotion Intensity Prediction

jbarnesspain/fine-grained_cross-lingual_emotion COLING (PEOPLES) 2020

We explore cross-lingual transfer approaches for fine-grained emotion detection in Spanish and Catalan tweets.

Cross-Cultural Similarity Features for Cross-Lingual Transfer Learning of Pragmatically Motivated Tasks

hwijeen/langrank EACL 2021

Much work in cross-lingual transfer learning explored how to select better transfer languages for multilingual tasks, primarily focusing on typological and genealogical similarities between languages.
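
Concretely, transfer-language selection can be framed as ranking candidate source languages by how similar their features are to the target's. LangRank learns a ranking model over many such features; the toy sketch below (hypothetical feature vectors, plain cosine similarity) shows only the simplest baseline form of the idea:

```python
import numpy as np

def rank_transfer_languages(target_feats, candidates):
    """Toy ranking of candidate transfer languages by cosine similarity
    of (typological / cross-cultural) feature vectors.

    target_feats: (d,) feature vector of the target language
    candidates: dict mapping language code -> (d,) feature vector
    """
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    scores = {lang: cosine(target_feats, feats) for lang, feats in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```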