Cross-Lingual Transfer
289 papers with code • 1 benchmark • 16 datasets
Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
Libraries
Use these libraries to find Cross-Lingual Transfer models and implementations.
Most implemented papers
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
The main goal behind state-of-the-art pre-trained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer.
Word Alignment by Fine-tuning Embeddings on Parallel Corpora
In addition, we demonstrate that we are able to train multilingual word aligners that can obtain robust performance on different language pairs.
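A common building block behind embedding-based word aligners is to score source–target token pairs by cosine similarity and keep mutually best-matching pairs. The sketch below is a toy NumPy illustration of that idea, not the paper's implementation; the function names and the mutual-argmax heuristic are assumptions for illustration.

```python
import numpy as np

def cosine_sim_matrix(src, tgt):
    """Pairwise cosine similarity between source and target token embeddings."""
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    return src @ tgt.T

def mutual_argmax_align(src, tgt):
    """Keep (i, j) pairs where token i and token j are each other's best match."""
    sim = cosine_sim_matrix(src, tgt)
    fwd = sim.argmax(axis=1)  # best target index for each source token
    bwd = sim.argmax(axis=0)  # best source index for each target token
    return [(i, int(j)) for i, j in enumerate(fwd) if bwd[j] == i]
```

With contextual embeddings from a fine-tuned multilingual encoder in place of the toy vectors, this mutual-argmax filter yields high-precision alignments without any parallel supervision at alignment time.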
Visually Grounded Reasoning across Languages and Cultures
The design of widespread vision-and-language datasets and pre-trained encoders directly adopts, or draws inspiration from, the concepts and images of ImageNet.
Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification
To tackle the sentiment classification problem in low-resource languages without adequate annotated data, we propose an Adversarial Deep Averaging Network (ADAN) to transfer the knowledge learned from labeled data on a resource-rich source language to low-resource languages where only unlabeled data exists.
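The core idea of ADAN is a shared deep averaging encoder trained jointly against two heads: a task classifier and an adversarial language discriminator whose gradient is reversed, pushing the shared features to be language-invariant. The following is a minimal NumPy sketch of the forward pass and the joint objective only; the dimensions, weights, and labels are hypothetical, and the gradient-reversal training loop is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def deep_averaging_features(token_embeddings, W):
    """Shared feature extractor: average token embeddings, then a dense layer."""
    avg = token_embeddings.mean(axis=0)
    return np.tanh(W @ avg)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical toy dimensions (not from the paper).
d_emb, d_feat = 8, 4
W_shared = rng.normal(size=(d_feat, d_emb))
W_task = rng.normal(size=(2, d_feat))  # sentiment classifier head
W_lang = rng.normal(size=(2, d_feat))  # language discriminator head

tokens = rng.normal(size=(5, d_emb))   # one "sentence" of 5 token embeddings
feat = deep_averaging_features(tokens, W_shared)

p_task = softmax(W_task @ feat)
p_lang = softmax(W_lang @ feat)

# Adversarial objective: minimise the task loss while *maximising* the language
# discriminator's loss, so the shared features carry no language signal.
lam = 0.1
task_loss = -np.log(p_task[1])  # assumed gold sentiment label: 1
lang_loss = -np.log(p_lang[0])  # assumed language label: 0 (source)
joint_loss = task_loss - lam * lang_loss  # reversal realised as a sign flip
```

Because labeled data exists only in the source language, the task loss is computed on source examples while the discriminator sees both languages, which is what transfers the classifier to the unlabeled target language.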
On Difficulties of Cross-Lingual Transfer with Order Differences: A Case Study on Dependency Parsing
Different languages might have different word orders.
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT
Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks.
Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework
Learning multilingual representations of text has proven a successful method for many cross-lingual transfer learning tasks.
A Common Semantic Space for Monolingual and Cross-Lingual Meta-Embeddings
This paper presents a new technique for creating monolingual and cross-lingual meta-embeddings.
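A meta-embedding combines several pre-trained embedding spaces into one. A minimal sketch of the simplest variant, averaging L2-normalised vectors over the shared vocabulary, is shown below; it assumes the source spaces have already been projected into a common space (e.g. via a learned linear mapping, which this toy omits) and is not the specific technique of the paper.

```python
import numpy as np

def average_meta_embedding(spaces):
    """Average L2-normalised vectors from several source embedding spaces.

    `spaces` maps a space name to a {word: vector} dict. All spaces are
    assumed to live in a common semantic space already. Only words present
    in every space receive a meta-embedding.
    """
    vocab = set.intersection(*(set(s) for s in spaces.values()))
    meta = {}
    for w in vocab:
        vecs = [np.asarray(s[w], dtype=float) for s in spaces.values()]
        vecs = [v / np.linalg.norm(v) for v in vecs]  # normalise per source
        meta[w] = np.mean(vecs, axis=0)
    return meta
```

Normalising before averaging keeps one source space with large vector norms from dominating the combination, a standard precaution when mixing independently trained embeddings.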
Cross-lingual Emotion Intensity Prediction
Consequently, we explore cross-lingual transfer approaches for fine-grained emotion detection in Spanish and Catalan tweets.
Cross-Cultural Similarity Features for Cross-Lingual Transfer Learning of Pragmatically Motivated Tasks
Much work in cross-lingual transfer learning explored how to select better transfer languages for multilingual tasks, primarily focusing on typological and genealogical similarities between languages.