Zero-Shot Cross-Lingual Transfer
71 papers with code • 2 benchmarks • 4 datasets
Most implemented papers
On the Limitations of Cross-lingual Encoders as Exposed by Reference-Free Machine Translation Evaluation
We systematically investigate a range of metrics based on state-of-the-art cross-lingual semantic representations obtained with pretrained M-BERT and LASER.
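As a minimal sketch of the underlying idea (not the paper's exact metrics), a reference-free score can be computed as the cross-lingual embedding similarity between the source sentence and the MT hypothesis; the Sentence-Transformers model below is an assumed stand-in for the M-BERT and LASER representations studied in the paper.

```python
# Minimal sketch: reference-free MT "evaluation" as cross-lingual
# embedding similarity between source and hypothesis. The model name is
# an assumption (any multilingual sentence encoder works); the paper
# itself studies M-BERT and LASER representations.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

source = "Der Hund schläft auf dem Sofa."         # German source sentence
hypothesis = "The dog is sleeping on the couch."  # English MT output

src_emb, hyp_emb = model.encode([source, hypothesis], convert_to_tensor=True)
score = util.cos_sim(src_emb, hyp_emb).item()     # higher = closer in meaning
print(f"reference-free similarity score: {score:.3f}")
```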
Finding Universal Grammatical Relations in Multilingual BERT
Recent work has found evidence that Multilingual BERT (mBERT), a transformer-based multilingual masked language model, is capable of zero-shot cross-lingual transfer, suggesting that some aspects of its representations are shared cross-lingually.
Inducing Language-Agnostic Multilingual Representations
Cross-lingual representations have the potential to make NLP techniques available to the vast majority of languages in the world.
FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding
During inference, the model makes predictions based on the text input in the target language and its translation in the source language.
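FILTER's actual contribution is an enhanced fusion architecture; the sketch below only illustrates the paired-input inference setup described above, feeding the target-language text together with its source-language translation to a generic multilingual classifier (xlm-roberta-base and the 3-way label head are assumptions, not the paper's configuration).

```python
# Rough sketch of paired-input inference only (not FILTER's fusion
# architecture): classify the target-language text jointly with its
# source-language translation.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"  # assumption: any multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

target_text = "Ce film était absolument magnifique."         # target language
source_translation = "This movie was absolutely wonderful."  # source translation

# Encode both views as a single paired input.
inputs = tokenizer(target_text, source_translation, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(-1))  # untrained classification head: illustrative only
```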
The Multilingual Amazon Reviews Corpus
We present the Multilingual Amazon Reviews Corpus (MARC), a large-scale collection of Amazon reviews for multilingual text classification.
XL-WiC: A Multilingual Benchmark for Evaluating Semantic Contextualization
The ability to correctly model distinct meanings of a word is crucial for the effectiveness of semantic representation techniques.
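A hedged illustration of the WiC task format (a naive baseline, not the benchmark's reference system): predict "same sense" when the contextual embeddings of the target word in the two sentences are sufficiently similar. The model choice and the 0.7 threshold here are assumptions.

```python
# Naive WiC-style baseline sketch: compare contextual embeddings of the
# target word in two sentences; predict "same sense" above a threshold.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Mean of the contextual vectors of the word's subword tokens."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(word_ids) + 1):
        if ids[i:i + len(word_ids)] == word_ids:
            return hidden[i:i + len(word_ids)].mean(0)
    raise ValueError("target word not found in sentence")

a = word_embedding("She sat on the bank of the river.", "bank")
b = word_embedding("He deposited cash at the bank.", "bank")
sim = torch.cosine_similarity(a, b, dim=0).item()
print("same sense" if sim > 0.7 else "different sense", sim)  # 0.7: hypothetical
```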
Model Selection for Cross-Lingual Transfer
Transformers pre-trained on multilingual corpora, such as mBERT and XLM-RoBERTa, have achieved impressive cross-lingual transfer capabilities.
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
Such transfer emerges by fine-tuning on a task of interest in one language and evaluating on a distinct language not seen during fine-tuning.
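A minimal sketch of that recipe, assuming a HuggingFace multilingual encoder and toy inline data: fine-tune on labelled source-language examples, then predict directly on a target language with no labelled data.

```python
# Sketch of the zero-shot transfer recipe: fine-tune a multilingual
# encoder on one language, evaluate on another with no target labels.
# The tiny inline examples are hypothetical placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2)

# 1) Fine-tune on the source language (one toy step for illustration).
english_texts = ["great product", "terrible service"]
labels = torch.tensor([1, 0])
batch = tokenizer(english_texts, padding=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()

# 2) Evaluate zero-shot on a language never seen during fine-tuning.
model.eval()
spanish_text = "un servicio terrible"
with torch.no_grad():
    pred = model(**tokenizer(spanish_text, return_tensors="pt")).logits.argmax(-1)
print(pred)  # a real run would fine-tune on a full dataset first
```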
Beyond the English Web: Zero-Shot Cross-Lingual and Lightweight Monolingual Classification of Registers
We explore cross-lingual transfer of register classification for web documents.
Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation
Linear embedding transformation has been shown to be effective for zero-shot cross-lingual transfer tasks, achieving surprisingly promising results.
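A common instance of such a transformation is the orthogonal Procrustes mapping between two embedding spaces; the sketch below uses random matrices as hypothetical stand-ins for aligned source/target word embeddings.

```python
# Sketch of a linear embedding transformation for cross-lingual transfer:
# learn an orthogonal map W from source- to target-language embedding
# space via the closed-form Procrustes solution. X and Y are hypothetical
# aligned embedding pairs (e.g., from a bilingual dictionary).
import numpy as np

rng = np.random.default_rng(0)
d, n = 300, 5000             # embedding dim, number of aligned word pairs
X = rng.normal(size=(n, d))  # source-language embeddings
Y = rng.normal(size=(n, d))  # target-language embeddings (row-aligned)

# Orthogonal Procrustes: W = argmin ||XW - Y||_F subject to W^T W = I,
# solved in closed form from the SVD of X^T Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# Map a source embedding into the target space for zero-shot use.
mapped = X[0] @ W
```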