Zero-Shot Cross-Lingual Transfer

71 papers with code • 2 benchmarks • 4 datasets

Zero-shot cross-lingual transfer fine-tunes a multilingual model on a task in one language (typically a high-resource one such as English) and evaluates it on the same task in other languages, with no labeled training data in those target languages.

Most implemented papers

On the Limitations of Cross-lingual Encoders as Exposed by Reference-Free Machine Translation Evaluation

AIPHES/ACL20-Reference-Free-MT-Evaluation ACL 2020

We systematically investigate a range of metrics based on state-of-the-art cross-lingual semantic representations obtained with pretrained M-BERT and LASER.
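
A minimal sketch of the underlying idea, not the paper's specific metrics or vector-space remapping: embed the source sentence and the system translation with mBERT, then use their cosine similarity as a reference-free adequacy proxy. The checkpoint name and example sentences are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact metrics): score a candidate translation
# against its source sentence by cosine similarity of mean-pooled mBERT embeddings.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

def embed(sentences):
    # Mean-pool the last hidden states over non-padding tokens.
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

src = ["Der Hund schläft auf dem Sofa."]        # source sentence (German)
hyp = ["The dog is sleeping on the couch."]     # system translation (English)
score = torch.nn.functional.cosine_similarity(embed(src), embed(hyp))
print(f"reference-free adequacy proxy: {score.item():.3f}")
```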

Finding Universal Grammatical Relations in Multilingual BERT

ethanachi/multilingual-probing-visualization ACL 2020

Recent work has found evidence that Multilingual BERT (mBERT), a transformer-based multilingual masked language model, is capable of zero-shot cross-lingual transfer, suggesting that some aspects of its representations are shared cross-lingually.

Inducing Language-Agnostic Multilingual Representations

AIPHES/Language-Agnostic-Contextualized-Encoders Joint Conference on Lexical and Computational Semantics 2021

Cross-lingual representations have the potential to make NLP techniques available to the vast majority of languages in the world.

FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding

yuwfan/FILTER 10 Sep 2020

During inference, the model makes predictions based on the text input in the target language and its translation in the source language.
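
A rough sketch of the paired-input idea only, not FILTER's fusion architecture: the target-language sentence and its source-language translation are passed to one multilingual classifier as a text pair. The base encoder, label count, and example sentences below are assumptions for illustration; in practice the classification head would come from fine-tuning.

```python
# Sketch of paired-input inference: classify a target-language sentence together
# with its English translation. (FILTER's actual fusion method is more involved.)
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "xlm-roberta-base"   # assumed base encoder; head below is untrained
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
model.eval()

target_text = "El servicio fue excelente y la comida llegó caliente."  # target language
source_text = "The service was excellent and the food arrived hot."    # English translation

inputs = tokenizer(target_text, source_text, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted label:", logits.argmax(dim=-1).item())
```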

The Multilingual Amazon Reviews Corpus

mojave-pku/uniprompt EMNLP 2020

We present the Multilingual Amazon Reviews Corpus (MARC), a large-scale collection of Amazon reviews for multilingual text classification.

XL-WiC: A Multilingual Benchmark for Evaluating Semantic Contextualization

pasinit/xlwic-runs EMNLP 2020

The ability to correctly model distinct meanings of a word is crucial for the effectiveness of semantic representation techniques.

Model Selection for Cross-Lingual Transfer

edchengg/model_selection EMNLP 2021

Transformers that are pre-trained on multilingual corpora, such as mBERT and XLM-RoBERTa, have achieved impressive cross-lingual transfer capabilities.

First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT

benjamin-mlr/first-align-then-predict EACL 2021

Such transfer emerges from fine-tuning on a task of interest in one language and evaluating on a distinct language not seen during fine-tuning.
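
The recipe itself is easy to sketch. Below is a generic illustration of zero-shot cross-lingual transfer (not this paper's probing setup): fine-tune a multilingual encoder on English data only, then evaluate the same checkpoint on another language. The XNLI dataset configs ("en", "sw") and the mBERT checkpoint are illustrative assumptions.

```python
# Generic zero-shot cross-lingual transfer recipe: train on English, test on Swahili.
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

def tokenize(batch):
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True)

def accuracy(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": float((np.argmax(logits, axis=-1) == labels).mean())}

train_en = load_dataset("xnli", "en", split="train[:2000]").map(tokenize, batched=True)
test_sw = load_dataset("xnli", "sw", split="test").map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="zero-shot-demo", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=train_en,
    tokenizer=tokenizer,          # enables dynamic padding via the default collator
    compute_metrics=accuracy,
)
trainer.train()                   # fine-tune on English only
print(trainer.evaluate(test_sw))  # zero-shot evaluation on a language unseen in fine-tuning
```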

Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation

fe1ixxu/ZeroShot-CrossLing-Parsing EACL (AdaptNLP) 2021

Linear embedding transformation has been shown to be effective for zero-shot cross-lingual transfer tasks, achieving surprisingly promising results.
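
As a toy illustration of a linear embedding transformation (not this paper's contextual-embedding method), one can fit an orthogonal map between two embedding spaces on a small seed lexicon via Procrustes and then project unseen target-language vectors into the source space. All data below is synthetic.

```python
# Toy Procrustes alignment between two embedding spaces on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
d, n_pairs = 300, 5000

# Hypothetical aligned vector pairs (target-language, source-language).
tgt = rng.normal(size=(n_pairs, d))
true_rotation, _ = np.linalg.qr(rng.normal(size=(d, d)))
src = tgt @ true_rotation + 0.01 * rng.normal(size=(n_pairs, d))

# Orthogonal Procrustes: W = argmin ||tgt @ W - src||_F  subject to  W^T W = I.
u, _, vt = np.linalg.svd(tgt.T @ src)
W = u @ vt

new_tgt_vec = rng.normal(size=(1, d))
projected = new_tgt_vec @ W   # now lives in the source-language embedding space
print("recovered rotation error:", np.linalg.norm(W - true_rotation))
```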