Zero-Shot Cross-Lingual Transfer
72 papers with code • 2 benchmarks • 4 datasets
Most implemented papers
Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models
Specifically, we focus on multilingual text-to-video search and propose a Transformer-based model that learns contextualized multilingual multimodal embeddings.
Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training
Pre-trained multilingual language encoders, such as multilingual BERT and XLM-R, show great potential for zero-shot cross-lingual transfer.
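The zero-shot setup these papers share can be illustrated with a toy sketch (not any paper's actual pipeline): a multilingual encoder such as XLM-R ideally maps translations to nearby points in one shared space, so a classifier fit only on source-language vectors also works on target-language vectors. The simulated "encoder", language offsets, and class centroids below are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

dim, n = 16, 200
class_means = rng.normal(size=(2, dim))  # one centroid per label in the shared space

def encode(labels, lang_shift):
    """Simulated encoder output: class centroid + noise + a small language offset."""
    return class_means[labels] + 0.3 * rng.normal(size=(len(labels), dim)) + lang_shift

labels = rng.integers(0, 2, size=n)
x_src = encode(labels, lang_shift=0.0)                          # source language
x_tgt = encode(labels, lang_shift=0.05 * rng.normal(size=dim))  # unseen target language

# Nearest-centroid classifier fit ONLY on source-language examples.
centroids = np.stack([x_src[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    dists = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=-1)
    return dists.argmin(axis=1)

acc_src = (predict(x_src) == labels).mean()
acc_tgt = (predict(x_tgt) == labels).mean()  # zero-shot: no target-language training
print(acc_src, acc_tgt)
```

If the encoder's space really is language-neutral, target-language accuracy stays close to source-language accuracy despite zero target-language supervision; the robustness and alignment papers above aim to shrink exactly that gap.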
Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders
In this paper, we focus on a zero-shot cross-lingual transfer task in NMT.
X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering
We extensively evaluate our framework on two challenging cross-lingual NLU tasks: multilingual task-oriented dialog and typologically diverse question answering.
Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer
Multilingual pre-trained models have achieved remarkable performance on cross-lingual transfer learning.
Compositional Generalization in Multilingual Semantic Parsing over Wikidata
We introduce such a dataset, which we call Multilingual Compositional Wikidata Questions (MCWQ), and use it to analyze the compositional generalization of semantic parsers in Hebrew, Kannada, Chinese and English.
MultiEURLEX -- A multi-lingual and multi-label legal document classification dataset for zero-shot cross-lingual transfer
We use the dataset as a testbed for zero-shot cross-lingual transfer, where we exploit annotated training documents in one language (source) to classify documents in another language (target).
Similarity of Sentence Representations in Multilingual LMs: Resolving Conflicting Literature and Case Study of Baltic Languages
However, we observe that Baltic languages do belong to that shared space.
xGQA: Cross-Lingual Visual Question Answering
In this work, we address this gap and provide xGQA, a new multilingual evaluation benchmark for the visual question answering task.
Zero-Shot Cross-Lingual Transfer in Legal Domain Using Transformer Models
Also, gradual unfreezing of the pre-trained model's layers during training yields relative improvements of 38-45% for French and 58-70% for German.
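Gradual unfreezing can be sketched as a simple schedule (the layer names and epoch count here are hypothetical, not taken from the paper): start with only the classification head trainable, then unfreeze one pre-trained layer per epoch, top layer first, so the lowest layers keep their pre-trained representations longest.

```python
# Hypothetical 12-layer encoder plus a task-specific classification head.
layers = [f"encoder.layer.{i}" for i in range(12)] + ["classifier"]
trainable = {name: (name == "classifier") for name in layers}

def unfreeze_next(trainable, order):
    """Unfreeze the first still-frozen layer in `order`; return its name (or None)."""
    for name in order:
        if not trainable[name]:
            trainable[name] = True
            return name
    return None

# Unfreeze from the top of the encoder downwards, one layer per epoch.
order = [f"encoder.layer.{i}" for i in range(11, -1, -1)]
for epoch in range(3):
    unfreeze_next(trainable, order)
    # ... run one training epoch here, updating only the trainable layers ...

print(sum(trainable.values()))  # head + the three topmost encoder layers
```

In a real fine-tuning loop the `trainable` flags would map onto the framework's parameter-freezing mechanism (e.g. toggling gradient updates per layer); the schedule itself is the only part the technique prescribes.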