Zero-Shot Machine Translation
2 papers with code • 0 benchmarks • 1 dataset
Translate text or speech from one language to another without any parallel training data for that language pair.
Benchmarks
These leaderboards are used to track progress in Zero-Shot Machine Translation
Latest papers with no code
EBBS: An Ensemble with Bi-Level Beam Search for Zero-Shot Machine Translation
Zero-shot translation ability emerges when a multilingual model is trained on a certain set of translation directions; the model can then directly translate in directions unseen during training.
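A minimal sketch of the standard mechanism behind this (assuming Johnson et al.-style target-language tokens; the tag format `<2xx>` is illustrative): training pairs are tagged with the desired target language, so at inference an unseen direction is requested simply by combining a familiar source language with a familiar target tag.

```python
# Sketch: target-language tagging for multilingual NMT training data.
# The "<2xx>" tag format is an illustrative assumption, not a fixed standard.

def tag_pair(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token to the source sentence."""
    return f"<2{tgt_lang}> {src_sentence}"

# Supervised directions seen in training, e.g. en->de and en->fr:
train = [
    (tag_pair("hello", "de"), "hallo"),
    (tag_pair("hello", "fr"), "bonjour"),
]

# Zero-shot inference: request an unseen direction (de->fr) by tagging a
# German source sentence with the French target token.
zero_shot_input = tag_pair("hallo", "fr")
print(zero_shot_input)  # <2fr> hallo
```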
CharSpan: Utilizing Lexical Similarity to Enable Zero-Shot Machine Translation for Extremely Low-resource Languages
We address the task of machine translation (MT) from an extremely low-resource language (ELRL) to English by leveraging cross-lingual transfer from a 'closely-related' high-resource language (HRL).
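One common way to exploit lexical similarity between related languages is character-span noising of the high-resource data, so the model becomes robust to the spelling variation of the low-resource relative. The sketch below is a hedged illustration of that general idea; the span selection and masking scheme are assumptions, not the paper's exact recipe.

```python
# Illustrative sketch: mask a random character span in HRL text to simulate
# spelling divergence in a closely related low-resource language.
import random

def noise_char_span(text: str, max_span: int = 3, seed: int = 0) -> str:
    """Replace one random character span with placeholder characters."""
    rng = random.Random(seed)
    if len(text) < 2:
        return text
    start = rng.randrange(len(text) - 1)
    length = min(max_span, len(text) - start)
    return text[:start] + "_" * length + text[start + length:]

print(noise_char_span("translation"))
```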
Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning
To the best of our knowledge, this is the first work successfully using MMPLMs for clinical-domain transfer-learning NMT on languages totally unseen during pre-training.
MALM: Mixing Augmented Language Modeling for Zero-Shot Machine Translation
We empirically demonstrate the effectiveness of self-supervised pre-training and data augmentation for zero-shot multi-lingual machine translation.
Cross-lingual Word Embeddings beyond Zero-shot Machine Translation
We explore the transferability of a multilingual neural machine translation model to unseen languages when the transfer is grounded solely on the cross-lingual word embeddings.
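Grounding transfer on cross-lingual word embeddings means words from different languages live in one shared vector space, so an unseen language's words can be matched to known vocabulary by nearest-neighbour search. A toy sketch of that lookup (vectors here are invented for illustration, not real embeddings):

```python
# Toy nearest-neighbour lookup in a shared cross-lingual embedding space.
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical shared space: English anchor words as toy 3-d vectors.
shared_space = {
    "cat": [0.9, 0.1, 0.0],
    "dog": [0.1, 0.9, 0.0],
}

# A word from an unseen language, embedded in the same space (e.g. "gato").
unseen_word_vec = [0.85, 0.15, 0.05]

nearest = max(shared_space, key=lambda w: cosine(unseen_word_vec, shared_space[w]))
print(nearest)  # cat
```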