Zero-Shot Machine Translation

2 papers with code • 0 benchmarks • 1 dataset

Translate text or speech from one language to another without any parallel training data for that language pair, i.e., in a translation direction unseen during training.

Latest papers with no code

EBBS: An Ensemble with Bi-Level Beam Search for Zero-Shot Machine Translation

no code yet • 29 Feb 2024

The ability of zero-shot translation emerges when we train a multilingual model on a subset of translation directions; the model can then translate directly in unseen directions.
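
As a minimal sketch of that mechanism (assuming the Hugging Face transformers library; the checkpoint and the German-to-Tamil direction are placeholders, and this is not the EBBS ensemble or its bi-level beam search), a multilingual NMT model can be steered toward an arbitrary direction by setting the source-language tag and forcing the target-language token at decoding time:

```python
# Sketch: tag-based multilingual translation. Forcing the target-language
# token at the start of decoding is the mechanism that lets a multilingual
# model attempt directions it was not explicitly trained on.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "de"                    # source-language tag (placeholder)
inputs = tokenizer("Guten Morgen!", return_tensors="pt")
outputs = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.get_lang_id("ta"),  # target-language token
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```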

CharSpan: Utilizing Lexical Similarity to Enable Zero-Shot Machine Translation for Extremely Low-resource Languages

no code yet • 9 May 2023

We address the task of machine translation (MT) from an extremely low-resource language (ELRL) to English by leveraging cross-lingual transfer from a 'closely-related' high-resource language (HRL).
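
For intuition, the lexical similarity between such related languages can be approximated by simple character n-gram overlap; the sketch below is illustrative only (it is not the CharSpan procedure, and the romanized word pair is hypothetical):

```python
# Illustrative sketch: character-trigram Jaccard overlap as a crude proxy
# for lexical similarity between an HRL word and an ELRL word.
def char_ngrams(word: str, n: int = 3) -> set:
    padded = f"<{word}>"                     # mark word boundaries
    return {padded[i:i + n] for i in range(len(padded) - n + 1)}

def lexical_similarity(w1: str, w2: str, n: int = 3) -> float:
    a, b = char_ngrams(w1, n), char_ngrams(w2, n)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical romanized cognates from a related HRL/ELRL pair.
print(lexical_similarity("pustak", "pustika"))  # 0.3: shared-trigram overlap
```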

Investigating Massive Multilingual Pre-Trained Machine Translation Models for Clinical Domain via Transfer Learning

no code yet • 12 Oct 2022

To the best of our knowledge, this is the first work to successfully apply MMPLMs to clinical-domain transfer-learning NMT for languages entirely unseen during pre-training.

MALM: Mixing Augmented Language Modeling for Zero-Shot Machine Translation

no code yet • 1 Oct 2022

We empirically demonstrate the effectiveness of self-supervised pre-training and data augmentation for zero-shot multilingual machine translation.
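
As a toy illustration of the kind of self-supervised signal involved (not MALM's actual objective or augmentation recipe), monolingual text can be turned into a denoising task by masking tokens for the model to reconstruct:

```python
import random

# Illustrative sketch: token masking over monolingual text, a denoising-style
# self-supervised objective; MALM's actual recipe may differ.
def mask_tokens(sentence: str, mask_token: str = "<mask>",
                p: float = 0.15, seed: int = 1) -> str:
    rng = random.Random(seed)
    tokens = sentence.split()
    return " ".join(mask_token if rng.random() < p else tok for tok in tokens)

print(mask_tokens("der Hund schläft auf dem Sofa"))
```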

Cross-lingual Word Embeddings beyond Zero-shot Machine Translation

no code yet • 3 Nov 2020

We explore the transferability of a multilingual neural machine translation model to unseen languages when the transfer is grounded solely on the cross-lingual word embeddings.
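
One common way to ground such transfer is to map the unseen language's word embeddings into the shared space with an orthogonal (Procrustes) transformation learned from a small seed dictionary; the sketch below illustrates that alignment step under those assumptions and is not the paper's exact pipeline:

```python
import numpy as np

def procrustes_align(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Return the orthogonal W minimizing ||X @ W - Y||_F, where rows of X and
    Y are embeddings of seed translation pairs (new language -> shared space)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy usage: recover a hidden rotation from 50 seed pairs in 16 dimensions.
rng = np.random.default_rng(0)
Y = rng.normal(size=(50, 16))                        # shared-space embeddings
R_true = np.linalg.qr(rng.normal(size=(16, 16)))[0]  # hidden orthogonal map
X = Y @ R_true.T                                     # "unseen language" embeddings
W = procrustes_align(X, Y)
print(np.allclose(X @ W, Y))                         # True: mapping recovered
```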