Unsupervised Machine Translation

32 papers with code • 9 benchmarks • 4 datasets

Unsupervised machine translation is the task of performing machine translation without any translation resources (e.g. parallel corpora or bilingual dictionaries) at training time.

(Image credit: Phrase-Based & Neural Unsupervised Machine Translation)
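
Below is a minimal sketch of the training recipe behind most of the papers listed on this page: denoising autoencoding plus iterative back-translation over two monolingual corpora, as popularized by the Phrase-Based & Neural Unsupervised Machine Translation work credited above. Everything in the sketch is a toy placeholder for illustration; the corpora and the `noise`, `translate`, and `train_step` helpers stand in for a real encoder-decoder model and its optimizer.

```python
# Toy sketch of unsupervised MT training: denoising autoencoding
# plus iterative back-translation on two monolingual corpora.
# All components below are placeholders, not a real system.

import random

src_corpus = ["the cat sat", "a dog ran"]                # monolingual source sentences
tgt_corpus = ["le chat est assis", "un chien a couru"]   # monolingual target sentences

def noise(sentence):
    """Corrupt a sentence (word dropout / shuffling) for denoising autoencoding."""
    words = sentence.split()
    random.shuffle(words)
    return " ".join(w for w in words if random.random() > 0.1)

def translate(model, sentence, direction):
    """Placeholder translation with the current (imperfect) model."""
    return model.get((direction, sentence), sentence)  # identity fallback

def train_step(model, source, target, direction):
    """Placeholder parameter update: memorise the (source -> target) pair."""
    model[(direction, source)] = target

model = {}  # stands in for shared encoder-decoder parameters

for epoch in range(3):
    # 1) Denoising autoencoding: reconstruct each sentence from a noisy version.
    for s in src_corpus:
        train_step(model, noise(s), s, "src->src")
    for t in tgt_corpus:
        train_step(model, noise(t), t, "tgt->tgt")

    # 2) Iterative back-translation: translate monolingual data with the current
    #    model, then train on the synthetic (translation -> original) pairs.
    for s in src_corpus:
        synthetic_tgt = translate(model, s, "src->tgt")
        train_step(model, synthetic_tgt, s, "tgt->src")
    for t in tgt_corpus:
        synthetic_src = translate(model, t, "tgt->src")
        train_step(model, synthetic_src, t, "src->tgt")
```

The key property is that no parallel sentence pairs appear anywhere: supervision comes only from reconstructing noisy sentences and from the model's own synthetic translations.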

Latest papers with no code

Integrating Unsupervised Data Generation into Self-Supervised Neural Machine Translation for Low-Resource Languages

no code yet • MTSummit 2021

For most language combinations, parallel data is either scarce or simply unavailable.

On Systematic Style Differences between Unsupervised and Supervised MT and an Application for High-Resource Machine Translation

no code yet • NAACL 2022

Modern unsupervised machine translation (MT) systems reach reasonable translation quality under clean and controlled data conditions.

Crosslingual Embeddings are Essential in UNMT for Distant Languages: An English to IndoAryan Case Study

no code yet • MTSummit 2021

In this paper, we show that initializing the embedding layer of UNMT models with cross-lingual embeddings yields significant improvements in BLEU score over existing approaches in which embeddings are randomly initialized.
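
As a rough illustration of the general idea (not the paper's actual setup), here is a minimal PyTorch-style sketch of initializing a shared embedding layer from pretrained cross-lingual word vectors instead of leaving it randomly initialized; the vocabulary, dimensions, and vector source are assumptions for illustration only.

```python
# Sketch: initialise a (shared) UNMT embedding layer from pretrained
# cross-lingual word vectors instead of random initialisation.
# Vocabulary, dimensions and vectors are illustrative assumptions only.

import torch
import torch.nn as nn

emb_dim = 300
vocab = {"<unk>": 0, "hello": 1, "world": 2}   # toy joint source+target vocabulary
pretrained = {                                  # toy cross-lingual vectors
    "hello": torch.randn(emb_dim),
    "world": torch.randn(emb_dim),
}

embedding = nn.Embedding(len(vocab), emb_dim)

with torch.no_grad():
    for word, idx in vocab.items():
        if word in pretrained:
            # Copy the pretrained cross-lingual vector for covered words;
            # words without a vector keep their random initialisation.
            embedding.weight[idx] = pretrained[word]

# The embedding layer is then plugged into the UNMT encoder/decoder in place
# of random embeddings; optionally freeze it during early training:
# embedding.weight.requires_grad_(False)
```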

Unsupervised Multilingual Sentence Embeddings for Parallel Corpus Mining

no code yet • ACL 2020

Existing models of multilingual sentence embeddings require large parallel data resources which are not available for low-resource languages.

Backretrieval: An Image-Pivoted Evaluation Metric for Cross-Lingual Text Representations Without Parallel Corpora

no code yet • 11 May 2021

Cross-lingual text representations have recently gained popularity and serve as the backbone of many tasks, such as unsupervised machine translation and cross-lingual information retrieval.

Unsupervised Machine Translation On Dravidian Languages

no code yet • EACL (DravidianLangTech) 2021

We show that unifying the writing systems is essential in unsupervised translation between the Dravidian languages.

From Unsupervised Machine Translation To Adversarial Text Generation

no code yet • 10 Nov 2020

B-GAN is able to generate a distributed latent space representation which can be paired with an attention-based decoder to generate fluent sentences.

Unsupervised Neural Machine Translation for Low-Resource Domains via Meta-Learning

no code yet • ACL 2021

To address this issue, this paper presents a novel meta-learning algorithm for unsupervised neural machine translation (UNMT) that trains the model to adapt to another domain by utilizing only a small amount of training data.
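
As a structural sketch of the meta-learning idea (shown here with a Reptile-style first-order update on a toy scalar objective, not the paper's actual algorithm or losses): adapt a copy of the parameters to each known domain with a few gradient steps, then move the meta-parameters toward the adapted solutions, so that later adaptation to a new domain needs only a small amount of data.

```python
# Structural sketch of meta-learning for domain adaptation (Reptile-style
# first-order update on a toy scalar "model"); the domains, losses and
# step sizes are illustrative assumptions only.

# Each "domain" is summarised here by the parameter value that minimises its loss.
domain_optima = {"medical": 2.0, "law": 3.0, "it": 2.5}

def loss_grad(theta, optimum):
    """Gradient of the toy quadratic loss 0.5 * (theta - optimum)^2."""
    return theta - optimum

theta = 0.0                 # meta-parameters (a single scalar for illustration)
inner_lr, meta_lr = 0.1, 0.5

for meta_step in range(100):
    for optimum in domain_optima.values():
        # Inner loop: adapt a copy of the parameters to one domain with a few
        # gradient steps (standing in for "a small amount of in-domain data").
        adapted = theta
        for _ in range(5):
            adapted -= inner_lr * loss_grad(adapted, optimum)
        # Outer loop: move the meta-parameters toward the adapted solution.
        theta += meta_lr * (adapted - theta) / len(domain_optima)

print(round(theta, 2))  # ends up near the average optimum, a good starting point for adaptation
```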

SJTU-NICT's Supervised and Unsupervised Neural Machine Translation Systems for the WMT20 News Translation Task

no code yet • 11 Oct 2020

In this paper, we introduce our joint team SJTU-NICT's participation in the WMT 2020 machine translation shared task.

Harnessing Multilinguality in Unsupervised Machine Translation for Rare Languages

no code yet • NAACL 2021

We outperform all current state-of-the-art unsupervised baselines for these languages, achieving gains of up to 14.4 BLEU.