Unsupervised Machine Translation

32 papers with code • 9 benchmarks • 4 datasets

Unsupervised machine translation is the task of performing machine translation without any translation resources (e.g., parallel corpora or bilingual dictionaries) at training time, relying on monolingual data alone.

(Image credit: Phrase-Based & Neural Unsupervised Machine Translation)


Latest papers with no code

Cross-Lingual Unsupervised Sentiment Classification with Multi-View Transfer Learning

no code yet • ACL 2020

Recent neural network models have achieved impressive performance on sentiment classification in English as well as other languages.

Data Augmentation with Unsupervised Machine Translation Improves the Structural Similarity of Cross-lingual Word Embeddings

no code yet • ACL 2021

Unsupervised cross-lingual word embedding (CLWE) methods learn a linear transformation matrix that maps two monolingual embedding spaces that are separately trained with monolingual corpora.
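A standard way to fit such a linear map once some word correspondences are available (the orthogonal Procrustes solution, a common baseline rather than this paper's specific method) is via an SVD. A minimal sketch with toy embeddings, where the target space is an exact rotation of the source space:

```python
import numpy as np

def procrustes_map(X, Y):
    """Orthogonal Procrustes: the W minimizing ||XW - Y||_F over
    orthogonal matrices, obtained from the SVD of Y^T X."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    return (U @ Vt).T

# toy monolingual embeddings: 5 "word pairs" in 4 dimensions,
# with the target space constructed as a rotation of the source space
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))                    # source-language vectors
W_true, _ = np.linalg.qr(rng.normal(size=(4, 4)))  # a random orthogonal map
Y = X @ W_true                                 # target-language vectors

W = procrustes_map(X, Y)
print(np.allclose(X @ W, Y))                   # the rotation is recovered
```

Because the toy target space is an exact orthogonal image of the source space, the SVD solution recovers the true map; real cross-lingual spaces are only approximately isometric, which is what makes the unsupervised setting hard.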

Unsupervised Multimodal Neural Machine Translation with Pseudo Visual Pivoting

no code yet • ACL 2020

In this paper, we investigate how to utilize visual content for disambiguation and promoting latent space alignment in unsupervised MMT.

Data-dependent Gaussian Prior Objective for Language Generation

no code yet • ICLR 2020

However, MLE focuses on once-to-all matching between the predicted sequence and the gold standard, consequently treating all incorrect predictions as being equally incorrect.
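The "equally incorrect" point can be seen directly from the loss: token-level MLE (cross-entropy) depends only on the probability assigned to the gold token, so two predictions that spread their remaining mass over very different wrong tokens incur identical loss. A small illustration (the vocabulary and probabilities are invented for the example):

```python
import math

def mle_loss(probs, gold):
    """Token-level MLE / cross-entropy loss: -log p(gold).
    Note it never looks at probabilities of non-gold tokens."""
    return -math.log(probs[gold])

# two predictions agreeing on the gold token "cat", but disagreeing
# on how the wrong-token mass is split between "dog" and "car"
p1 = {"cat": 0.5, "dog": 0.4, "car": 0.1}
p2 = {"cat": 0.5, "dog": 0.1, "car": 0.4}

print(mle_loss(p1, "cat") == mle_loss(p2, "cat"))  # True: equally penalized
```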

A Call for More Rigor in Unsupervised Cross-lingual Learning

no code yet • ACL 2020

We review motivations, definition, approaches, and methodology for unsupervised cross-lingual learning and call for a more rigorous position in each of them.

Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders

no code yet • 30 Apr 2020

When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process.
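Denoising autoencoders for text are trained to reconstruct a sentence from a corrupted version of itself. Two corruptions commonly used in this line of work (a generic sketch, not necessarily this paper's exact noise model) are random word dropout and local shuffling within a small window:

```python
import random

def add_noise(tokens, drop_prob=0.1, shuffle_k=3, seed=None):
    """Corrupt a token sequence: drop each token with probability
    drop_prob, then locally shuffle so no token moves more than
    about shuffle_k positions from where it started."""
    rng = random.Random(seed)
    kept = [t for t in tokens if rng.random() > drop_prob]
    # sort by (index + small random offset): a bounded local shuffle
    keyed = [(i + rng.uniform(0, shuffle_k), t) for i, t in enumerate(kept)]
    return [t for _, t in sorted(keyed)]

sent = "the cat sat on the mat".split()
print(add_noise(sent, seed=0))
```

The autoencoder is then trained to map `add_noise(sent)` back to `sent`, which forces it to learn something about word order and fluency rather than copying its input.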

When Does Unsupervised Machine Translation Work?

no code yet • WMT (EMNLP) 2020

We additionally find that unsupervised MT performance declines when source and target languages use different scripts, and observe very poor performance on authentic low-resource language pairs.

Do all Roads Lead to Rome? Understanding the Role of Initialization in Iterative Back-Translation

no code yet • 28 Feb 2020

In this paper, we analyze the role that such initialization plays in iterative back-translation.
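Iterative back-translation alternates between the two translation directions: each model translates monolingual data to produce synthetic parallel pairs, which are then used to retrain the model for the reverse direction. A schematic sketch with toy word-for-word "models" (the `toy_translate` / `toy_train` helpers are invented placeholders, not a real MT system):

```python
def iterative_back_translation(mono_src, mono_tgt, translate, train, rounds=2):
    """Schematic loop: each direction is retrained on synthetic pairs
    produced by back-translating the other side's monolingual data."""
    fwd = bwd = None  # stand-ins for the initial translation models
    for _ in range(rounds):
        synth_src = translate(bwd, mono_tgt)          # back-translate tgt -> src
        fwd = train(list(zip(synth_src, mono_tgt)))   # retrain src -> tgt
        synth_tgt = translate(fwd, mono_src)          # back-translate src -> tgt
        bwd = train(list(zip(synth_tgt, mono_src)))   # retrain tgt -> src
    return fwd, bwd

# toy "models": word-for-word dictionaries; unknown words are copied through
def toy_translate(model, sents):
    if model is None:                  # naive initialization: copy the input
        return list(sents)
    return [" ".join(model.get(w, w) for w in s.split()) for s in sents]

def toy_train(pairs):
    return {a: b for src, tgt in pairs
            for a, b in zip(src.split(), tgt.split())}

fwd, bwd = iterative_back_translation(["hund katze"], ["dog cat"],
                                      toy_translate, toy_train)
print(fwd)
```

With a trivial copy-through initialization, the toy loop degenerates to identity mappings and never learns a real translation, which illustrates why the quality of the initial model matters so much for iterative back-translation.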

A Multilingual View of Unsupervised Machine Translation

no code yet • Findings of EMNLP 2020

We present a probabilistic framework for multilingual neural machine translation that encompasses supervised and unsupervised setups, focusing on unsupervised translation.

Comparing Unsupervised Word Translation Methods Step by Step

no code yet • NeurIPS 2019

Cross-lingual word vector space alignment is the task of mapping the vocabularies of two languages into a shared semantic space, which can be used for dictionary induction, unsupervised machine translation, and transfer learning.
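Once the two vocabularies live in a shared space, dictionary induction reduces to a retrieval problem: for each source word, pick the most similar target word. A minimal sketch using plain cosine nearest neighbours (one standard criterion; refinements such as CSLS exist, and the toy vectors below are invented for illustration):

```python
import numpy as np

def induce_dictionary(src_emb, tgt_emb, src_words, tgt_words):
    """Nearest-neighbour dictionary induction in a shared space:
    normalize rows, take cosine similarities, pick the argmax."""
    Xs = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    Xt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sims = Xs @ Xt.T
    return {src_words[i]: tgt_words[j]
            for i, j in enumerate(sims.argmax(axis=1))}

# toy aligned space: each source vector coincides with one target vector
src_emb = np.array([[1.0, 0.0], [0.0, 1.0]])
tgt_emb = np.array([[0.0, 1.0], [1.0, 0.0]])
d = induce_dictionary(src_emb, tgt_emb, ["a", "b"], ["x", "y"])
print(d)  # {'a': 'y', 'b': 'x'}
```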