Unsupervised Neural Machine Translation

ICLR 2018
Mikel Artetxe • Gorka Labaka • Eneko Agirre • Kyunghyun Cho

In spite of the recent success of neural machine translation (NMT) on standard benchmarks, the lack of large parallel corpora poses a major practical problem for many language pairs. There have been several proposals to alleviate this issue with, for instance, triangulation and semi-supervised learning techniques, but they still require a strong cross-lingual signal. In this work, we remove the need for parallel data entirely and propose a novel method to train an NMT system in a completely unsupervised manner, relying on nothing but monolingual corpora.
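The full paper attains this by training a slightly modified attentional encoder-decoder on the two monolingual corpora alone, alternating a denoising objective with iterative backtranslation. As a rough illustration of that training-loop structure (not the authors' implementation), here is a minimal runnable sketch in which the model class, its methods, and the toy corpora are all hypothetical stand-ins:

```python
import random

class ToyUNMT:
    """Hypothetical stand-in for a shared-encoder attentional seq2seq model."""

    def translate(self, sentence, direction):
        # Placeholder "translation": identity copy. A real model would decode
        # with the shared encoder and the target language's decoder.
        return sentence

    def train_step(self, src, tgt, direction):
        # Placeholder for one cross-entropy gradient step on the pair src -> tgt.
        pass

def add_noise(sentence, swap_prob=0.5):
    # Denoising corruption: randomly swap adjacent words so the
    # autoencoding objective cannot be solved by a trivial copy.
    words = sentence.split()
    for i in range(len(words) - 1):
        if random.random() < swap_prob:
            words[i], words[i + 1] = words[i + 1], words[i]
    return " ".join(words)

model = ToyUNMT()
mono_l1 = ["the cat sat on the mat"]          # monolingual corpus, language 1
mono_l2 = ["le chat est assis sur le tapis"]  # monolingual corpus, language 2

for epoch in range(10):
    # The two corpora are unaligned; iterate over each independently.
    for s1 in mono_l1:
        # Denoising: reconstruct the sentence from a corrupted copy of itself.
        model.train_step(add_noise(s1), s1, direction="l1->l1")
        # Backtranslation: translate with the current model, then learn to
        # recover the original sentence from that synthetic source.
        model.train_step(model.translate(s1, "l1->l2"), s1, direction="l2->l1")
    for s2 in mono_l2:
        model.train_step(add_noise(s2), s2, direction="l2->l2")
        model.train_step(model.translate(s2, "l2->l1"), s2, direction="l1->l2")
```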

Evaluation


| Task                | Dataset                | Model                                          | Metric     | Value | Global rank |
|---------------------|------------------------|------------------------------------------------|------------|-------|-------------|
| Machine Translation | WMT2014 English-French | Unsupervised attentional encoder-decoder + BPE | BLEU score | 14.36 | #31         |
| Machine Translation | WMT2015 English-German | Unsupervised attentional encoder-decoder + BPE | BLEU score | 6.89  | #6          |
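
For reference, BLEU figures like those above are typically computed at the corpus level; a minimal example with sacrebleu (not necessarily the exact scoring script used for these numbers) on toy data:

```python
import sacrebleu

hypotheses = ["the cat sat on the mat"]           # system outputs
references = [["the cat is sitting on the mat"]]  # one reference stream

# corpus_bleu aggregates n-gram statistics over the whole corpus; this toy
# single-sentence score will be low because of the 4-gram precision term.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```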