Unsupervised Neural Machine Translation Initialized by Unsupervised Statistical Machine Translation

30 Oct 2018 · Benjamin Marie, Atsushi Fujita

Recent work achieved remarkable results in training neural machine translation (NMT) systems in a fully unsupervised way, with new and dedicated architectures that rely on monolingual corpora only. In this work, we propose to define unsupervised NMT (UNMT) as NMT trained with the supervision of synthetic bilingual data. Our approach straightforwardly enables the use of state-of-the-art architectures proposed for supervised NMT by replacing human-made bilingual data with synthetic bilingual data for training. We propose to initialize the training of UNMT with synthetic bilingual data generated by unsupervised statistical machine translation (USMT). The UNMT system is then incrementally improved using back-translation. Our preliminary experiments show that our approach achieves a new state-of-the-art for unsupervised machine translation on the WMT16 German--English news translation task, for both translation directions.
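The pipeline described above — generate synthetic bilingual data with a seed translation model, then train a new model on those synthetic pairs — can be sketched with a deliberately tiny toy. Here word-for-word dictionary lookup stands in for the USMT seed and the NMT learner, and the corpora, lexicon entries, and function names are all illustrative assumptions, not the authors' implementation:

```python
from collections import Counter, defaultdict

# Toy monolingual English corpus (assumption: word-for-word-alignable toy language).
mono_en = [["the", "dog", "sleeps"],
           ["the", "cat", "sleeps"],
           ["the", "dog", "runs"]]

# Hypothetical "USMT" seed model: a noisy English->German word lexicon.
seed_en_de = {"the": "der", "dog": "hund", "cat": "katze",
              "sleeps": "schlaeft", "runs": "rennt"}

def translate(sentence, lexicon):
    # Word-for-word translation; unknown words are copied through unchanged.
    return [lexicon.get(w, w) for w in sentence]

def back_translate(mono_target, model_t2s):
    # Pair each monolingual target sentence with its machine-translated
    # source side, yielding synthetic (source, target) training pairs.
    return [(translate(s, model_t2s), s) for s in mono_target]

def train_word_model(pairs):
    # "Training": for each source word, keep the target word it is most
    # often aligned with (position-wise) in the synthetic bilingual data.
    counts = defaultdict(Counter)
    for src, tgt in pairs:
        for sw, tw in zip(src, tgt):
            counts[sw][tw] += 1
    return {sw: c.most_common(1)[0][0] for sw, c in counts.items()}

# One round: synthetic German->English data from the seed, then a new model.
synthetic_pairs = back_translate(mono_en, seed_en_de)
model_de_en = train_word_model(synthetic_pairs)
print(model_de_en["hund"], model_de_en["der"])
```

In the paper, each such round is performed with full USMT/NMT systems rather than lexicons, and the loop is repeated so that each direction's model is retrained on synthetic data produced by the other; this toy only shows the data-flow of a single iteration.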


Results

Task                              Dataset                 Model                          Metric  Value  Global Rank
Unsupervised Machine Translation  WMT2016 English-German  Synthetic bilingual data init  BLEU    20.0   #7
Unsupervised Machine Translation  WMT2016 German-English  Synthetic bilingual data init  BLEU    26.7   #5
