Unsupervised Neural Machine Translation with SMT as Posterior Regularization

14 Jan 2019 • Shuo Ren • Zhirui Zhang • Shujie Liu • Ming Zhou • Shuai Ma

Without a real bilingual corpus available, unsupervised Neural Machine Translation (NMT) typically requires pseudo-parallel data generated with the back-translation method for model training. However, due to weak supervision, the pseudo data inevitably contain noise and errors that are accumulated and reinforced in the subsequent training process, hurting translation performance. To address this issue, we introduce phrase-based Statistical Machine Translation (SMT) models, which are robust to noisy data, as posterior regularizations to guide the training of unsupervised NMT models in the iterative back-translation process. Our method starts from SMT models built with pre-trained language models and word-level translation tables inferred from cross-lingual embeddings.
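As a minimal illustration of the posterior-regularization idea (not the paper's exact objective), the sketch below shows how a noise-robust SMT distribution over target words could pull the NMT output distribution toward it during training. The toy distributions, the weight `lam`, and the exact form of the combined loss are assumptions made for this example only.

```python
import numpy as np

# Hypothetical toy distributions over a 5-word target vocabulary at one
# target position: p_nmt from the NMT model, q_smt from the SMT model.
p_nmt = np.array([0.50, 0.20, 0.15, 0.10, 0.05])
q_smt = np.array([0.60, 0.25, 0.05, 0.05, 0.05])

gold = 0   # index of the (pseudo-parallel) reference token
lam = 0.5  # hypothetical weight balancing the two loss terms

# Standard NMT training loss on the back-translated pseudo data.
nll = -np.log(p_nmt[gold])

# Regularization term: KL(q_smt || p_nmt) pulls the NMT posterior
# toward the SMT distribution, which is less affected by noise.
kl = np.sum(q_smt * np.log(q_smt / p_nmt))

loss = nll + lam * kl
print(f"NLL={nll:.3f}  KL={kl:.3f}  PR loss={loss:.3f}")
```

The intuition behind using SMT as the regularizing distribution, per the abstract, is that phrase tables and language models are less prone to amplifying the noise in back-translated pseudo data than a neural model trained on that same data.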

Full paper

Evaluation


| Task | Dataset | Model | Metric | Value | Global rank |
|---|---|---|---|---|---|
| Unsupervised Machine Translation | WMT2014 English-French | SMT as posterior regularization | BLEU | 29.5 | #3 |
| Unsupervised Machine Translation | WMT2014 English-German | SMT as posterior regularization | BLEU | 17.0 | #2 |
| Unsupervised Machine Translation | WMT2014 French-English | SMT as posterior regularization | BLEU | 28.9 | #3 |
| Unsupervised Machine Translation | WMT2014 German-English | SMT as posterior regularization | BLEU | 20.4 | #2 |
| Unsupervised Machine Translation | WMT2016 English-German | SMT as posterior regularization | BLEU | 21.7 | #3 |
| Unsupervised Machine Translation | WMT2016 German-English | SMT as posterior regularization | BLEU | 26.3 | #4 |