Phrase-Based & Neural Unsupervised Machine Translation

Machine translation systems achieve near human-level performance on some languages, yet their effectiveness strongly relies on the availability of large amounts of parallel sentences, which hinders their applicability to the majority of language pairs. This work investigates how to learn to translate when having access only to large monolingual corpora in each language. We propose two model variants, a neural and a phrase-based model. Both versions leverage a careful initialization of the parameters, the denoising effect of language models, and the automatic generation of parallel data by iterative back-translation. These models are significantly better than methods from the literature, while being simpler and having fewer hyper-parameters. On the widely used WMT'14 English-French and WMT'16 German-English benchmarks, our models respectively obtain 28.1 and 25.2 BLEU points without using a single parallel sentence, outperforming the state of the art by more than 11 BLEU points. On low-resource language pairs like English-Urdu and English-Romanian, our methods achieve even better results than semi-supervised and supervised approaches that leverage the few available bitexts. Our code for NMT and PBSMT is publicly available.
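
The denoising language-model objective mentioned in the abstract trains each model to reconstruct a sentence from a corrupted copy of it. The snippet below is a minimal sketch of the kind of corruption function typically used for such an objective (random word dropping plus a bounded local shuffle); the function name, drop probability, and shuffle window are illustrative assumptions, not values taken from this page.

```python
import random

def add_noise(tokens, p_drop=0.1, k_shuffle=3, rng=None):
    """Corrupt a (non-empty) tokenized sentence for a denoising objective.

    Two perturbations are applied:
      * each token is dropped with probability ``p_drop``;
      * tokens are locally shuffled so that no token moves much more than
        ``k_shuffle`` positions away from its original index.
    """
    rng = rng or random.Random()

    # 1) Randomly drop tokens (but always keep at least one).
    kept = [t for t in tokens if rng.random() >= p_drop] or [tokens[0]]

    # 2) Local shuffle: sort by (original index + random offset in [0, k_shuffle)),
    #    which only allows small displacements.
    keys = [i + rng.uniform(0, k_shuffle) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda pair: pair[0])]

if __name__ == "__main__":
    sentence = "the quick brown fox jumps over the lazy dog".split()
    print(add_noise(sentence, rng=random.Random(0)))
```

In the full training procedure, this kind of corruption feeds a denoising auto-encoding loss in each language, alternating with back-translation steps in which each direction's model translates monolingual data to produce synthetic parallel pairs for training the opposite direction.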

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Unsupervised Machine Translation | WMT2014 English-French | PBSMT + NMT | BLEU | 27.6 | #7 |
| Machine Translation | WMT2014 English-French | Unsupervised PBSMT | BLEU score | 28.11 | #51 |
| Machine Translation | WMT2014 English-French | Unsupervised NMT + Transformer | BLEU score | 25.14 | #55 |
| Machine Translation | WMT2014 English-French | PBSMT + NMT | BLEU score | 27.6 | #52 |
| Machine Translation | WMT2014 English-German | Unsupervised NMT + Transformer | BLEU score | 17.16 | #88 |
| Machine Translation | WMT2014 English-German | PBSMT + NMT | BLEU score | 20.23 | #82 |
| Machine Translation | WMT2014 English-German | Unsupervised PBSMT | BLEU score | 17.94 | #86 |
| Unsupervised Machine Translation | WMT2014 French-English | PBSMT + NMT | BLEU | 27.7 | #6 |
| Unsupervised Machine Translation | WMT2016 English-German | PBSMT + NMT | BLEU | 20.2 | #6 |
| Machine Translation | WMT2016 English-Romanian | Unsupervised NMT + Transformer | BLEU score | 21.18 | #18 |
| Machine Translation | WMT2016 English-Romanian | Unsupervised PBSMT | BLEU score | 21.33 | #17 |
| Machine Translation | WMT2016 English-Romanian | PBSMT + NMT | BLEU score | 25.13 | #16 |
| Machine Translation | WMT2016 English-Russian | Unsupervised PBSMT | BLEU score | 13.37 | #3 |
| Machine Translation | WMT2016 English-Russian | PBSMT + NMT | BLEU score | 13.76 | #2 |
| Machine Translation | WMT2016 English-Russian | Unsupervised NMT + Transformer | BLEU score | 7.98 | #4 |
| Unsupervised Machine Translation | WMT2016 German-English | PBSMT | BLEU | 25.2 | #7 |
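
The figures in the table are corpus-level BLEU scores against the WMT reference translations. As a point of reference only, the sketch below shows one common way to compute corpus BLEU with the sacrebleu library; the hypothesis/reference sentences are made up, and the tokenization and references used by the leaderboard or the paper may differ, so scores computed this way will not necessarily match the numbers above.

```python
# pip install sacrebleu
import sacrebleu

# Hypothetical system outputs and reference translations, one sentence per entry.
hypotheses = [
    "the cat sits on the mat",
    "he reads a book in the garden",
]
references = [
    "the cat is sitting on the mat",
    "he is reading a book in the garden",
]

# corpus_bleu expects the references as a list of reference streams
# (one stream per reference set), hence the extra list nesting.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```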

Methods


No methods listed for this paper.