An Effective Approach to Unsupervised Machine Translation

ACL 2019  ·  Mikel Artetxe, Gorka Labaka, Eneko Agirre

While machine translation has traditionally relied on large amounts of parallel corpora, a recent research line has managed to train both Neural Machine Translation (NMT) and Statistical Machine Translation (SMT) systems using monolingual corpora only. In this paper, we identify and address several deficiencies of existing unsupervised SMT approaches by exploiting subword information, developing a theoretically well-founded unsupervised tuning method, and incorporating a joint refinement procedure. Moreover, we use our improved SMT system to initialize a dual NMT model, which is further fine-tuned through on-the-fly back-translation. Together, we obtain large improvements over the previous state-of-the-art in unsupervised machine translation. For instance, we get 22.5 BLEU points in English-to-German WMT 2014, 5.5 points more than the previous best unsupervised system, and 0.5 points more than the (supervised) shared task winner back in 2014.
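To make the fine-tuning stage concrete, below is a minimal sketch of on-the-fly back-translation for a dual NMT model: each direction generates synthetic source sentences for the other at training time, and each model is trained to reconstruct the real monolingual text from those synthetic inputs. The model objects (`fwd_model`, `bwd_model`) and their `translate`/`train_step` methods are hypothetical stand-ins for illustration, not the paper's actual implementation.

```python
# Sketch of on-the-fly back-translation in a dual NMT setup.
# All model objects and their methods are hypothetical placeholders.

def backtranslation_step(fwd_model, bwd_model, mono_tgt_batch):
    """One update for fwd_model using target-side monolingual text.

    The reverse model translates the monolingual batch into synthetic
    source sentences; the forward model is then trained to map those
    synthetic sources back to the real monolingual text.
    """
    # 1. Generate synthetic sources with the current reverse model.
    #    No gradients flow through this generation step.
    synthetic_src = bwd_model.translate(mono_tgt_batch)

    # 2. Train the forward model on the (synthetic source, real target) pair.
    loss = fwd_model.train_step(src=synthetic_src, tgt=mono_tgt_batch)
    return loss


def train_dual(src2tgt, tgt2src, mono_src_batches, mono_tgt_batches):
    """Jointly fine-tune both directions, "on the fly": each model's
    back-translations are produced by the other model's current state,
    so the synthetic data improves as training progresses."""
    for src_batch, tgt_batch in zip(mono_src_batches, mono_tgt_batches):
        loss_s2t = backtranslation_step(src2tgt, tgt2src, tgt_batch)
        loss_t2s = backtranslation_step(tgt2src, src2tgt, src_batch)
```

In the paper's pipeline, the improved unsupervised SMT system supplies the initial translations that warm-start this dual NMT training, which is what allows the loop above to bootstrap from monolingual corpora only.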

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Unsupervised Machine Translation | WMT2014 English-French | SMT + NMT (tuning and joint refinement) | BLEU | 36.2 | #3 |
| Unsupervised Machine Translation | WMT2014 English-German | SMT + NMT (tuning and joint refinement) | BLEU | 22.5 | #1 |
| Unsupervised Machine Translation | WMT2014 French-English | SMT + NMT (tuning and joint refinement) | BLEU | 33.5 | #3 |
| Unsupervised Machine Translation | WMT2014 German-English | SMT + NMT (tuning and joint refinement) | BLEU | 27.0 | #1 |
| Unsupervised Machine Translation | WMT2016 English-German | SMT + NMT (tuning and joint refinement) | BLEU | 26.9 | #3 |
| Unsupervised Machine Translation | WMT2016 German-English | SMT + NMT (tuning and joint refinement) | BLEU | 34.4 | #3 |

Methods


No methods listed for this paper.