Unsupervised machine translation is the task of translating between languages without any translation resources (such as parallel corpora or bilingual dictionaries) at training time.
While machine translation has traditionally relied on large amounts of parallel corpora, a recent research line has managed to train both Neural Machine Translation (NMT) and Statistical Machine Translation (SMT) systems using monolingual corpora only.
In this work, we propose to define unsupervised NMT (UNMT) as NMT trained with the supervision of synthetic bilingual data.
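The core mechanism behind such "synthetic bilingual data" is back-translation: a model translating in one direction generates synthetic source sentences for real target-side monolingual text, and those pairs supervise the model in the other direction. Below is a minimal sketch of that data flow; the `toy_fr_to_en` dictionary and `translate_fr_to_en` stub are purely illustrative stand-ins for a trained model, not part of any cited system.

```python
# Sketch of back-translation, the mechanism that turns monolingual
# corpora into synthetic bilingual training data for unsupervised NMT.

def back_translate(mono_tgt, translate_tgt_to_src):
    """Create synthetic (source, target) pairs from target-side
    monolingual sentences: the model's own translation supplies the
    source side, the real sentence supplies the target side."""
    return [(translate_tgt_to_src(sent), sent) for sent in mono_tgt]

# Toy stand-in "model": word-for-word dictionary lookup.
# A real system would use a learned NMT or SMT model here.
toy_fr_to_en = {"le": "the", "chat": "cat", "dort": "sleeps"}

def translate_fr_to_en(sentence):
    return " ".join(toy_fr_to_en.get(w, w) for w in sentence.split())

mono_fr = ["le chat dort"]
# These (English, French) pairs could then train an English-to-French
# model; alternating directions gives iterative back-translation.
synthetic_pairs = back_translate(mono_fr, translate_fr_to_en)
print(synthetic_pairs)  # [('the cat sleeps', 'le chat dort')]
```

In practice the two directions are trained jointly and the synthetic data is regenerated as the models improve, so the supervision signal bootstraps itself from monolingual text alone.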
Language style transfer rephrases text with specific stylistic attributes while preserving the original attribute-independent content.
Unsupervised machine translation, which assumes no cross-lingual supervision signal whatsoever (no dictionary, no translations, no comparable corpora), seems impossible; nevertheless, Lample et al. (2018) recently proposed a fully unsupervised machine translation (MT) model.