Multilingual Neural Machine Translation With Soft Decoupled Encoding

ICLR 2019 · Xinyi Wang • Hieu Pham • Philip Arthur • Graham Neubig

Multilingual training of neural machine translation (NMT) systems has led to impressive accuracy improvements on low-resource languages. However, it remains challenging to learn word representations efficiently when data is scarce. In this paper, we propose Soft Decoupled Encoding (SDE), a multilingual lexicon encoding framework specifically designed to share lexical-level information intelligently without requiring heuristic preprocessing, such as pre-segmenting the data.
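For illustration only, here is a minimal sketch (not the authors' code) of the kind of lexical encoder the abstract describes: a word's spelling is encoded from character n-grams, while its meaning is drawn from a latent embedding space shared across languages, so no pre-segmentation of the data is needed. The class name, n-gram hashing scheme, dimensions, and the exact way the two signals are combined are assumptions made for the sake of a runnable example, not the paper's precise formulation.

```python
# Illustrative sketch of a soft decoupled lexical encoder (assumptions, not the paper's code):
# spelling is captured by hashed character n-gram embeddings, semantics by attention
# over a latent embedding matrix shared across all languages.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftDecoupledLexicalEncoder(nn.Module):
    def __init__(self, n_gram_buckets=10000, latent_size=1000, dim=512):
        super().__init__()
        # Spelling: embeddings for hashed character n-grams (language-specific surface form).
        self.ngram_emb = nn.Embedding(n_gram_buckets, dim)
        # Semantics: latent embedding matrix shared by all languages.
        self.latent_emb = nn.Parameter(torch.randn(latent_size, dim) * 0.01)
        self.n_gram_buckets = n_gram_buckets

    def _ngram_ids(self, word, n_values=(1, 2, 3, 4)):
        # Hash each character n-gram of the word into a fixed number of buckets.
        ids = [hash(word[i:i + n]) % self.n_gram_buckets
               for n in n_values
               for i in range(len(word) - n + 1)]
        return torch.tensor(ids, dtype=torch.long)

    def forward(self, word: str) -> torch.Tensor:
        # 1) Lexical (spelling) embedding: sum of character n-gram embeddings.
        c_w = self.ngram_emb(self._ngram_ids(word)).sum(dim=0)   # (dim,)
        # 2) Latent semantic embedding: attention over the shared latent space.
        scores = F.softmax(self.latent_emb @ c_w, dim=0)         # (latent_size,)
        e_latent = scores @ self.latent_emb                      # (dim,)
        # 3) Combine spelling and shared semantics (residual-style mix).
        return e_latent + c_w


encoder = SoftDecoupledLexicalEncoder()
print(encoder("bonjour").shape)  # torch.Size([512])
```

Because the shared latent matrix is queried softly by the spelling encoding, words with similar surface forms in related languages can map to nearby semantic embeddings without any language-specific segmentation step.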
