A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation

ACL 2016 · Junyoung Chung, Kyunghyun Cho, Yoshua Bengio

Existing machine translation systems, whether phrase-based or neural, have relied almost exclusively on word-level modelling with explicit segmentation. In this paper, we ask a fundamental question: can neural machine translation generate a character sequence without any explicit segmentation? To answer this question, we evaluate an attention-based encoder-decoder with a subword-level encoder and a character-level decoder on four language pairs (En-Cs, En-De, En-Ru, and En-Fi) using the parallel corpora from WMT'15. Our experiments show that the models with a character-level decoder outperform the ones with a subword-level decoder on all four language pairs. Furthermore, the ensembles of neural models with a character-level decoder outperform the state-of-the-art non-neural machine translation systems on En-Cs, En-De, and En-Fi, and perform comparably on En-Ru.
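
To make the setup concrete, below is a minimal PyTorch sketch of the kind of model the abstract describes: a bidirectional GRU encoder over BPE subword units and an attentive decoder that emits one character per step. This is an illustration under assumptions, not the authors' implementation: the paper's character-level decoder is a deeper recurrent network (including a bi-scale variant), whereas this sketch uses a single GRU cell, and all class names, layer sizes, and the additive-style attention are chosen for brevity.

```python
# Minimal sketch (not the paper's exact architecture): subword-level
# encoder + character-level decoder with attention. All names and
# dimensions are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SubwordEncoder(nn.Module):
    """Bidirectional GRU over a BPE-segmented source sentence."""

    def __init__(self, bpe_vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(bpe_vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)

    def forward(self, src_ids):
        # src_ids: (batch, src_len) of BPE token ids
        emb = self.embed(src_ids)
        annotations, _ = self.rnn(emb)  # (batch, src_len, 2 * hid_dim)
        return annotations


class CharDecoder(nn.Module):
    """GRU cell that emits one target character per step, attending over
    the subword-level source annotations (no target segmentation)."""

    def __init__(self, char_vocab_size, emb_dim=128, hid_dim=512, enc_dim=512):
        super().__init__()
        self.embed = nn.Embedding(char_vocab_size, emb_dim)
        self.attn_score = nn.Linear(hid_dim + enc_dim, 1)
        self.rnn_cell = nn.GRUCell(emb_dim + enc_dim, hid_dim)
        self.out = nn.Linear(hid_dim, char_vocab_size)

    def forward(self, prev_char, hidden, annotations):
        # prev_char: (batch,) id of the previously generated character
        # hidden: (batch, hid_dim) current decoder state
        # annotations: (batch, src_len, enc_dim) encoder outputs
        batch, src_len, _ = annotations.shape
        # Score every source position against the decoder state,
        # then form the context as an attention-weighted sum.
        h_exp = hidden.unsqueeze(1).expand(batch, src_len, -1)
        scores = self.attn_score(torch.cat([h_exp, annotations], dim=-1))
        weights = F.softmax(scores.squeeze(-1), dim=-1)  # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), annotations).squeeze(1)
        # One character-level step of the decoder GRU.
        emb = self.embed(prev_char)
        hidden = self.rnn_cell(torch.cat([emb, context], dim=-1), hidden)
        logits = self.out(hidden)  # distribution over the character vocabulary
        return logits, hidden
```

Training would minimize character-level cross-entropy over the target sequence; at test time the cell is run greedily or with beam search, one character at a time, until an end-of-sentence character is produced, so the target side requires no word or subword segmentation at all.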


Results

Task                | Dataset                | Model              | Metric     | Value | Global Rank
Machine Translation | WMT2015 English-German | Enc-Dec Att (BPE)  | BLEU score | 21.7  | #5
Machine Translation | WMT2015 English-German | Enc-Dec Att (char) | BLEU score | 23.5  | #3
