Convolutional Sequence to Sequence Learning

ICML 2017
Jonas Gehring • Michael Auli • David Grangier • Denis Yarats • Yann N. Dauphin

The prevalent approach to sequence-to-sequence learning maps an input sequence to a variable-length output sequence via recurrent neural networks. We introduce an architecture based entirely on convolutional neural networks. Compared to recurrent models, computations over all elements can be fully parallelized during training, and optimization is easier since the number of non-linearities is fixed and independent of the input length.
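Concretely, ConvS2S replaces recurrence with stacked convolutional blocks, each combining a 1-D convolution with a gated linear unit (GLU) and a scaled residual connection, so a fixed stack of layers is applied to every position of the sequence at once. The sketch below is a minimal PyTorch rendering of one such encoder-style block under assumed hyperparameters (the dimensions, kernel size, and class name are illustrative, not the paper's exact configuration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvGLUBlock(nn.Module):
    """One convolutional block with GLU gating and a residual connection,
    in the spirit of the ConvS2S encoder. Hyperparameters are assumptions
    for illustration, not the paper's settings."""
    def __init__(self, dim=256, kernel_size=3):
        super().__init__()
        # The convolution emits 2*dim channels; the GLU halves them back to dim.
        self.conv = nn.Conv1d(dim, 2 * dim, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x):
        # x: (batch, dim, seq_len). Every position is convolved at once,
        # with no step-by-step dependency as in an RNN.
        residual = x
        x = F.glu(self.conv(x), dim=1)        # gated linear unit
        return (x + residual) * (0.5 ** 0.5)  # residual scaled by sqrt(0.5)

# Toy usage: a stack of blocks processes the whole sequence in parallel;
# the number of non-linearities equals the number of blocks, independent
# of the input length.
encoder = nn.Sequential(*[ConvGLUBlock() for _ in range(4)])
tokens = torch.randn(8, 256, 50)   # (batch, channels, positions)
out = encoder(tokens)              # same shape: (8, 256, 50)
```

Because the depth of this stack, rather than the sequence length, determines how many non-linearities an input passes through, the effective path length is fixed, which is the optimization property the abstract refers to.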


Evaluation

| Task | Dataset | Model | Metric | Value | Global rank |
| --- | --- | --- | --- | --- | --- |
| Machine Translation | IWSLT2015 English-German | ConvS2S | BLEU | 26.73 | #5 |
| Machine Translation | IWSLT2015 German-English | ConvS2S | BLEU | 32.31 | #8 |
| Machine Translation | WMT2014 English-French | ConvS2S | BLEU | 40.46 | #12 |
| Machine Translation | WMT2014 English-French | ConvS2S (ensemble) | BLEU | 41.29 | #8 |
| Machine Translation | WMT2014 English-German | ConvS2S | BLEU | 25.16 | #19 |
| Machine Translation | WMT2014 English-German | ConvS2S (ensemble) | BLEU | 26.36 | #15 |
| Machine Translation | WMT2016 English-Romanian | ConvS2S BPE40k | BLEU | 29.88 | #1 |