Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

EMNLP 2018  ·  Jason Lee, Elman Mansimov, Kyunghyun Cho

We propose a conditional non-autoregressive neural sequence model based on iterative refinement. The proposed model is designed based on the principles of latent variable models and denoising autoencoders, and is generally applicable to any sequence generation task. We extensively evaluate the proposed model on machine translation (En-De and En-Ro) and image caption generation, and observe that it significantly speeds up decoding while maintaining the generation quality comparable to the autoregressive counterpart.
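The core idea is that decoding is not left-to-right: the model first emits an entire draft translation in one parallel pass, then repeatedly denoises that draft for a fixed (small) number of passes, so latency stays constant in the target length. Below is a minimal sketch of this iterative-refinement decoding loop, assuming hypothetical `decoder` and `length_predictor` modules; it illustrates the procedure rather than reproducing the authors' reference implementation.

```python
import torch

@torch.no_grad()
def iterative_refinement_decode(encoder_out, decoder, length_predictor,
                                pad_id, num_iterations=10):
    """Non-autoregressive decoding by repeated denoising of a full draft.

    1. Predict the target length and emit an initial draft in one parallel pass.
    2. Repeatedly feed the previous draft back into the decoder (a denoising
       step), keeping the new argmax tokens, for a fixed number of iterations.
    """
    # Step 1: predict the target length, then generate an initial draft
    # conditioned only on the source (all positions filled in parallel).
    target_length = length_predictor(encoder_out)              # (batch,)
    draft = torch.full((encoder_out.size(0), int(target_length.max())),
                       pad_id, dtype=torch.long)
    logits = decoder(draft, encoder_out)                        # (batch, len, vocab)
    draft = logits.argmax(dim=-1)

    # Step 2: iterative refinement; each pass denoises the previous draft.
    for _ in range(num_iterations):
        logits = decoder(draft, encoder_out)
        new_draft = logits.argmax(dim=-1)
        if torch.equal(new_draft, draft):                       # stop early if converged
            break
        draft = new_draft
    return draft
```

Because every refinement step updates all positions in parallel, the number of decoder passes is a tunable constant that trades generation quality against decoding speed, rather than growing with sentence length as in an autoregressive model.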

| Task                | Dataset                    | Model                                      | Metric     | Value | Global Rank |
|---------------------|----------------------------|--------------------------------------------|------------|-------|-------------|
| Machine Translation | IWSLT2015 English-German   | Denoising autoencoders (non-autoregressive) | BLEU score | 27.01 | # 5         |
| Machine Translation | IWSLT2015 German-English   | Denoising autoencoders (non-autoregressive) | BLEU score | 32.43 | # 5         |
| Machine Translation | WMT2014 English-German     | Denoising autoencoders (non-autoregressive) | BLEU score | 21.54 | # 76        |
| Machine Translation | WMT2014 German-English     | Denoising autoencoders (non-autoregressive) | BLEU score | 25.43 | # 12        |
| Machine Translation | WMT2016 English-Romanian   | Denoising autoencoders (non-autoregressive) | BLEU score | 29.66 | # 10        |
| Machine Translation | WMT2016 Romanian-English   | Denoising autoencoders (non-autoregressive) | BLEU score | 30.30 | # 18        |
