Deconvolution-Based Global Decoding for Neural Machine Translation

COLING 2018. Junyang Lin, Xu Sun, Xuancheng Ren, Shuming Ma, Jinsong Su, Qi Su

A great proportion of sequence-to-sequence (Seq2Seq) models for Neural Machine Translation (NMT) use a Recurrent Neural Network (RNN) to generate the translation word by word in a sequential order. Since linguistic studies have shown that language is not a linear word sequence but a sequence with complex structure, translation at each step should be conditioned on the whole target-side context...
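The core idea named in the title is to use deconvolution (transposed convolution) to expand a source-side encoding into a longer target-side representation that can guide decoding globally rather than strictly left to right. As a minimal illustration of the underlying operation only (not the paper's actual model, whose architecture and hyperparameters are not given in this excerpt), the sketch below implements a single-channel 1-D transposed convolution in NumPy; the function name, kernel, and stride are hypothetical choices for the example.

```python
import numpy as np

def conv_transpose_1d(x, kernel, stride=2):
    """Minimal single-channel 1-D transposed convolution (no padding).

    x: shape (T,) input sequence (e.g. a source-side summary).
    kernel: shape (K,) filter.
    Returns an upsampled sequence of length (T - 1) * stride + K,
    built by scattering each scaled copy of the kernel into the output.
    """
    T, K = len(x), len(kernel)
    out = np.zeros((T - 1) * stride + K)
    for t in range(T):
        # Each input step contributes a kernel-shaped "patch" to the output.
        out[t * stride : t * stride + K] += x[t] * kernel
    return out

# Toy example: expand a 3-step sequence into a length-7 "global" map.
src = np.array([1.0, 2.0, 3.0])
kern = np.array([0.5, 1.0, 0.5])
glob = conv_transpose_1d(src, kern, stride=2)
print(glob)  # [0.5 1.  1.5 2.  2.5 3.  1.5]
```

Because overlapping patches are summed, each output position mixes information from several input steps, which is what lets a deconvolutional layer produce a dense target-side representation from a shorter encoding.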

TASK                 DATASET                       MODEL      METRIC  VALUE  GLOBAL RANK
Machine Translation  IWSLT2015 English-Vietnamese  DeconvDec  BLEU    28.47  #7
