In this work, we introduce a model and beam-search training scheme, based on the work of Daumé III and Marcu (2005), that extends seq2seq to learn global sequence scores.
This paper explores the task of translating natural language queries into regular expressions which embody their meaning.
Approaches to multimodal pooling include element-wise product or sum, as well as concatenation of the visual and textual representations.
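The pooling operations named above can be sketched in a few lines; this is a minimal illustration with NumPy, where the feature vectors and variable names are purely hypothetical, not taken from any particular model.

```python
import numpy as np

# Illustrative visual and textual feature vectors (values are made up).
visual = np.array([0.2, 0.5, 0.1, 0.7])
textual = np.array([0.4, 0.3, 0.9, 0.6])

# Element-wise product: keeps the dimensionality, emphasizes features
# that are active in both modalities.
pooled_product = visual * textual

# Element-wise sum: also keeps the dimensionality; additive fusion.
pooled_sum = visual + textual

# Concatenation: doubles the dimensionality, preserving each modality's
# representation separately for a downstream layer to combine.
pooled_concat = np.concatenate([visual, textual])

print(pooled_product.shape, pooled_sum.shape, pooled_concat.shape)
```

Note the trade-off: product and sum keep the fused representation compact, while concatenation defers the interaction between modalities to later layers at the cost of a larger input dimension.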
Ensembling and unknown-word replacement add another 2 BLEU, which brings the NMT performance on low-resource machine translation close to that of a strong syntax-based machine translation (SBMT) system, exceeding it on one language pair.
Recent neural models of dialogue generation offer great promise for generating responses for conversational agents, but tend to be shortsighted, predicting utterances one at a time while ignoring their influence on future outcomes.