Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.
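To make the "more complex computational unit" concrete, here is a minimal NumPy sketch of a single LSTM step; the parameter names `W`, `U`, `b` and the gate stacking order are our own illustrative conventions, not taken from any of the papers summarized here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W (4H x D), U (4H x H), and b (4H,) hold the
    stacked parameters for the input (i), forget (f), and output (o)
    gates and the candidate cell update (g)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # all four pre-activations at once
    i = sigmoid(z[0 * H:1 * H])          # input gate: admit new information
    f = sigmoid(z[1 * H:2 * H])          # forget gate: retain old cell state
    o = sigmoid(z[2 * H:3 * H])          # output gate: expose the cell state
    g = np.tanh(z[3 * H:4 * H])          # candidate cell update
    c = f * c_prev + i * g               # cell state: the long-range memory
    h = o * np.tanh(c)                   # hidden state emitted at this step
    return h, c
```

Because the forget gate can stay close to 1, the cell state `c` changes only gradually across steps, which is what lets LSTMs preserve sequence information over time far better than a plain recurrent unit.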
Models trained with the proposed approach are empirically found to outperform both the baseline models that use a small vocabulary and the LSTM-based neural machine translation models.
Our experiments on the WMT'14 English-to-French translation task show that this method provides a substantial improvement of up to 2.8 BLEU points over an equivalent NMT system that does not use this technique.
We propose a technique for learning representations of parser states in transition-based dependency parsers.
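For readers unfamiliar with the term, a "parser state" in a transition-based dependency parser is the stack/buffer/arc-set configuration that the learned representation must summarize. The sketch below uses the standard arc-standard transition system; the class and function names are illustrative and not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class ParserState:
    """Arc-standard parser state: a stack of partially processed words,
    a buffer of unread words, and the dependency arcs built so far."""
    stack: list = field(default_factory=list)
    buffer: list = field(default_factory=list)
    arcs: list = field(default_factory=list)   # (head, dependent) pairs

def shift(s: ParserState) -> None:
    # Move the next buffer word onto the stack.
    s.stack.append(s.buffer.pop(0))

def left_arc(s: ParserState) -> None:
    # The second-from-top stack word becomes a dependent of the top word.
    dep = s.stack.pop(-2)
    s.arcs.append((s.stack[-1], dep))

def right_arc(s: ParserState) -> None:
    # The top stack word becomes a dependent of the word below it.
    dep = s.stack.pop()
    s.arcs.append((s.stack[-1], dep))
```

The parser repeatedly scores these transitions from a representation of the current state and applies the best one until the buffer is empty and a single tree remains.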
We propose the Neural Responding Machine (NRM), a neural network-based response generator for Short-Text Conversation.