In this paper, we introduce a synchronous bidirectional neural machine translation (SB-NMT) model that predicts its output using left-to-right and right-to-left decoding simultaneously and interactively, in order to leverage both history and future information at the same time.
We propose human-in-the-loop adversarial generation, where human authors are guided to break models.
This paper presents an unsupervised framework for jointly modeling topic content and discourse behavior in microblog conversations.
Building meaningful phrase representations is challenging because phrase meanings are not simply the sum of their constituent meanings.
Our approach decouples learning the transformation from the source language to the target language into (a) learning rotations for language-specific embeddings to align them to a common space, and (b) learning a similarity metric in the common space to model similarities between the embeddings.
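Step (a) above, aligning language-specific embeddings into a common space via a learned rotation, can be sketched with the classic orthogonal Procrustes solution. This is a minimal NumPy illustration under that assumption (the paper's exact training procedure and the metric-learning step (b) are not shown); the toy data is synthetic:

```python
import numpy as np

def learn_rotation(X, Y):
    """Learn an orthogonal map W minimizing ||X W - Y||_F, where rows of
    X and Y are paired source/target word embeddings (orthogonal
    Procrustes: SVD of X^T Y = U S V^T gives W = U V^T)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy check: if Y is an exact rotation of X, alignment recovers it.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))            # 50 "source" embeddings, dim 4
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))  # a random orthogonal map
Y = X @ Q                                # "target" embeddings
W = learn_rotation(X, Y)
print(np.allclose(X @ W, Y))  # True
```

With both sides rotated into the common space, similarities between embeddings can then be scored by a learned metric (or, as a naive stand-in, cosine similarity).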
It is intuitive that semantic representations can be useful for machine translation, mainly because they help enforce meaning preservation and handle the data sparsity of machine translation models (many sentences correspond to one meaning).
From our extensive evaluation of 20 architectures, we report a best score of 71.6% F1 for the segmentation and classification of 30 topics from the English city domain, achieved by our SECTOR LSTM model with bloom filter embeddings and bidirectional segmentation.
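The "bloom filter embedding" idea mentioned above can be sketched as hashing each word into a fixed-size bit vector with several hash functions, giving a compact, vocabulary-free input representation. A minimal illustration, assuming MD5-based hashing and illustrative sizes (not the paper's actual settings):

```python
import hashlib
import numpy as np

def bloom_embedding(word, m=64, k=3):
    """Hash a word into an m-bit indicator vector using k seeded hash
    functions (a sketch of bloom-filter-style word embeddings; m, k,
    and the MD5 hash are illustrative assumptions)."""
    vec = np.zeros(m, dtype=np.float32)
    for i in range(k):
        h = hashlib.md5(f"{i}:{word}".encode()).hexdigest()
        vec[int(h, 16) % m] = 1.0
    return vec

v = bloom_embedding("city")
print(v.shape)  # (64,) -- at most k bits are set; hashing is deterministic
```

Because the representation depends only on hashes, unseen words still get a well-defined vector, at the cost of occasional hash collisions.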
In this work, we propose an end-to-end trainable method for neural goal-oriented dialog systems that handles new user behaviors at deployment by intelligently transferring the dialog to a human agent.
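A toy stand-in for such a transfer decision is a confidence threshold on the model's response distribution: when the dialog model is unsure (e.g. the user behavior is unfamiliar), the turn is routed to a human. This sketch assumes softmax-scored candidate responses and an illustrative threshold; the paper's decision is learned end-to-end, not a fixed rule:

```python
import numpy as np

def softmax(logits):
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def route_turn(logits, threshold=0.6):
    """Score candidate responses and hand the dialog to a human agent
    when the top response probability falls below the threshold
    (a simplified, hypothetical stand-in for a learned transfer policy)."""
    probs = softmax(np.asarray(logits, dtype=float))
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return ("human", None)
    return ("model", best)

print(route_turn([3.0, 0.1, 0.2]))  # confident -> ('model', 0)
print(route_turn([0.5, 0.4, 0.6]))  # uncertain -> ('human', None)
```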