This paper proposes to tackle open-domain question answering using Wikipedia as the unique knowledge source: the answer to any factoid question is a text span in a Wikipedia article.
#115 best model for Question Answering on SQuAD1.1
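To make the "answer is a text span" setup concrete, here is a minimal sketch (not the paper's model; the helper name and example text are illustrative) of locating an answer as character offsets inside an article:

```python
# Toy illustration of span extraction: the answer to a factoid question is
# represented as a character span inside an article. Hypothetical helper,
# not the paper's neural reader.
def find_answer_span(article: str, answer: str):
    """Return (start, end) character offsets of the answer span, or None."""
    start = article.find(answer)
    if start == -1:
        return None
    return (start, start + len(answer))

article = "Warsaw is the capital and largest city of Poland."
span = find_answer_span(article, "Poland")
print(span)  # (42, 48)
print(article[span[0]:span[1]])  # "Poland"
```

A real system scores every candidate span with a neural reader rather than exact string matching, but the output format is the same pair of offsets.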
The prevalent approach to neural machine translation relies on bi-directional LSTMs to encode the source sentence.
#6 best model for Machine Translation on WMT2016 English-Romanian
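A minimal NumPy sketch of the bidirectional-LSTM encoding mentioned above, assuming the standard formulation (run one LSTM left-to-right, another right-to-left, and concatenate the hidden states per position); all dimensions and parameter names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations stacked as [input, forget, output, candidate]."""
    z = W @ x + U @ h + b
    H = h.size
    i, f, o = (1 / (1 + np.exp(-z[k * H:(k + 1) * H])) for k in range(3))
    g = np.tanh(z[3 * H:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def run_lstm(xs, H, params):
    """Run an LSTM over a (seq_len, input_dim) sequence, return all hidden states."""
    W, U, b = params
    h, c = np.zeros(H), np.zeros(H)
    states = []
    for x in xs:
        h, c = lstm_cell(x, h, c, W, U, b)
        states.append(h)
    return np.stack(states)

def bilstm_encode(xs, H, fwd_params, bwd_params):
    """Concatenate a forward pass with a reversed backward pass at each position."""
    fwd = run_lstm(xs, H, fwd_params)
    bwd = run_lstm(xs[::-1], H, bwd_params)[::-1]
    return np.concatenate([fwd, bwd], axis=1)  # (seq_len, 2 * H)

D, H, T = 4, 3, 5  # input dim, hidden dim, sequence length (arbitrary toy sizes)
make_params = lambda: (rng.normal(size=(4 * H, D)) * 0.1,
                       rng.normal(size=(4 * H, H)) * 0.1,
                       np.zeros(4 * H))
xs = rng.normal(size=(T, D))
enc = bilstm_encode(xs, H, make_params(), make_params())
print(enc.shape)  # (5, 6): one 2H-dim vector per source token
```

Each source position thus gets a representation informed by both its left and right context, which is what the encoder feeds to the decoder's attention.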
End-to-end learning of recurrent neural networks (RNNs) is an attractive solution for dialog systems; however, current techniques are data-intensive and require thousands of dialogs to learn simple behaviors.
We describe an open-source toolkit for neural machine translation (NMT).
Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text).
#2 best model for Abstractive Text Summarization on CNN / Daily Mail
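The extractive/abstractive contrast drawn above can be made concrete with a toy purely extractive baseline, which by construction can only copy sentences from the source (the restriction that abstractive sequence-to-sequence models lift); the scoring heuristic and names are illustrative, not any published system:

```python
from collections import Counter

def extractive_summary(text: str, k: int = 1):
    """Pick the k sentences whose words are most frequent in the document --
    a purely extractive summarizer: it can only copy, never paraphrase."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    doc_counts = Counter(text.lower().split())
    def score(sentence):
        return sum(doc_counts[w] for w in set(sentence.lower().split()))
    return sorted(sentences, key=score, reverse=True)[:k]

doc = ("The storm closed the harbor. Ships waited offshore. "
       "The harbor reopened after the storm passed.")
summary = extractive_summary(doc, k=1)
print(summary)
```

An abstractive model, in contrast, generates the summary token by token and may produce sentences that appear nowhere in the source.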
We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations.
#2 best model for Predicate Detection on CoNLL 2005
Harnessing the statistical power of neural networks to perform language understanding and symbolic reasoning is difficult when it requires executing efficient discrete operations against a large knowledge base.
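The discrete operations the abstract refers to can be illustrated with a toy knowledge base of (subject, relation, object) triples and a hand-written lookup; such lookups are non-differentiable, which is exactly why combining them with neural training is hard. All entity and relation names here are illustrative:

```python
# Toy knowledge base of (subject, relation) -> object facts, and a discrete
# "hop" operation -- the kind of non-differentiable symbolic step that is
# difficult to train a neural network to invoke. Names are illustrative.
KB = {
    ("Paris", "capital_of"): "France",
    ("France", "continent"): "Europe",
    ("Berlin", "capital_of"): "Germany",
}

def hop(entity, relation):
    """Follow one relation edge in the KB; a discrete, non-differentiable lookup."""
    return KB.get((entity, relation))

# Two chained discrete operations answer
# "which continent is the country whose capital is Paris on?"
country = hop("Paris", "capital_of")
print(hop(country, "continent"))  # Europe
```

A neural model must learn to select which operations to execute and with which arguments, even though the operations themselves provide no gradient signal.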