We present a memory-augmented neural network for natural language understanding: Neural Semantic Encoders.
#8 best model for Question Answering on WikiQA
We study the topmost weight matrix of neural network language models.
We present a new parallel corpus, the JHU FLuency-Extended GUG corpus (JFLEG), for developing and evaluating grammatical error correction (GEC).
Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to some compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data.
The fundamental role of hypernymy in NLP has motivated the development of many methods for automatically identifying this relation, most of which rely on distributional representations of words.
#7 best model for Hypernym Discovery on Music domain