We present Neural Semantic Encoders, a memory-augmented neural network for natural language understanding.
We study the topmost weight matrix of neural network language models.
We approach the recognition of textual entailment using logical semantic representations and a theorem prover.
In this work, we investigate several neural network architectures for fine-grained entity type classification.
We describe EmoBank, a corpus of 10k English sentences balancing multiple genres, which we annotated with dimensional emotion metadata in the Valence-Arousal-Dominance (VAD) representation format.
We present a new parallel corpus, the JHU FLuency-Extended GUG corpus (JFLEG), for developing and evaluating grammatical error correction (GEC).
Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to some compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data.