We present a memory-augmented neural network for natural language understanding: Neural Semantic Encoders.
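For intuition, the sketch below shows the read-compose-write loop that characterizes memory-augmented encoders of this kind. It is a minimal illustration assuming a dot-product read, a one-layer compose function, and a soft erase-and-add write; the parameterization and update rule are stand-ins, not NSE's exact specification.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_encode(inputs, memory, compose_W):
    """Generic read-compose-write encoding pass (illustrative only).

    inputs:    (T, d) sequence of token vectors
    memory:    (k, d) external memory of k slots
    compose_W: (2d, d) weights of an assumed one-layer compose function
    """
    outputs = []
    for x in inputs:
        attn = softmax(memory @ x)                        # read: attention over slots
        m = attn @ memory                                 # retrieved memory summary
        c = np.tanh(np.concatenate([x, m]) @ compose_W)   # compose token with memory
        memory = (1 - attn[:, None]) * memory + attn[:, None] * c  # write back
        outputs.append(c)
    return np.stack(outputs), memory
```

The write step softly erases the slots that were just read and blends in the composed vector, which is what lets the memory evolve as the sequence is processed.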
The dominant approaches for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks.
We study the topmost weight matrix of neural network language models.
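The topmost weight matrix is the output projection that turns a language model's final hidden state into vocabulary logits, and its rows can be read as output-side word embeddings. Below is a hedged PyTorch sketch of where that matrix sits, including the option of tying it to the input embedding; the tiny model itself is an assumption for illustration, not the architecture studied.

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Minimal language model; illustrative only."""
    def __init__(self, vocab_size=10_000, d=256, tie_weights=True):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d)        # input embedding
        self.rnn = nn.LSTM(d, d, batch_first=True)
        # The "topmost" weight matrix: hidden state -> vocabulary logits.
        self.out = nn.Linear(d, vocab_size, bias=False)
        if tie_weights:
            # Reuse the input embedding as the output projection, so a
            # single (vocab_size, d) matrix serves both roles.
            self.out.weight = self.embed.weight

    def forward(self, tokens):            # tokens: (batch, seq) of token ids
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)                # (batch, seq, vocab) logits
```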
We describe EmoBank, a corpus of 10k English sentences balancing multiple genres, which we annotated with dimensional emotion metadata in the Valence-Arousal-Dominance (VAD) representation format.
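For concreteness, dimensional VAD annotation attaches three real-valued scores to each sentence rather than a categorical emotion label. The record below sketches that shape; the field names, values, and scale are assumptions for illustration, not EmoBank's actual schema.

```python
# Illustrative VAD-annotated record; names, values, and scale are assumed.
example_record = {
    "sentence": "The whole experience was wonderful.",
    "valence":   4.4,  # pleasant (high) vs. unpleasant (low)
    "arousal":   3.1,  # activated (high) vs. calm (low)
    "dominance": 3.3,  # in control (high) vs. controlled (low)
}
```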
In this work, we investigate several neural network architectures for fine-grained entity type classification.
Our goal is to combine the rich multistep inference of symbolic logical reasoning with the generalization capabilities of neural networks.
We present a new parallel corpus, the JHU FLuency-Extended GUG corpus (JFLEG), for developing and evaluating grammatical error correction (GEC).