This paper describes AllenNLP, a platform for research on deep learning methods in natural language understanding.
Experiments show that human performance is well above current state-of-the-art baseline systems, leaving plenty of room for the community to make improvements.
We introduce Baseline: a library for reproducible deep learning research and fast model development for NLP.
This paper introduces zero-shot dialog generation (ZSDG) as a step towards neural dialog systems that can instantly generalize to new situations with minimal data.
Deep learning models are often not easily adaptable to new tasks and require task-specific adjustments.
We propose a random walk model that is robust to this confound, where the probability of word generation is inversely related to the angular distance between the word and sentence embeddings.
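One way the stated inverse relation could be sketched: score each vocabulary word by the reciprocal of its angular distance to the sentence embedding, then normalize into a distribution. This is a minimal illustrative form, not the paper's exact model; the function names (`angular_distance`, `generation_probs`) and the specific `1/(distance + eps)` scoring are assumptions made for the sketch.

```python
import numpy as np

def angular_distance(u, v):
    """Angle between two vectors, in radians (0 = identical direction)."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def generation_probs(sentence_emb, word_embs, eps=1e-6):
    """Unnormalized word scores inversely related to angular distance
    from the sentence embedding, normalized into a distribution.
    Illustrative form only (hypothetical, not from the paper)."""
    dists = np.array([angular_distance(sentence_emb, w) for w in word_embs])
    scores = 1.0 / (dists + eps)   # smaller angle -> higher probability
    return scores / scores.sum()   # normalize to sum to 1

# Toy example: one 2-D sentence vector, three word vectors.
sent = np.array([1.0, 0.0])
words = np.array([[1.0, 0.1],    # nearly aligned with the sentence
                  [0.0, 1.0],    # orthogonal
                  [-1.0, 0.0]])  # opposite direction
p = generation_probs(sent, words)
```

As expected under this scoring, the nearly aligned word gets the highest probability and the opposite-direction word the lowest.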
Successful evidence-based medicine (EBM) applications rely on answering clinical questions by analyzing large medical literature databases.
We present a novel approach to learn representations for sentence-level semantic similarity using conversational data.