Experiments show that human performance is well above that of current state-of-the-art baseline systems, leaving ample room for the community to make improvements.
We introduce Baseline: a library for reproducible deep learning research and fast model development for NLP.
This paper introduces zero-shot dialog generation (ZSDG) as a step towards neural dialog systems that can instantly generalize to new situations with minimal data.
Deep learning models are often not easily adaptable to new tasks and require task-specific adjustments.
We propose a random walk model that is robust to this confound, where the probability of word generation is inversely related to the angular distance between the word and sentence embeddings.
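A minimal sketch of that idea follows, assuming a simple 1/distance weighting and toy random vectors; the function names and the exact form of the inverse relation are illustrative, not the paper's model.

    import numpy as np

    def angular_distance(u, v):
        # Angular distance in [0, pi] between two vectors.
        cos_sim = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.arccos(np.clip(cos_sim, -1.0, 1.0))

    def generation_probs(word_vecs, sent_vec):
        # Score each word inversely to its angular distance from the
        # sentence embedding (illustrative choice), then normalize.
        dists = np.array([angular_distance(w, sent_vec) for w in word_vecs])
        scores = 1.0 / (dists + 1e-8)    # smaller angle -> higher probability
        return scores / scores.sum()     # turn scores into a distribution

    # Toy usage: five random "word" vectors and one "sentence" vector in R^50.
    rng = np.random.default_rng(0)
    print(generation_probs(rng.normal(size=(5, 50)), rng.normal(size=50)))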
Various common deep learning architectures, such as LSTMs, GRUs, ResNets, and Highway Networks, employ state passthrough connections that support training with high feed-forward depth or recurrence over many time steps.
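As a rough illustration of such a passthrough connection (a highway-style gate in this sketch; the class name and dimensions are assumptions, not taken from any of the cited architectures), consider:

    import torch
    import torch.nn as nn

    class HighwayLayer(nn.Module):
        # A gate decides how much of the input state passes through
        # unchanged versus being replaced by a transformed candidate.
        def __init__(self, dim):
            super().__init__()
            self.transform = nn.Linear(dim, dim)
            self.gate = nn.Linear(dim, dim)

        def forward(self, x):
            h = torch.relu(self.transform(x))   # candidate new state
            t = torch.sigmoid(self.gate(x))     # transform gate in (0, 1)
            return t * h + (1.0 - t) * x        # passthrough keeps gradients flowing

    x = torch.randn(4, 128)
    print(HighwayLayer(128)(x).shape)  # torch.Size([4, 128])

The (1 - t) * x term is the state passthrough: when the gate closes, the input is carried forward unchanged, which is what eases optimization at high depth or over many time steps.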
Successful evidence-based medicine (EBM) applications rely on answering clinical questions by analyzing large medical literature databases.
Learning representations for knowledge base entities and concepts is becoming increasingly important for NLP applications.
To date, there has been very little work on assessing discourse coherence methods on real-world data.