Transformer-based language models (LMs) pretrained on large text collections have been shown to store a wealth of semantic knowledge.
We introduce Span-ConveRT, a light-weight model for dialog slot-filling that frames the task as turn-based span extraction.
We present PolyResponse, a conversational search engine that supports task-oriented dialogue.
Despite their popularity in the chatbot literature, retrieval-based models have had modest impact on task-oriented dialogue systems; the main obstacle to their application is the low-data regime of most task-oriented dialogue tasks.
Progress in machine learning is often driven by the availability of large datasets and consistent evaluation metrics for comparing modeling approaches.
We propose a new attention mechanism for neural question answering that attends over varying granularities of the input.