88 papers with code • 9 benchmarks • 12 datasets
Dialogue Generation is a fundamental component of real-world virtual assistants such as Siri and Alexa. It is the text generation task of automatically producing a response to a post by the user.
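The task interface can be sketched as a function from a user post to a generated response. The rule table below is a hypothetical stand-in for a learned sequence-to-sequence model; it only illustrates the input/output contract, not any real system.

```python
# Minimal sketch of the dialogue-generation interface: map a user post to a
# response. A real system would replace the rule table with a trained
# generative model; this toy version is purely illustrative.
def generate_response(post: str) -> str:
    """Return a response for the given user post."""
    rules = {
        "hello": "Hi there! How can I help you today?",
        "weather": "I can't check live weather, but I hope it's sunny!",
    }
    post_lower = post.lower()
    for keyword, response in rules.items():
        if keyword in post_lower:
            return response
    # Fallback when no rule matches, mimicking a generic model response.
    return "Could you tell me more about that?"

print(generate_response("Hello, assistant!"))
```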
In open-domain dialogue, intelligent agents should exhibit the use of knowledge, yet there are few convincing demonstrations of this to date.
One challenge for dialogue agents is recognizing feelings in the conversation partner and replying accordingly, a key communicative skill.
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality and are often not very captivating.
Ranked #4 on Dialogue Generation on Persona-Chat
Current pre-training work in natural language generation pays little attention to the problem of exposure bias on downstream tasks.
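Exposure bias arises because a generator is trained with teacher forcing (conditioning on gold prefixes) but decodes at test time on its own outputs. A toy next-token model, trained here on a single sentence, makes the gap concrete; the bigram "model" is an illustrative assumption, not any paper's method.

```python
# Sketch of exposure bias with a toy deterministic next-token model.
# Training-style (teacher-forced) predictions condition on gold tokens;
# free-running decoding feeds each prediction back in, so one early
# mistake can push generation off the training distribution.
from collections import defaultdict

corpus = "the cat sat on the red mat".split()
bigram = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram[prev].append(nxt)

def predict(prev):
    # Deterministic toy model: always the first continuation seen in training.
    return bigram[prev][0] if bigram[prev] else "<eos>"

# Teacher forcing: every prediction is conditioned on the gold prefix.
teacher_forced = [predict(tok) for tok in corpus[:-1]]

# Free-running: each prediction becomes the next input; errors compound.
free_run, tok = [], corpus[0]
for _ in range(len(corpus) - 1):
    tok = predict(tok)
    free_run.append(tok)

print(teacher_forced)  # ['cat', 'sat', 'on', 'the', 'cat', 'mat']
print(free_run)        # ['cat', 'sat', 'on', 'the', 'cat', 'sat']
```

With gold conditioning the model recovers after its mistake on the second "the"; left to its own outputs, it loops and never reaches "mat".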
Ranked #1 on Text Summarization on GigaWord-10k (using extra training data)
Pre-trained models have proved effective for a wide range of natural language processing tasks.
We introduce a new approach to generative data-driven dialogue systems (e.g., chatbots) called TransferTransfo, which combines a transfer-learning-based training scheme with a high-capacity Transformer model.
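Persona-conditioned models of this kind typically flatten persona sentences, dialogue history, and the candidate reply into one token sequence for the Transformer. The sketch below shows that concatenation pattern; the delimiter tokens (`<bos>`, `<speaker1>`, ...) are illustrative placeholders, not the paper's exact vocabulary.

```python
# Sketch of building a single input sequence from persona, history, and
# reply, in the style of persona-conditioned Transformer dialogue models.
# Special-token names here are hypothetical.
def build_input(persona, history, reply):
    tokens = ["<bos>"]
    for sentence in persona:
        tokens += sentence.split()
    # Alternate speaker tokens over the dialogue turns, reply last.
    for i, utterance in enumerate(history + [reply]):
        speaker = "<speaker1>" if i % 2 == 0 else "<speaker2>"
        tokens += [speaker] + utterance.split()
    return tokens + ["<eos>"]

seq = build_input(
    persona=["i like cats ."],
    history=["hi , how are you ?"],
    reply="good , i just fed my cat .",
)
print(seq)
```

The model is then trained to predict the reply tokens given everything to their left, so persona and history are attended to as ordinary context.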
Ranked #2 on Dialogue Generation on Persona-Chat
However, previous work in dialogue response generation has shown that automatic evaluation metrics do not correlate strongly with human judgment in the non-task-oriented dialogue setting.
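One reason for the weak correlation is that common metrics reward word overlap with a single reference. A toy unigram precision (a simplified, BLEU-like overlap score, used here only as an illustration) shows the failure mode: a perfectly sensible reply sharing no words with the reference scores zero.

```python
# Toy unigram-overlap precision in the spirit of word-overlap metrics.
# A reasonable open-domain reply can score 0 against a single reference,
# which is one reason overlap metrics disagree with human judgment.
from collections import Counter

def unigram_precision(candidate: str, reference: str) -> float:
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    # Clipped overlap: each candidate word counts at most as often as
    # it appears in the reference.
    overlap = sum(min(n, ref[w]) for w, n in cand.items())
    return overlap / max(sum(cand.values()), 1)

reference = "i am fine thank you"
print(unigram_precision("i am doing great thanks", reference))      # 0.4
print(unigram_precision("pretty good thanks for asking", reference))  # 0.0
```

Both candidates are acceptable answers to "how are you?", yet the second gets the same score as gibberish would.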