Dialogue Generation is a fundamental component of real-world virtual assistants such as Siri and Alexa. It is the text generation task of automatically generating a response given a post from the user.
Source: Adversarial Attacks on Deep Learning Models in Natural Language Processing: A Survey
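As a minimal illustration of the task, the sketch below generates a response to a single user post with an off-the-shelf conversational language model. It assumes the Hugging Face transformers library and the microsoft/DialoGPT-small checkpoint; any other pre-trained conversation model could be substituted.

```python
# Minimal sketch of single-turn dialogue generation: given a user post,
# produce a response with a pre-trained conversational language model.
# Assumes the Hugging Face `transformers` library and the
# `microsoft/DialoGPT-small` checkpoint (illustrative choices).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

post = "I just adopted a puppy and I have no idea what to feed him."
# DialoGPT separates dialogue turns with the end-of-sequence token.
input_ids = tokenizer.encode(post + tokenizer.eos_token, return_tensors="pt")

# Sample a response conditioned on the post.
output_ids = model.generate(
    input_ids,
    max_length=100,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
response = tokenizer.decode(
    output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True
)
print(response)
```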
Neural machine translation is a recently proposed approach to machine translation.
Ranked #3 on Dialogue Generation on Persona-Chat
Open-domain conversation models have become good at generating natural-sounding dialogue, using very large architectures with billions of trainable parameters.
One challenge for dialogue agents is recognizing feelings in the conversation partner and replying accordingly, a key communicative skill.
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality and are often not very captivating.
Ranked #4 on Dialogue Generation on Persona-Chat
Current pre-training work in natural language generation pays little attention to the problem of exposure bias on downstream tasks.
Ranked #1 on Text Summarization on GigaWord-10k (using extra training data)
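The sketch below illustrates the mismatch behind exposure bias: during training the decoder is conditioned on ground-truth prefixes (teacher forcing), while at inference it must condition on its own predictions. The toy GRU decoder and vocabulary size are illustrative assumptions, not the architecture used in the paper.

```python
# Minimal sketch contrasting teacher forcing (training) with free-running
# decoding (inference), the mismatch behind exposure bias. The toy decoder
# and vocabulary size are illustrative assumptions only.
import torch
import torch.nn as nn

VOCAB, HIDDEN = 100, 32

class ToyDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.GRUCell(HIDDEN, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def step(self, token, state):
        state = self.rnn(self.embed(token), state)
        return self.out(state), state

decoder = ToyDecoder()
target = torch.randint(0, VOCAB, (1, 8))   # a reference sequence
state = torch.zeros(1, HIDDEN)

# Training with teacher forcing: each step is conditioned on the
# ground-truth previous token, so errors never compound.
loss = 0.0
for t in range(target.size(1) - 1):
    logits, state = decoder.step(target[:, t], state)
    loss = loss + nn.functional.cross_entropy(logits, target[:, t + 1])

# Free-running inference: each step is conditioned on the model's own
# previous prediction, a distribution it never saw during training.
token, state = target[:, 0], torch.zeros(1, HIDDEN)
generated = [token]
for _ in range(7):
    logits, state = decoder.step(token, state)
    token = logits.argmax(dim=-1)
    generated.append(token)
```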
Tasks: Abstractive Text Summarization, Dialogue Generation, Generative Question Answering, Question Generation
The ability of a machine to communicate with humans has long been associated with the general success of AI.
We introduce a new approach to generative data-driven dialogue systems (e.g., chatbots) called TransferTransfo, which combines a transfer-learning-based training scheme with a high-capacity Transformer model.
Ranked #2 on Dialogue Generation on Persona-Chat
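A rough sketch of the kind of fine-tuning TransferTransfo describes: a pre-trained Transformer language model is adapted to persona-conditioned dialogue by concatenating persona, history, and reply into one sequence and computing the language-modeling loss on the reply tokens only. The gpt2 checkpoint, separator scheme, and single example below are assumptions for illustration, not the authors' exact setup.

```python
# Rough sketch of TransferTransfo-style fine-tuning on persona-conditioned
# dialogue. The checkpoint, separators, and example are illustrative
# assumptions, not the paper's exact input representation.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

persona = "i love hiking. i have two dogs."
history = "hi! what do you do for fun?"
reply = "i spend most weekends hiking with my two dogs."

context_ids = tokenizer.encode(persona + " " + history + " ")
reply_ids = tokenizer.encode(reply) + [tokenizer.eos_token_id]

input_ids = torch.tensor([context_ids + reply_ids])
# Mask the context so the loss is computed on the reply tokens only.
labels = torch.tensor([[-100] * len(context_ids) + reply_ids])

loss = model(input_ids, labels=labels).loss
loss.backward()  # one gradient step of an otherwise standard fine-tuning loop
```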
However, previous work in dialogue response generation has shown that automatic evaluation metrics do not correlate strongly with human judgment in the non-task-oriented dialogue setting.
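A small example of why word-overlap metrics can disagree with human judgment in open-ended dialogue: a perfectly reasonable response that shares few words with the single reference receives a near-zero score. The snippet assumes NLTK's sentence-level BLEU; the example responses are made up.

```python
# Illustration: an acceptable open-domain reply can score near zero under
# a word-overlap metric (sentence-level BLEU) against a single reference.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "i am doing great , thanks for asking .".split()
acceptable = "pretty good , how about you ?".split()

smooth = SmoothingFunction().method1
score = sentence_bleu([reference], acceptable, smoothing_function=smooth)
print(f"BLEU = {score:.3f}")  # low, despite being a reasonable reply
```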
Pre-trained models have proved effective for a wide range of natural language processing tasks.
The cleaned dataset and the pre-trained models will facilitate research on short-text conversation modeling.