Dialogue generation is a fundamental component of real-world virtual assistants such as Siri and Alexa. It is the text generation task of automatically generating a response to a post from the user.
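A minimal sketch of the task's input/output contract: a function that maps a user post to a response. Real systems use learned neural generators; the rule table and fallback reply below are purely hypothetical, chosen only to illustrate the interface.

```python
# Toy post -> response mapping standing in for a neural dialogue model.
# All rules here are made-up examples, not part of any real system.
RULES = {
    "how are you": "I'm doing well, thanks for asking!",
    "what is your name": "I'm a demo assistant.",
}

def generate_response(post: str) -> str:
    key = post.lower().strip("?!. ")
    # Generic fallback when no rule matches, mimicking the "lack of
    # specificity" problem often noted for chit-chat models.
    return RULES.get(key, "Interesting, tell me more.")

print(generate_response("How are you?"))
```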
One challenge for dialogue agents is recognizing feelings in the conversation partner and replying accordingly, a key communicative skill.
Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality and are often not very captivating.
Ranked #4 on Dialogue Generation on Persona-Chat
Current pre-training works in natural language generation pay little attention to the problem of exposure bias on downstream tasks.
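Exposure bias arises because a decoder is trained with teacher forcing (conditioning on gold tokens) but generates at inference time by conditioning on its own, possibly wrong, previous outputs. The toy bigram "model" below is entirely made up; it only illustrates how a single early error compounds once the model feeds on its own predictions.

```python
import random

def next_token_model(prev_token):
    # Toy deterministic bigram lookup standing in for a neural decoder.
    # Note the deliberate error: after "hello" it predicts "world",
    # while the gold continuation is "there".
    table = {"<s>": "hello", "hello": "world", "there": "friend", "friend": "</s>"}
    return table.get(prev_token, "<unk>")

def decode(gold, sampling_prob, rng):
    """Generate a sequence. With probability `sampling_prob`, condition on the
    model's own previous prediction instead of the gold token."""
    prev, out = "<s>", []
    for gold_tok in gold:
        pred = next_token_model(prev)
        out.append(pred)
        prev = pred if rng.random() < sampling_prob else gold_tok
    return out

gold = ["hello", "there", "friend", "</s>"]
# Teacher forcing: one wrong token, but the gold context keeps decoding on track.
print(decode(gold, 0.0, random.Random(0)))
# Free-running (inference-time conditions): the same error derails everything after it.
print(decode(gold, 1.0, random.Random(0)))
```

Under teacher forcing the single mistake stays local; in free-running mode the wrong token becomes the next step's context and errors cascade, which is the train/inference mismatch these pre-training methods aim to address.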
Ranked #1 on Text Summarization on GigaWord-10k (using extra training data)
We introduce a new approach to generative data-driven dialogue systems (e.g. chatbots) called TransferTransfo, which combines a transfer-learning-based training scheme with a high-capacity Transformer model.
Ranked #2 on Dialogue Generation on Persona-Chat
However, previous work in dialogue response generation has shown that standard automatic evaluation metrics do not correlate strongly with human judgment in the non-task-oriented dialogue setting.
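To see why word-overlap metrics struggle here, consider a simplified BLEU-style unigram precision (a toy metric written for illustration, not the exact metric used in any particular paper). Dialogue admits many valid responses, so a perfectly reasonable reply can share no words with the single reference and score zero.

```python
def unigram_precision(candidate: str, reference: str) -> float:
    """Toy BLEU-1-style overlap: fraction of candidate tokens found in the reference."""
    cand = candidate.lower().split()
    ref = set(reference.lower().split())
    return sum(tok in ref for tok in cand) / len(cand)

reference = "i am doing well thanks for asking"
# Two equally adequate replies to "How are you?", scored very differently:
print(unigram_precision("i am doing well", reference))
print(unigram_precision("pretty good how about you", reference))
```

The first reply scores 1.0 and the second 0.0, even though a human would rate both as acceptable, which is the mismatch with human judgment described above.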
Pre-trained models have proven effective for a wide range of natural language processing tasks.
The cleaned dataset and the pre-trained models will facilitate research on short-text conversation modeling.