Conversational Response Generation
17 papers with code • 0 benchmarks • 6 datasets
Given an input conversation, generate a natural-looking text reply to the last conversation element.
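As a minimal sketch of the task, the snippet below generates a reply to a single-turn conversation using the pre-trained DialoGPT model via the Hugging Face `transformers` library (the model name and generation parameters are illustrative choices, not part of this page):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a pre-trained conversational response generation model.
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# Encode the last conversation element, terminated by the end-of-sequence token.
context = "Hello, how are you?"
input_ids = tokenizer.encode(context + tokenizer.eos_token, return_tensors="pt")

# Generate a continuation and decode only the newly produced tokens as the reply.
output_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```

Multi-turn conversations are handled the same way, by concatenating each utterance followed by `tokenizer.eos_token` into a single context string.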
Image credit: DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
Benchmarks
These leaderboards are used to track progress in Conversational Response Generation.
Most implemented papers
A Diversity-Promoting Objective Function for Neural Conversation Models
Sequence-to-sequence neural network models for generation of conversational responses tend to generate safe, commonplace responses (e.g., "I don't know") regardless of the input.
MASS: Masked Sequence to Sequence Pre-training for Language Generation
Pre-training and fine-tuning, e.g., with BERT, have achieved great success in language understanding by transferring knowledge from a rich-resource pre-training task to low/zero-resource downstream tasks.
DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
We present a large, tunable neural conversational response generation model, DialoGPT (dialogue generative pre-trained transformer).
Generating Informative and Diverse Conversational Responses via Adversarial Information Maximization
Responses generated by neural conversational models tend to lack informativeness and diversity.
PALM: Pre-training an Autoencoding&Autoregressive Language Model for Context-conditioned Generation
An extensive set of experiments show that PALM achieves new state-of-the-art results on a variety of language generation benchmarks covering generative question answering (Rank 1 on the official MARCO leaderboard), abstractive summarization on CNN/DailyMail as well as Gigaword, question generation on SQuAD, and conversational response generation on Cornell Movie Dialogues.
Learning to Abstract for Memory-augmented Conversational Response Generation
In this work, we propose a memory-augmented generative model, which learns to abstract from the training corpus and saves the useful information to the memory to assist the response generation.
Conversations with Search Engines: SERP-based Conversational Response Generation
In this paper, we address the problem of answering complex information needs by conducting conversations with search engines, in the sense that users can express their queries in natural language and directly receive the information they need from a short system response in a conversational manner.
PEDNet: A Persona Enhanced Dual Alternating Learning Network for Conversational Response Generation
However, generating personalized responses is still a challenging task since the leverage of predefined persona information is often insufficient.
DialogBERT: Discourse-Aware Response Generation via Learning to Recover and Rank Utterances
Recent advances in pre-trained language models have significantly improved neural response generation.