Search Results for author: Chongyang Tao

Found 36 papers, 14 papers with code

There Are a Thousand Hamlets in a Thousand People’s Eyes: Enhancing Knowledge-grounded Dialogue with Personal Memory

no code implementations ACL 2022 Tingchen Fu, Xueliang Zhao, Chongyang Tao, Ji-Rong Wen, Rui Yan

Knowledge-grounded conversation (KGC) shows great potential in building an engaging and knowledgeable chatbot, and knowledge selection is one of its key ingredients.

Chatbot

ProphetChat: Enhancing Dialogue Generation with Simulation of Future Conversation

no code implementations ACL 2022 Chang Liu, Xu Tan, Chongyang Tao, Zhenxin Fu, Dongyan Zhao, Tie-Yan Liu, Rui Yan

To enable the chatbot to foresee the dialogue future, we design a beam-search-like roll-out strategy for dialogue future simulation using a typical dialogue generation model and a dialogue selector.

Dialogue Generation • Response Generation
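For illustration only: the ProphetChat entry above mentions a beam-search-like roll-out that simulates possible dialogue futures with a generation model and a selector. The sketch below is a generic version of such a roll-out loop under assumed interfaces (the callables generate_candidates and score_future are hypothetical placeholders), not the authors' implementation.

```python
from typing import Callable, List, Tuple

def rollout_futures(
    context: List[str],
    generate_candidates: Callable[[List[str], int], List[str]],  # hypothetical generator interface
    score_future: Callable[[List[str]], float],                   # hypothetical selector interface
    beam_width: int = 3,
    depth: int = 2,
) -> Tuple[str, float]:
    """Simulate dialogue futures with a beam-search-like roll-out and
    return the first-step response whose simulated future scores highest."""
    # Each beam item: (dialogue so far, first response taken, cumulative score)
    beams = [(context + [r], r, 0.0) for r in generate_candidates(context, beam_width)]
    for _ in range(depth - 1):
        expanded = []
        for history, first, score in beams:
            for nxt in generate_candidates(history, beam_width):
                new_history = history + [nxt]
                expanded.append((new_history, first, score + score_future(new_history)))
        # Keep only the top-scoring partial futures (beam pruning).
        beams = sorted(expanded, key=lambda b: b[2], reverse=True)[:beam_width]
    _, best_first, best_score = max(beams, key=lambda b: b[2])
    return best_first, best_score

# Toy usage with stubs standing in for a real generator and selector.
if __name__ == "__main__":
    gen = lambda history, k: [f"reply-{len(history)}-{i}" for i in range(k)]
    sel = lambda history: -float(len(history[-1]))  # arbitrary stub scoring rule
    print(rollout_futures(["hi", "hello"], gen, sel))
```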

Learning to Express in Knowledge-Grounded Conversation

no code implementations NeurIPS 2021 Xueliang Zhao, Tingchen Fu, Chongyang Tao, Wei Wu, Dongyan Zhao, Rui Yan

Grounding dialogue generation in extra knowledge has shown great potential for building a system capable of replying with knowledgeable and engaging responses.

Dialogue Generation

HeterMPC: A Heterogeneous Graph Neural Network for Response Generation in Multi-Party Conversations

1 code implementation ACL 2022 Jia-Chen Gu, Chao-Hong Tan, Chongyang Tao, Zhen-Hua Ling, Huang Hu, Xiubo Geng, Daxin Jiang

To address these challenges, we present HeterMPC, a heterogeneous graph-based neural network for response generation in MPCs, which models the semantics of utterances and interlocutors simultaneously with two types of nodes in a graph.

Response Generation
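As a rough illustration of modelling utterance and interlocutor nodes jointly, the sketch below runs one round of message passing over a toy bipartite graph with type-specific transforms and random weights; it is an assumption-laden simplification, not the HeterMPC architecture itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-party conversation: 4 utterance nodes, 2 interlocutor (speaker) nodes.
num_utt, num_spk, dim = 4, 2, 8
utt_feats = rng.normal(size=(num_utt, dim))   # e.g., encoded utterances
spk_feats = rng.normal(size=(num_spk, dim))   # e.g., speaker embeddings

# Bipartite edges: speaker_of[i] = index of the interlocutor who said utterance i.
speaker_of = np.array([0, 1, 0, 1])
utt_to_spk = np.zeros((num_utt, num_spk))
utt_to_spk[np.arange(num_utt), speaker_of] = 1.0     # utterance -> its speaker
spk_to_utt = utt_to_spk.T                            # speaker   -> its utterances

# Type-specific transforms (randomly initialised for the sketch).
W_spk_to_utt = rng.normal(scale=0.1, size=(dim, dim))
W_utt_to_spk = rng.normal(scale=0.1, size=(dim, dim))

def normalize_rows(a):
    return a / np.clip(a.sum(axis=1, keepdims=True), 1e-9, None)

# One round of heterogeneous message passing: each node aggregates messages
# from neighbours of the other type, then applies a nonlinearity.
utt_msgs = normalize_rows(utt_to_spk) @ spk_feats @ W_spk_to_utt
spk_msgs = normalize_rows(spk_to_utt) @ utt_feats @ W_utt_to_spk
utt_feats = np.tanh(utt_feats + utt_msgs)
spk_feats = np.tanh(spk_feats + spk_msgs)

print(utt_feats.shape, spk_feats.shape)  # (4, 8) (2, 8)
```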

PCL: Peer-Contrastive Learning with Diverse Augmentations for Unsupervised Sentence Embeddings

no code implementations 28 Jan 2022 Qiyu Wu, Chongyang Tao, Tao Shen, Can Xu, Xiubo Geng, Daxin Jiang

A straightforward solution is to resort to more diverse positives produced by a multi-augmenting strategy, but it remains an open question how to learn, without supervision, from diverse positives of uneven augmentation quality in the text domain.

Contrastive Learning • Sentence Embeddings
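For context, the sketch below computes a generic InfoNCE-style contrastive loss that admits several positives per anchor (e.g., multiple augmented views of the same sentence). It is a simplified stand-in for learning from diverse positives, not the peer-contrastive objective proposed in the paper.

```python
import numpy as np

def multi_positive_info_nce(anchors, candidates, positive_mask, temperature=0.05):
    """Generic InfoNCE-style loss with several positives per anchor.

    anchors:        (N, d) anchor sentence embeddings
    candidates:     (M, d) embeddings of augmented views (positives and negatives)
    positive_mask:  (N, M) boolean, True where candidate j is a positive for anchor i
    """
    # Cosine similarities scaled by a temperature.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    c = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    logits = a @ c.T / temperature                       # (N, M)

    # log-sum-exp over all candidates (denominator) and over positives only (numerator).
    def logsumexp(x, mask=None):
        if mask is not None:
            x = np.where(mask, x, -np.inf)
        m = x.max(axis=1, keepdims=True)
        return (m + np.log(np.exp(x - m).sum(axis=1, keepdims=True))).squeeze(1)

    loss = logsumexp(logits) - logsumexp(logits, positive_mask)
    return loss.mean()

# Toy usage: 3 anchors, 6 candidate views, 2 positives each.
rng = np.random.default_rng(0)
anchors = rng.normal(size=(3, 16))
candidates = rng.normal(size=(6, 16))
mask = np.zeros((3, 6), dtype=bool)
mask[np.arange(3)[:, None], np.array([[0, 1], [2, 3], [4, 5]])] = True
print(multi_positive_info_nce(anchors, candidates, mask))
```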

Building an Efficient and Effective Retrieval-based Dialogue System via Mutual Learning

no code implementations 1 Oct 2021 Chongyang Tao, Jiazhan Feng, Chang Liu, Juntao Li, Xiubo Geng, Daxin Jiang

For this task, the adoption of pre-trained language models (such as BERT) has led to remarkable progress in a number of benchmarks.

Re-Ranking

Response Ranking with Multi-types of Deep Interactive Representations in Retrieval-based Dialogues

1 code implementation ACM Transactions on Information Systems 2021 Ruijian Xu, Chongyang Tao, Jiazhan Feng, Wei Wu, Rui Yan, Dongyan Zhao

To tackle these challenges, we propose a representation-interaction-matching framework that explores multiple types of deep interactive representations to build context-response matching models for response selection.

Conversational Response Selection

A Pre-training Strategy for Zero-Resource Response Selection in Knowledge-Grounded Conversations

no code implementations ACL 2021 Chongyang Tao, Changyu Chen, Jiazhan Feng, Ji-Rong Wen, Rui Yan

Recently, many studies have emerged on building retrieval-based dialogue systems that can effectively leverage background knowledge (e.g., documents) when conversing with humans.

Language Modelling

Neural Rule-Execution Tracking Machine For Transformer-Based Text Generation

no code implementations NeurIPS 2021 YuFei Wang, Can Xu, Huang Hu, Chongyang Tao, Stephen Wan, Mark Dras, Mark Johnson, Daxin Jiang

Sequence-to-Sequence (S2S) neural text generation models, especially the pre-trained ones (e.g., BART and T5), have exhibited compelling performance on various natural language generation tasks.

Text Generation

MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding

1 code implementation ACL 2021 Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Xiubo Geng, Daxin Jiang

Recently, various neural models for multi-party conversation (MPC) have achieved impressive improvements on a variety of tasks such as addressee recognition, speaker identification and response prediction.

Language Modelling • Speaker Identification

Learning to Organize a Bag of Words into Sentences with Neural Networks: An Empirical Study

no code implementations NAACL 2021 Chongyang Tao, Shen Gao, Juntao Li, Yansong Feng, Dongyan Zhao, Rui Yan

Sequential information, a.k.a. order, is assumed to be essential for processing a sequence with recurrent neural network or convolutional neural network based encoders.

Maria: A Visual Experience Powered Conversational Agent

1 code implementation ACL 2021 Zujie Liang, Huang Hu, Can Xu, Chongyang Tao, Xiubo Geng, Yining Chen, Fan Liang, Daxin Jiang

The retriever aims to retrieve from an image index an image correlated with the dialogue, while the visual concept detector extracts rich visual knowledge from the image.

A Benchmarking on Cloud based Speech-To-Text Services for French Speech and Background Noise Effect

no code implementations 7 May 2021 Binbin Xu, Chongyang Tao, Zidu Feng, Youssef Raqui, Sylvie Ranwez

This study presents a large-scale benchmark of cloud-based Speech-To-Text systems: Google Cloud Speech-To-Text, Microsoft Azure Cognitive Services, Amazon Transcribe, and IBM Watson Speech to Text.

Dialogue History Matters! Personalized Response Selection in Multi-turn Retrieval-based Chatbots

no code implementations 17 Mar 2021 Juntao Li, Chang Liu, Chongyang Tao, Zhangming Chan, Dongyan Zhao, Min Zhang, Rui Yan

To fill the gap between these up-to-date methods and real-world applications, we incorporate user-specific dialogue history into response selection and propose a personalized hybrid matching network (PHMN).

Representation Learning

Learning an Effective Context-Response Matching Model with Self-Supervised Tasks for Retrieval-based Dialogues

no code implementations 14 Sep 2020 Ruijian Xu, Chongyang Tao, Daxin Jiang, Xueliang Zhao, Dongyan Zhao, Rui Yan

To address these issues, in this paper, we propose learning a context-response matching model with auxiliary self-supervised tasks designed for the dialogue data based on pre-trained language models.

Conversational Response Selection

Zero-Resource Knowledge-Grounded Dialogue Generation

1 code implementation NeurIPS 2020 Linxiao Li, Can Xu, Wei Wu, Yufan Zhao, Xueliang Zhao, Chongyang Tao

While neural conversation models have shown great potential for generating informative and engaging responses by introducing external knowledge, learning such a model often requires knowledge-grounded dialogues that are difficult to obtain.

Dialogue Generation

EnsembleGAN: Adversarial Learning for Retrieval-Generation Ensemble Model on Short-Text Conversation

no code implementations 30 Apr 2020 Jiayi Zhang, Chongyang Tao, Zhenjing Xu, Qiaojing Xie, Wei Chen, Rui Yan

Aiming to generate responses that approximate the ground truth and receive high ranking scores from the discriminator, the two generators learn to produce improved, highly relevant responses and competitive unobserved candidates, respectively, while the discriminative ranker is trained to distinguish true responses from adversarial ones, thus combining the merits of both generators.

Language Modelling • Short-Text Conversation

Low-Resource Knowledge-Grounded Dialogue Generation

no code implementations ICLR 2020 Xueliang Zhao, Wei Wu, Chongyang Tao, Can Xu, Dongyan Zhao, Rui Yan

In such a low-resource setting, we devise a disentangled response decoder in order to isolate parameters that depend on knowledge-grounded dialogues from the entire generation model.

Dialogue Generation • Response Generation

Mimicking Human Process: Text Representation via Latent Semantic Clustering for Classification

no code implementations 18 Jun 2019 Xiaoye Tan, Rui Yan, Chongyang Tao, Mingrui Wu

Considering that words with different characteristics in a text have different importance for classification, grouping them into separate clusters can strengthen the semantic expression of each part.

Classification • General Classification

A Document-grounded Matching Network for Response Selection in Retrieval-based Chatbots

no code implementations 11 Jun 2019 Xueliang Zhao, Chongyang Tao, Wei Wu, Can Xu, Dongyan Zhao, Rui Yan

We present a document-grounded matching network (DGMN) for response selection that can power a knowledge-aware retrieval-based chatbot system.

Chatbot

Iterative Document Representation Learning Towards Summarization with Polishing

1 code implementation EMNLP 2018 Xiuying Chen, Shen Gao, Chongyang Tao, Yan Song, Dongyan Zhao, Rui Yan

In this paper, we introduce Iterative Text Summarization (ITS), an iteration-based model for supervised extractive text summarization, inspired by the observation that it is often necessary for a human to read an article multiple times in order to fully understand and summarize its contents.

Extractive Text Summarization • Representation Learning

Improving Matching Models with Hierarchical Contextualized Representations for Multi-turn Response Selection

no code implementations 22 Aug 2018 Chongyang Tao, Wei Wu, Can Xu, Yansong Feng, Dongyan Zhao, Rui Yan

In this paper, we study context-response matching with pre-trained contextualized representations for multi-turn response selection in retrieval-based chatbots.

Dialogue Generation

Tree2Tree Learning with Memory Unit

no code implementations ICLR 2018 Ning Miao, Hengliang Wang, Ran Le, Chongyang Tao, Mingyue Shang, Rui Yan, Dongyan Zhao

Traditional recurrent neural network (RNN) or convolutional neural network (CNN) based sequence-to-sequence models cannot handle tree-structured data well.

Machine Translation • Translation

RUBER: An Unsupervised Method for Automatic Evaluation of Open-Domain Dialog Systems

1 code implementation 11 Jan 2017 Chongyang Tao, Lili Mou, Dongyan Zhao, Rui Yan

Open-domain human-computer conversation has been attracting increasing attention over the past few years.

Dialogue Evaluation
