Large-scale conversation models increasingly leverage external knowledge to improve factual accuracy in response generation.
Many open-domain dialogue models pre-trained on social media comments can generate coherent replies but have difficulty producing engaging responses when interacting with real users.
In recent years, Internet memes have been widely used in online chatting.
no code implementations • 23 Dec 2021 • Xin Tian, Xinxian Huang, Dongfeng He, Yingzhan Lin, Siqi Bao, Huang He, Liankai Huang, Qiang Ju, Xiyuan Zhang, Jian Xie, Shuqi Sun, Fan Wang, Hua Wu, Haifeng Wang
Task-oriented dialogue systems have been plagued by the difficulty of obtaining large-scale, high-quality annotated conversations.
In this paper, we propose a novel Amendable Generation for Dialogue State Tracking (AG-DST), which contains a two-pass generation process: (1) generating a primitive dialogue state based on the dialogue of the current turn and the previous dialogue state, and (2) amending the primitive dialogue state from the first pass.
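The two-pass process above can be illustrated with a minimal sketch. All function names and the keyword rules are illustrative stand-ins, not the paper's actual model: a real AG-DST uses a generative model for both passes, whereas here simple rules stand in for generation.

```python
# Hedged sketch of the AG-DST two-pass idea; the rule-based "passes"
# are placeholders for the paper's generative model.

def first_pass(turn: str, prev_state: dict) -> dict:
    """Generate a primitive dialogue state from the current turn
    and the previous dialogue state (stubbed with keyword rules)."""
    state = dict(prev_state)
    if "cheap" in turn:
        state["hotel-pricerange"] = "cheap"
    if "north" in turn:
        state["hotel-area"] = "north"  # may be wrong; pass 2 can amend it
    return state

def second_pass(turn: str, primitive_state: dict) -> dict:
    """Amend the primitive state, correcting errors from the first pass."""
    state = dict(primitive_state)
    if "actually" in turn and "east" in turn:
        state["hotel-area"] = "east"  # the amendment overrides the slot
    return state

def ag_dst(turn: str, prev_state: dict) -> dict:
    return second_pass(turn, first_pass(turn, prev_state))

result = ag_dst("I want a cheap hotel in the north, actually make it the east", {})
print(result)  # {'hotel-pricerange': 'cheap', 'hotel-area': 'east'}
```

The design point is that the second pass sees both the turn and the primitive state, so it only has to repair mistakes rather than regenerate the full state from scratch.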
Ranked #1 on Dialogue State Tracking on Wizard-of-Oz
To explore the limits of dialogue generation pre-training, we present PLATO-XL, a model with up to 11 billion parameters, trained on both Chinese and English social media conversations.
In this work, we explore the application of PLATO-2 on various dialogue systems, including open-domain conversation, knowledge grounded dialogue, and task-oriented conversation.
Track 1 of DSTC9 aims to effectively answer user requests or questions during task-oriented dialogues that are beyond the scope of the APIs/DB.
To build a high-quality open-domain chatbot, we introduce the effective training process of PLATO-2 via curriculum learning.
Pre-trained models have proved effective for a wide range of natural language processing tasks.
In this paper, a novel Generation-Evaluation framework is developed for multi-turn conversations with the objective of letting both participants know more about each other.
To relieve the burden on software engineers without knowledge of Bayesian networks, Familia can conduct automatic parameter inference for a variety of topic models.