Search Results for author: Jianzhi Shao

Found 1 paper, 1 paper with code

Dialogue Distillation: Open-Domain Dialogue Augmentation Using Unpaired Data

1 code implementation · EMNLP 2020 · Rongsheng Zhang, Yinhe Zheng, Jianzhi Shao, Xiaoxi Mao, Yadong Xi, Minlie Huang

Further, a model-level distillation process is employed: a teacher model trained on high-quality paired data is distilled onto the augmented dialogue pairs, preventing dialogue models from being affected by noise in the augmented data.

Data Augmentation
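The model-level distillation step described above can be illustrated with a generic temperature-scaled KL objective, the standard form of knowledge distillation. This is a minimal NumPy sketch, not the authors' implementation; the function names and the temperature value are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) with temperature T, scaled by T^2.

    The teacher (trained on high-quality paired data) provides soft
    targets; the student is trained on noisy augmented pairs, so
    matching the teacher's distribution damps the noise's effect.
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(kl.mean() * T * T)
```

The loss is zero when the student exactly matches the teacher's logits and grows as their softened distributions diverge; the `T*T` factor keeps gradient magnitudes comparable across temperatures.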
