Search Results for author: Junfei Liu

Found 4 papers, 0 papers with code

Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation

no code implementations · Findings (EMNLP) 2021 · Qingqing Zhu, Xiuying Chen, Pengfei Wu, Junfei Liu, Dongyan Zhao

Hence, in this paper, we introduce a combination of curriculum learning and knowledge distillation for efficient dialogue generation models, where curriculum learning helps knowledge distillation from both the data and model aspects.

Tasks: Dialogue Generation · Knowledge Distillation · +1

Learn with Noisy Data via Unsupervised Loss Correction for Weakly Supervised Reading Comprehension

no code implementations · COLING 2020 · Xuemiao Zhang, Kun Zhou, Sirui Wang, Fuzheng Zhang, Zhongyuan Wang, Junfei Liu

The weakly supervised machine reading comprehension (MRC) task is practical and promising because of its easily available and massive training data, but it inevitably introduces noise.

Tasks: Machine Reading Comprehension

Multilingual Dialogue Generation with Shared-Private Memory

no code implementations · 6 Oct 2019 · Chen Chen, Lisong Qiu, Zhenxin Fu, Dongyan Zhao, Junfei Liu, Rui Yan

Existing dialogue systems are all monolingual, and features shared among different languages are rarely explored.

Tasks: Cross-Lingual Transfer · Dialogue Generation
