1 code implementation • ACL 2022 • Yu Bao, Hao Zhou, ShuJian Huang, Dongqi Wang, Lihua Qian, Xinyu Dai, Jiajun Chen, Lei LI
Recently, parallel text generation has received widespread attention due to its advantage in generation efficiency.
1 code implementation • 23 Aug 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Quanquan Gu
We then reprogram pretrained masked language models into diffusion language models via diffusive adaptation, wherein task-specific finetuning and instruction finetuning are explored to unlock their versatility in solving general language tasks.
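To make the underlying idea concrete, here is a minimal sketch of the absorbing-state sampling loop that lets a pretrained masked LM act as a step-wise denoiser: start from an all-[MASK] sequence and iteratively commit the most confident predictions. The checkpoint name, step count, and confidence-based unmasking schedule are illustrative assumptions, not the paper's diffusive-adaptation recipe.

```python
# Sketch: iterative unmasking with a pretrained masked LM, in the spirit of
# absorbing-state discrete diffusion. Illustrative only; not the paper's code.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

name = "roberta-base"  # assumption: any masked-LM checkpoint works here
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name).eval()

seq_len, steps = 16, 8
ids = torch.full((1, seq_len), tok.mask_token_id)  # start fully masked

with torch.no_grad():
    for step in range(steps):
        logits = model(input_ids=ids).logits       # (1, seq_len, vocab)
        probs, preds = logits.softmax(-1).max(-1)  # confidence and argmax
        masked = ids[0] == tok.mask_token_id
        if not masked.any():
            break
        # Commit the k most confident still-masked positions this step.
        k = max(1, int(masked.sum()) // (steps - step))
        conf = probs[0].masked_fill(~masked, -1.0)
        pos = conf.topk(k).indices
        ids[0, pos] = preds[0, pos]

print(tok.decode(ids[0]))
```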
1 code implementation • 20 Feb 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Mingxuan Wang
In this paper, we introduce DINOISER to facilitate diffusion models for sequence generation by manipulating noises.
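Roughly speaking, the motivation is that very small noise scales give the denoiser near-clean inputs and teach it little, so training can be concentrated on larger noises. Below is a minimal sketch of such noise-scale clipping in a continuous-diffusion training step over token embeddings; the threshold, log-uniform schedule, and toy denoiser are illustrative assumptions, not DINOISER's exact method.

```python
# Sketch: clipping away uninformatively small noise scales when training a
# continuous diffusion model over token embeddings. All hyperparameters and
# the toy denoiser below are assumptions for illustration.
import math
import torch

class TinyDenoiser(torch.nn.Module):
    # Toy stand-in for a sequence denoiser over token embeddings.
    def __init__(self, dim=32):
        super().__init__()
        self.net = torch.nn.Linear(dim + 1, dim)

    def forward(self, x, sigma):                    # x: (batch, seq, dim)
        s = sigma.view(-1, 1, 1).expand(x.size(0), x.size(1), 1)
        return self.net(torch.cat([x, s], dim=-1))

def sample_sigma(batch, sigma_min=0.5, sigma_max=10.0):
    # Assumption: log-uniform noise scales truncated below sigma_min, so the
    # denoiser never trains on near-clean, uninformative inputs.
    u = torch.rand(batch)
    return torch.exp(u * (math.log(sigma_max) - math.log(sigma_min))
                     + math.log(sigma_min))

def training_step(denoiser, emb):                   # emb: clean embeddings
    sigma = sample_sigma(emb.size(0))
    noisy = emb + sigma.view(-1, 1, 1) * torch.randn_like(emb)
    pred = denoiser(noisy, sigma)                   # predict clean embeddings
    return ((pred - emb) ** 2).mean()

loss = training_step(TinyDenoiser(), torch.randn(4, 16, 32))
```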
no code implementations • 20 Dec 2022 • Lihua Qian, Mingxuan Wang, Yang Liu, Hao Zhou
Previously, non-autoregressive models were widely perceived as superior in generation efficiency but inferior in generation quality, owing to the difficulty of modeling the multiple possible modes of the target distribution.
no code implementations • ICLR 2022 • Zhenqiao Song, Hao Zhou, Lihua Qian, Jingjing Xu, Shanbo Cheng, Mingxuan Wang, Lei LI
Multilingual machine translation aims to develop a single model for multiple language directions.
no code implementations • WMT (EMNLP) 2021 • Lihua Qian, Yi Zhou, Zaixiang Zheng, Yaoming Zhu, Zehui Lin, Jiangtao Feng, Shanbo Cheng, Lei LI, Mingxuan Wang, Hao Zhou
This paper describes Volctrans' submission to the WMT21 news translation shared task for German->English translation.
no code implementations • 1 Jan 2021 • Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, Lei LI
Although non-autoregressive models with one-iteration generation achieve remarkable inference speed-ups, they still fall behind their autoregressive counterparts in prediction accuracy.
2 code implementations • ACL 2021 • Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Wei-Nan Zhang, Yong Yu, Lei LI
With the glancing language model (GLM), we develop the Glancing Transformer (GLAT) for machine translation; a rough sketch of the glancing-training step follows this entry.
Ranked #68 on Machine Translation on WMT2014 English-German
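As referenced above, here is a minimal sketch of one glancing-training step: a first fully parallel pass measures how far the prediction is from the reference, a proportional number of reference tokens is then revealed in the decoder input, and the loss is computed on the positions that remain masked. The `decoder(src, inp)` interface, the sampling ratio, and the random reveal strategy are illustrative assumptions, not GLAT's exact implementation.

```python
# Sketch of one glancing-training step in the spirit of GLAT. The decoder
# interface and hyperparameters are illustrative assumptions.
import torch
import torch.nn.functional as F

def glancing_step(decoder, src, tgt, mask_id, ratio=0.5):
    # First pass: fully parallel prediction from an all-mask decoder input.
    inp = torch.full_like(tgt, mask_id)
    with torch.no_grad():
        pred = decoder(src, inp).argmax(-1)

    # Glance: reveal a number of reference tokens proportional to how many
    # positions the first pass got wrong.
    n_wrong = (pred != tgt).sum(-1, keepdim=True)
    n_glance = (ratio * n_wrong.float()).long()     # (batch, 1)
    scores = torch.rand(tgt.shape, device=tgt.device)
    ranks = scores.argsort(-1).argsort(-1)          # random ranks per row
    reveal = ranks < n_glance                       # random subset to reveal
    inp = torch.where(reveal, tgt, inp)

    # Second pass: train only on positions that remain masked.
    logits = decoder(src, inp)
    return F.cross_entropy(logits[~reveal], tgt[~reveal])
```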
no code implementations • IJCNLP 2019 • Lihua Qian, Lin Qiu, Wei-Nan Zhang, Xin Jiang, Yong Yu
Paraphrasing plays an important role in various natural language processing (NLP) tasks, such as question answering, information retrieval and sentence simplification.
1 code implementation • 10 Apr 2018 • Lin Qiu, Hao Zhou, Yanru Qu, Wei-Nan Zhang, Suoheng Li, Shu Rong, Dongyu Ru, Lihua Qian, Kewei Tu, Yong Yu
Information Extraction (IE) refers to automatically extracting structured relation tuples from unstructured texts.
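As a concrete (hypothetical) illustration of what a structured relation tuple is, the sentence below is unstructured text and the tuple is the structured output an IE system would aim to produce:

```python
# Hypothetical example of a relation tuple extracted from unstructured text.
from typing import NamedTuple

class Relation(NamedTuple):
    head: str
    relation: str
    tail: str

sentence = "Barack Obama was born in Hawaii."
extracted = Relation("Barack Obama", "born_in", "Hawaii")
```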