Search Results for author: Lihua Qian

Found 11 papers, 6 papers with code

Diffusion Language Models Can Perform Many Tasks with Scaling and Instruction-Finetuning

1 code implementation • 23 Aug 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Quanquan Gu

We then reprogram pretrained masked language models into diffusion language models via diffusive adaptation, wherein task-specific finetuning and instruction finetuning are explored to unlock their versatility in solving general language tasks.

In-Context Learning • Language Modelling • +1

$\textit{latent}$-GLAT: Glancing at Latent Variables for Parallel Text Generation

1 code implementation • 5 Apr 2022 • Yu Bao, Hao Zhou, ShuJian Huang, Dongqi Wang, Lihua Qian, Xinyu Dai, Jiajun Chen, Lei LI

Recently, parallel text generation has received widespread attention due to its success in generation efficiency.

Text Generation

QA4IE: A Question Answering based Framework for Information Extraction

1 code implementation • 10 Apr 2018 • Lin Qiu, Hao Zhou, Yanru Qu, Wei-Nan Zhang, Suoheng Li, Shu Rong, Dongyu Ru, Lihua Qian, Kewei Tu, Yong Yu

Information Extraction (IE) refers to automatically extracting structured relation tuples from unstructured texts.

Question Answering • Relation • +2

DINOISER: Diffused Conditional Sequence Learning by Manipulating Noises

1 code implementation • 20 Feb 2023 • Jiasheng Ye, Zaixiang Zheng, Yu Bao, Lihua Qian, Mingxuan Wang

In this paper, we introduce DINOISER to facilitate diffusion models for sequence generation by manipulating noises.

Exploring Diverse Expressions for Paraphrase Generation

no code implementations • IJCNLP 2019 • Lihua Qian, Lin Qiu, Wei-Nan Zhang, Xin Jiang, Yong Yu

Paraphrasing plays an important role in various natural language processing (NLP) tasks, such as question answering, information retrieval and sentence simplification.

Information Retrieval • Paraphrase Generation • +4

Non-iterative Parallel Text Generation via Glancing Transformer

no code implementations • 1 Jan 2021 • Lihua Qian, Hao Zhou, Yu Bao, Mingxuan Wang, Lin Qiu, Weinan Zhang, Yong Yu, Lei LI

Although non-autoregressive models with one-iteration generation achieves remarkable inference speed-up, they still falls behind their autoregressive counterparts inprediction accuracy.

Language Modelling • Text Generation

Diffusion Glancing Transformer for Parallel Sequence to Sequence Learning

no code implementations • 20 Dec 2022 • Lihua Qian, Mingxuan Wang, Yang Liu, Hao Zhou

Previously, non-autoregressive models were widely perceived as being superior in generation efficiency but inferior in generation quality due to the difficulties of modeling multiple target modalities.

Knowledge Distillation • Machine Translation • +1
