Paraphrase Generation

68 papers with code • 3 benchmarks • 16 datasets

Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
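
A minimal sketch of the task, assuming an off-the-shelf seq2seq paraphrase model from the Hugging Face Hub (the model name below is only an illustrative choice and is not tied to any paper on this page):

```python
from transformers import pipeline

# Assumption: "tuner007/pegasus_paraphrase" is used purely as an example of a
# publicly available paraphrasing model; any seq2seq model fine-tuned for
# paraphrase generation could be substituted.
paraphraser = pipeline("text2text-generation", model="tuner007/pegasus_paraphrase")

sentence = "How can I improve my writing skills?"

# Beam search with several returned sequences yields multiple candidate
# surface forms that should preserve the original meaning.
candidates = paraphraser(
    sentence,
    num_beams=10,
    num_return_sequences=5,
    max_length=60,
)

for candidate in candidates:
    print(candidate["generated_text"])
```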

Most implemented papers

Exemplar-Controllable Paraphrasing and Translation using Bitext

mingdachen/mVGVAE 12 Oct 2020

Our experiments show that our models achieve competitive results on controlled paraphrase generation and strong performance on controlled machine translation.

Reformulating Unsupervised Style Transfer as Paraphrase Generation

martiansideofthemoon/style-transfer-paraphrase EMNLP 2020

Modern NLP defines the task of style transfer as modifying the style of a given sentence without appreciably changing its semantics, which implies that the outputs of style transfer systems should be paraphrases of their inputs.

Sound Natural: Content Rephrasing in Dialog Systems

facebook/content_rephrasing EMNLP 2020

We introduce a new task of rephrasing for a more natural virtual assistant.

Latent Template Induction with Gumbel-CRFs

FranxYao/Gumbel-CRF NeurIPS 2020

Learning to control the structure of sentences is a challenging problem in text generation.

ParaSCI: A Large Scientific Paraphrase Dataset for Longer Paraphrase Generation

dqxiu/ParaSCI EACL 2021

We propose ParaSCI, the first large-scale paraphrase dataset in the scientific field, including 33,981 paraphrase pairs from ACL (ParaSCI-ACL) and 316,063 pairs from arXiv (ParaSCI-arXiv).

Generating Syntactically Controlled Paraphrases without Using Annotated Parallel Pairs

uclanlp/synpg EACL 2021

We also demonstrate that SynPG performs competitively with, or even better than, supervised models when the amount of unannotated data is large.

Factorising Meaning and Form for Intent-Preserving Paraphrasing

tomhosking/separator ACL 2021

We propose a method for generating paraphrases of English questions that retain the original intent but use a different surface form.

Neural semi-Markov CRF for Monolingual Word Alignment

chaojiang06/neural-Jacana ACL 2021

Monolingual word alignment is important for studying fine-grained editing operations (i.e., deletion, addition, and substitution) in text-to-text generation tasks, such as paraphrase generation, text simplification, neutralizing biased language, etc.

Contrastive Representation Learning for Exemplar-Guided Paraphrase Generation

lhryang/crl_egpg Findings (EMNLP) 2021

Exemplar-Guided Paraphrase Generation (EGPG) aims to generate a target sentence which conforms to the style of the given exemplar while encapsulating the content information of the source sentence.
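
As a rough illustration of how contrastive learning can be applied to this setup (a generic InfoNCE-style sketch under assumed encoders, not the paper's exact objective), one can pull the target's content representation toward the source and its style representation toward the exemplar:

```python
import torch
import torch.nn.functional as F

def info_nce(anchors, positives, temperature=0.1):
    """Standard InfoNCE loss: the positive for each anchor is the row with the
    same index in `positives`; all other rows in the batch act as negatives."""
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    logits = a @ p.t() / temperature              # (batch, batch) similarities
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)

# Hypothetical content/style encodings for a batch of (source, exemplar, target)
# triples; in practice these would come from separate content and style encoders.
batch, dim = 8, 256
content_src, content_tgt = torch.randn(batch, dim), torch.randn(batch, dim)
style_exm, style_tgt = torch.randn(batch, dim), torch.randn(batch, dim)

# Encourage the target to share content with the source and style with the
# exemplar, while pushing away the other sentences in the batch.
loss = info_nce(content_tgt, content_src) + info_nce(style_tgt, style_exm)
print(loss.item())
```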

Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach

l-zhe/btmpg Findings (ACL) 2021

Both automatic and human evaluation show that BTmPG can improve the diversity of paraphrases while preserving the semantics of the original sentence.