Paraphrase Generation

50 papers with code • 3 benchmarks • 12 datasets

Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
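As a minimal illustration of "same meaning, different surface form", here is a toy rule-based lexical paraphraser. The synonym table is hand-written for the example and is not taken from any of the papers below; real systems learn these substitutions.

```python
import random

# Illustrative synonym table (an assumption for this sketch, not learned).
SYNONYMS = {
    "movie": ["film"],
    "great": ["excellent", "wonderful"],
}

def paraphrase(sentence, rng=random):
    """Replace each word that has a known synonym, keeping the meaning
    while changing the lexical surface form."""
    out = []
    for word in sentence.split():
        out.append(rng.choice(SYNONYMS[word]) if word in SYNONYMS else word)
    return " ".join(out)

print(paraphrase("the movie was great", rng=random.Random(0)))
```

Neural approaches replace the fixed table with learned models, but the input/output contract is the same: a sentence in, a meaning-preserving rewrite out.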

Most implemented papers

Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator

dev-chauhan/PQG-pytorch COLING 2018

One way to ensure that generated paraphrases preserve meaning is to add constraints pulling true paraphrase embeddings close together while pushing unrelated paraphrase candidate sentence embeddings far apart.
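The "close for true pairs, far for unrelated pairs" constraint is the shape of a standard triplet margin loss. A minimal sketch with hand-written 2-D embeddings (the vectors and margin are assumptions for illustration, not the paper's actual training setup):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def triplet_margin_loss(anchor, positive, negative, margin=0.5):
    """Zero loss only if the true paraphrase (positive) is closer to the
    anchor than the unrelated candidate (negative) by at least `margin`."""
    return max(0.0, margin - cosine(anchor, positive) + cosine(anchor, negative))

anchor   = [1.0, 0.0]
positive = [0.9, 0.1]   # near the anchor: constraint satisfied
negative = [0.0, 1.0]   # orthogonal to the anchor
print(triplet_margin_loss(anchor, positive, negative))  # → 0.0
```

Swapping the positive and negative makes the loss positive, which is what drives the embeddings apart during training.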

Paraphrase Generation with Latent Bag of Words

FranxYao/Deep-Generative-Models-for-Natural-Language-Processing NeurIPS 2019

Inspired by variational autoencoders with discrete latent structures, in this work, we propose a latent bag of words (BOW) model for paraphrase generation.
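The core idea can be sketched in a few lines: each source word proposes candidate target words with probabilities, a discrete bag of words is sampled from those distributions, and a decoder would then condition on the sampled bag. The neighbor table and probabilities below are hand-written stand-ins for the learned distributions in the paper:

```python
import random

# Assumed toy distributions; in the paper these are learned per source word.
NEIGHBORS = {
    "movie": {"film": 0.7, "picture": 0.3},
    "great": {"excellent": 0.6, "superb": 0.4},
}

def sample_latent_bow(words, rng):
    """Sample one candidate word per source word to form a discrete
    latent bag of words for the decoder to condition on."""
    bag = []
    for w in words:
        cands = NEIGHBORS.get(w, {w: 1.0})  # unknown words map to themselves
        choices, weights = zip(*cands.items())
        bag.append(rng.choices(choices, weights=weights, k=1)[0])
    return bag

print(sample_latent_bow(["movie", "was", "great"], random.Random(0)))
```

Because the latent variable is a discrete bag rather than a dense vector, the sampled words are directly inspectable, which is part of the model's appeal.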

Neural Syntactic Preordering for Controlled Paraphrase Generation

tagoyal/sow-reap-paraphrasing ACL 2020

Paraphrasing natural language sentences is a multifaceted process: it might involve replacing individual words or short phrases, local rearrangement of content, or high-level restructuring like topicalization or passivization.
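The "local rearrangement" and "high-level restructuring" operations amount to applying a permutation to the source before generation. A toy sketch (the permutation is hand-written here, whereas the paper's SOW-REAP model derives it from learned syntactic rules):

```python
def preorder(tokens, permutation):
    """Rearrange source tokens into a target-side order before generation."""
    assert sorted(permutation) == list(range(len(tokens)))
    return [tokens[i] for i in permutation]

tokens = ["she", "read", "the", "book", "yesterday"]
# Topicalization-style reorder: move the temporal adverb to the front.
print(preorder(tokens, [4, 0, 1, 2, 3]))
# → ['yesterday', 'she', 'read', 'the', 'book']
```

Feeding the reordered source to a standard sequence-to-sequence model then yields paraphrases whose syntax follows the chosen ordering.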

Syntax-guided Controlled Generation of Paraphrases

malllabiisc/SGCP TACL 2020

In these methods, syntactic guidance is sourced from a separate exemplar sentence.

Neural Paraphrase Generation with Stacked Residual LSTM Networks

pushpendughosh/Stock-market-forecasting COLING 2016

To the best of our knowledge, this work is the first to explore deep learning models for paraphrase generation.

A Deep Generative Framework for Paraphrase Generation

arvind385801/paraphrasegen 15 Sep 2017

In this paper, we address the problem of generating paraphrases automatically.

Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation

lancopku/WEAN NAACL 2018

Existing sequence-to-sequence models tend to memorize the words and patterns in the training data instead of learning the meanings of the words.
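The paper's remedy is to generate each word by querying an embedding table rather than a softmax over a memorized vocabulary. A minimal sketch of that query step, with hand-written 2-D embeddings standing in for learned ones:

```python
# Assumed toy embedding table; in the paper these are learned word vectors.
EMBEDDINGS = {
    "film":  [0.9, 0.1],
    "movie": [0.8, 0.2],
    "stock": [0.1, 0.9],
}

def query_word(query):
    """Emit the vocabulary word whose embedding best matches the
    decoder's query vector, scored by dot product."""
    def score(vec):
        return sum(q * v for q, v in zip(query, vec))
    return max(EMBEDDINGS, key=lambda w: score(EMBEDDINGS[w]))

print(query_word([1.0, 0.0]))  # → 'film'
```

Because the output is tied to the embedding space rather than to training-set co-occurrences, semantically similar words compete on meaning, not on memorized frequency.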

Discriminating between Lexico-Semantic Relations with the Specialization Tensor Model

codogogo/stm NAACL 2018

We present a simple and effective feed-forward neural architecture for discriminating between lexico-semantic relations (synonymy, antonymy, hypernymy, and meronymy).

Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator

badripatro/PQG COLING 2018

One way to ensure that generated paraphrases preserve meaning is to add constraints pulling true paraphrase embeddings close together while pushing unrelated paraphrase candidate sentence embeddings far apart.