Paraphrase Generation Models

BTmPG, or Back-Translation guided multi-round Paraphrase Generation, is a multi-round paraphrase generation method that leverages back-translation to guide the paraphrase model during training and generates paraphrases in a multi-round process. The model regards paraphrase generation as a monolingual translation task. Given a paraphrase pair $\left(S_{0}, P\right)$, where $S_{0}$ is the original/source sentence and $P$ is the target paraphrase given in the dataset, the first round of generation feeds $S_{0}$ into the paraphrase model to generate a paraphrase $S_{1}$. The second round uses $S_{1}$ as the input of the model to generate a new paraphrase $S_{2}$. And so forth: in the $i$-th round, the model takes $S_{i-1}$ as input and generates $S_{i}$.
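The multi-round loop above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `paraphrase` is a hypothetical stand-in for the trained paraphrase model (here a toy synonym-substitution function so the loop is runnable).

```python
# Toy synonym table standing in for a real paraphrase model (assumption).
SYNONYMS = {"quick": "fast", "fast": "rapid", "rapid": "swift"}

def paraphrase(sentence: str) -> str:
    """Hypothetical paraphrase model: replaces words with known synonyms."""
    return " ".join(SYNONYMS.get(w, w) for w in sentence.split())

def multi_round_paraphrase(s0: str, rounds: int) -> list:
    """Generate S_1 .. S_rounds, feeding S_{i-1} back in as input each round."""
    outputs = []
    current = s0  # S_0, the original/source sentence
    for _ in range(rounds):
        current = paraphrase(current)  # S_i = model(S_{i-1})
        outputs.append(current)
    return outputs
```

Each successive round pushes the paraphrase further from the original sentence, which is the intuition behind the multi-round design.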

Source: Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach

| Task | Papers | Share |
| --- | --- | --- |
| Paraphrase Generation | 1 | 33.33% |
| Sentence | 1 | 33.33% |
| Translation | 1 | 33.33% |
