Unsupervised Text Style Transfer
21 papers with code • 3 benchmarks • 2 datasets
We consider the task of text attribute transfer: transforming a sentence to alter a specific attribute (e.g., sentiment) while preserving its attribute-independent content (e.g., changing "screen is just the right size" to "screen is too small").
Across all style transfer tasks, our approach yields substantial gains over state-of-the-art non-generative baselines, including the unsupervised machine translation techniques that it generalizes.
In this paper, we propose a dual reinforcement learning framework that transfers the style of a text directly via a one-step mapping model, without separating content from style.
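In dual-RL setups of this kind, the transfer model is typically trained with a reward that balances style accuracy (judged by a pre-trained style classifier) against content preservation (judged by comparing the original sentence with its back-transferred reconstruction). A minimal sketch of such a reward, with all function names and the weighted-sum combination chosen here for illustration:

```python
def style_reward(p_target: float) -> float:
    # Classifier confidence that the transferred sentence carries the
    # target style; here taken as a given probability in [0, 1].
    return p_target

def content_reward(original: str, reconstructed: str) -> float:
    # Content-preservation proxy: Jaccard token overlap between the
    # original sentence and its back-transferred reconstruction
    # (a stand-in for the BLEU-based rewards used in practice).
    a, b = set(original.split()), set(reconstructed.split())
    return len(a & b) / max(len(a | b), 1)

def dual_rl_reward(p_target: float, original: str, reconstructed: str,
                   alpha: float = 0.5) -> float:
    # Weighted combination of the two signals; real systems may instead
    # use a harmonic mean or separate reward channels per direction.
    return (alpha * style_reward(p_target)
            + (1 - alpha) * content_reward(original, reconstructed))
```

A perfect transfer (classifier probability 1.0, exact back-reconstruction) scores 1.0; degrading either signal lowers the reward.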
Binary classifiers are often employed as discriminators in GAN-based unsupervised style transfer systems to ensure that transferred sentences are similar to sentences in the target domain.
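The discriminator's job is simply to score how likely a sentence is to come from the target domain; the generator is then trained to push that score up. A toy logistic discriminator over bag-of-words features (vocabulary and weights are hypothetical, standing in for a learned classifier):

```python
import math

def bow_vector(sentence: str, vocab: list) -> list:
    # Bag-of-words counts over a small hypothetical vocabulary.
    tokens = sentence.lower().split()
    return [float(tokens.count(w)) for w in vocab]

def discriminator_score(sentence: str, vocab: list,
                        weights: list, bias: float = 0.0) -> float:
    # Logistic score in (0, 1): probability that the sentence
    # belongs to the target domain.
    x = bow_vector(sentence, vocab)
    z = sum(wi * xi for wi, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

With, say, `vocab = ["great", "terrible"]` and `weights = [2.0, -2.0]`, a sentence containing "great" scores above 0.5 and one containing "terrible" scores below it; in a GAN setup this scalar would feed the generator's adversarial loss.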
We propose a new framework that uses gradients to revise the sentence in a continuous space during inference to achieve text style transfer.
Unsupervised text style transfer aims to alter text styles while preserving the content, without aligned data for supervision.
Semi-supervised Formality Style Transfer using Language Model Discriminator and Mutual Information Maximization
Formality style transfer is the task of converting informal sentences into grammatically correct formal sentences, which can improve the performance of many downstream NLP tasks.
Moreover, compared to previous methods for unsupervised data synthesis, our method produces higher-quality parallel style pairs and improves model performance.