# Text Style Transfer

64 papers with code • 2 benchmarks • 4 datasets

Text Style Transfer is the task of controlling certain attributes of generated text. State-of-the-art methods fall into two main categories, depending on whether they operate on parallel or non-parallel data. Methods for parallel data are typically supervised, using a neural sequence-to-sequence model with an encoder-decoder architecture. Methods for non-parallel data are usually unsupervised approaches based on disentanglement, prototype editing, or pseudo-parallel corpus construction.
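To make the prototype-editing family concrete, here is a minimal pure-Python sketch of its first step: deleting style-marker words whose frequency is much higher in one style corpus than the other, leaving a style-neutral content prototype. The toy corpora, the salience formula, and the threshold are illustrative assumptions, not any specific paper's exact method.

```python
from collections import Counter

def salience(counts_src, counts_tgt, smoothing=1.0):
    """How much more often a word appears in the source-style corpus
    than in the target-style corpus (smoothed frequency ratio)."""
    vocab = set(counts_src) | set(counts_tgt)
    return {w: (counts_src[w] + smoothing) / (counts_tgt[w] + smoothing)
            for w in vocab}

def delete_markers(sentence, sal, threshold=3.0):
    """Drop words whose salience reaches the threshold, keeping only
    the (roughly) style-neutral content words."""
    return [w for w in sentence.split() if sal.get(w, 1.0) < threshold]

# Toy corpora standing in for positive / negative reviews.
pos = ["the food was great", "great service and friendly staff"]
neg = ["the food was terrible", "terrible service and rude staff"]

counts_pos = Counter(w for s in pos for w in s.split())
counts_neg = Counter(w for s in neg for w in s.split())

sal_pos = salience(counts_pos, counts_neg)
prototype = delete_markers("the service was great", sal_pos)
print(prototype)  # style markers such as "great" are removed
```

In full prototype-editing systems this delete step is followed by retrieving target-style markers from the other corpus and generating a fluent sentence around the prototype.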

The popular benchmark for this task is the Yelp Review Dataset. Models are typically evaluated with the metrics of Sentiment Accuracy, BLEU, and PPL.
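Of the metrics above, BLEU measures content preservation against a reference. The following is a simplified sentence-level BLEU sketch with clipped n-gram precision and a brevity penalty; real evaluations typically use corpus-level BLEU from an established toolkit, and `max_n=2` here is an illustrative choice (standard BLEU uses up to 4-grams).

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    """Simplified sentence-level BLEU: geometric mean of clipped
    n-gram precisions, scaled by a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams, ref_ngrams = ngrams(cand, n), ngrams(ref, n)
        overlap = sum((cand_ngrams & ref_ngrams).values())  # clipped counts
        total = max(sum(cand_ngrams.values()), 1)
        log_precisions.append(math.log(max(overlap, 1e-9) / total))
    brevity_penalty = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))
    return brevity_penalty * math.exp(sum(log_precisions) / max_n)

score = bleu("the food was great", "the food was great")
print(round(score, 2))  # identical sentences score 1.0
```

Sentiment accuracy is usually computed separately, by running a pre-trained sentiment classifier over the transferred sentences, while PPL is measured with an external language model.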


# Style Transfer from Non-Parallel Text by Cross-Alignment

We demonstrate the effectiveness of this cross-alignment method on three tasks: sentiment modification, decipherment of word substitution ciphers, and recovery of word order.


# A Probabilistic Formulation of Unsupervised Text Style Transfer

Across all style transfer tasks, our approach yields substantial gains over state-of-the-art non-generative baselines, including the state-of-the-art unsupervised machine translation techniques that our approach generalizes.


# Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation

Disentangling the content and style in the latent space is prevalent in unpaired text style transfer.


# Style Transfer Through Back-Translation

We first learn a latent representation of the input sentence which is grounded in a language translation model in order to better preserve the meaning of the sentence while reducing stylistic properties.


# Disentangled Representation Learning for Non-Parallel Text Style Transfer

This paper tackles the problem of disentangling the latent variables of style and content in language models.


# Multiple-Attribute Text Style Transfer

1 Nov 2018

The dominant approach to unsupervised "style transfer" in text is based on the idea of learning a latent representation, which is independent of the attributes specifying its "style".


# IMaT: Unsupervised Text Attribute Transfer via Iterative Matching and Translation

Text attribute transfer aims to automatically rewrite sentences such that they possess certain linguistic attributes, while simultaneously preserving their semantic content.


# Educating Text Autoencoders: Latent Representation Guidance via Denoising

We prove that this simple modification guides the latent space geometry of the resulting model by encouraging the encoder to map similar texts to similar latent representations.


# Beyond Fully-Connected Layers with Quaternions: Parameterization of Hypercomplex Multiplications with $1/n$ Parameters

Recent works have demonstrated reasonable success of representation learning in hypercomplex space.


# Style Transfer in Text: Exploration and Evaluation

18 Nov 2017

Results show that the proposed content preservation metric is highly correlated with human judgments, and that the proposed models generate sentences with higher style-transfer strength and a similar content preservation score compared to an auto-encoder.
