Unsupervised Text Style Transfer

21 papers with code • 3 benchmarks • 3 datasets

Unsupervised text style transfer rewrites a sentence into a target style (e.g., sentiment or formality) while preserving its content, without relying on parallel style-aligned training data.

Latest papers with no code

Unsupervised Text Style Transfer via LLMs and Attention Masking with Multi-way Interactions

no code yet • 21 Feb 2024

Among existing methods for unsupervised text style transfer (UTST), the attention-masking approach and Large Language Models (LLMs) are regarded as two pioneering approaches.
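
As an illustration of the attention-masking idea (a hypothetical sketch, not this paper's method): tokens that a style classifier attends to most are masked out, and a generator such as an LLM infills them in the target style. The attention scores below are made-up stand-ins for weights from a trained classifier.

```python
# Minimal sketch of attention masking for style transfer.
tokens = ["the", "food", "was", "absolutely", "terrible"]
style_attn = [0.02, 0.05, 0.03, 0.35, 0.55]   # hypothetical classifier attention
threshold = 0.2

# Replace strongly style-bearing tokens with a mask token.
masked = [t if a < threshold else "[MASK]" for t, a in zip(tokens, style_attn)]
print(" ".join(masked))   # "the food was [MASK] [MASK]"
# An LLM or masked language model would then infill the blanks in the target
# style, e.g. "the food was absolutely delicious".
```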

Prefix-Tuning Based Unsupervised Text Style Transfer

no code yet • 23 Oct 2023

Unsupervised text style transfer aims at training a generative model that can alter the style of the input sentence while preserving its content without using any parallel data.
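
For orientation, a minimal sketch of what prefix-tuning a frozen seq2seq backbone looks like in practice, assuming the HuggingFace PEFT library and a T5 backbone; the paper's own prefixes and training objective are not reproduced here.

```python
# Minimal prefix-tuning sketch: only a small set of prefix parameters is
# trained while the T5 backbone stays frozen.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PrefixTuningConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

config = PrefixTuningConfig(task_type=TaskType.SEQ_2_SEQ_LM, num_virtual_tokens=20)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # a tiny fraction of T5's weights is trainable
```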

Unsupervised Text Style Transfer with Deep Generative Models

no code yet • 31 Aug 2023

We present a general framework for unsupervised text style transfer with deep generative models.

StyleFlow: Disentangle Latent Representations via Normalizing Flow for Unsupervised Text Style Transfer

no code yet • 19 Dec 2022

While cycle construction helps to improve the style transfer ability of the model by rebuilding transferred sentences back into original-style sentences, it also brings about content loss in unsupervised text style transfer tasks.
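
A minimal sketch of a cycle-reconstruction objective in abstract form; the `transfer` model below is hypothetical, and real systems operate on discrete tokens, so they typically need soft sampling or reinforcement learning to backpropagate through generation.

```python
# Cycle-construction sketch: transfer to the target style and back, then
# penalize how far the round trip drifts from the original representation.
import torch.nn.functional as F

def cycle_loss(transfer, x, src_style, tgt_style):
    y = transfer(x, src_style, tgt_style)       # source -> target style
    x_rec = transfer(y, tgt_style, src_style)   # target -> back to source style
    return F.mse_loss(x_rec, x)                 # reconstruction penalty
```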

Low Resource Style Transfer via Domain Adaptive Meta Learning

no code yet • NAACL 2022

Text style transfer (TST) without parallel data has achieved some practical success.

Efficient Reinforcement Learning for Unsupervised Controlled Text Generation

no code yet • 16 Apr 2022

A major challenge in applying RL to such tasks is the sparse reward, which is available only after the full text is generated.
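
A toy illustration of that sparsity, with a hypothetical `style_scorer` standing in for the reward model: every intermediate decoding step receives zero reward, and the learning signal only arrives once the sentence is complete.

```python
# Sparse-reward sketch: reward is computed only on the finished sentence.
def episode_rewards(generated_tokens, style_scorer):
    rewards = [0.0] * len(generated_tokens)                  # nothing mid-generation
    rewards[-1] = style_scorer(" ".join(generated_tokens))   # signal only at the end
    return rewards

# e.g. episode_rewards(["the", "food", "was", "great"], lambda s: 0.93)
# -> [0.0, 0.0, 0.0, 0.93]
```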

Don't Take It Literally: An Edit-Invariant Sequence Loss for Text Generation

no code yet • ACL ARR January 2022

Such a training objective is sub-optimal when the target sequence is not perfect, e.g., when the target sequence is corrupted with noise or when only weak sequence supervision is available.
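
To see why exact position-wise matching is brittle, here is a toy example (made-up token ids) using standard token-level cross-entropy: a single spurious insertion in a noisy target misaligns every later position and inflates the loss even though the sequences share most of their content.

```python
# Position-wise cross-entropy is sensitive to small edits in the target.
import torch
import torch.nn.functional as F

vocab = 10
target = torch.tensor([3, 5, 7, 2])            # clean reference
noisy  = torch.tensor([3, 9, 5, 7])            # one inserted token shifts the rest

# A model that confidently "predicts" the clean reference at every position.
logits = F.one_hot(target, vocab).float() * 5.0
print(F.cross_entropy(logits, target).item())  # small loss (~0.06)
print(F.cross_entropy(logits, noisy).item())   # much larger (~3.8), despite shared tokens
```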

DAML-ST5: Low Resource Style Transfer via Domain Adaptive Meta Learning

no code yet • ACL ARR November 2021

Moreover, we propose a new unsupervised TST model, Style-T5 (ST5), which builds on the sequence-to-sequence pre-trained language model T5 and uses style adversarial training for better content preservation and style transfer.
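
A minimal sketch of the style-adversarial idea in general form (an assumption about the setup, not the paper's exact architecture): a style discriminator is trained on the encoder's sentence representation, while the encoder is trained to fool it, pushing style information out of the content representation.

```python
# Generic style-adversarial sketch; not ST5's actual implementation.
import torch.nn as nn

class StyleDiscriminator(nn.Module):
    def __init__(self, hidden=512, num_styles=2):
        super().__init__()
        self.clf = nn.Linear(hidden, num_styles)

    def forward(self, sent_repr):          # sent_repr: (batch, hidden) encoder output
        return self.clf(sent_repr)

# Training alternates two losses on the same representation:
#   discriminator step: minimize CE(clf(h.detach()), true_style)
#   encoder step:       maximize that CE (or target uniform labels),
#                       encouraging style-agnostic content representations.
```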