Style Transfer in Text: Exploration and Evaluation

18 Nov 2017  ·  Zhenxin Fu, Xiaoye Tan, Nanyun Peng, Dongyan Zhao, Rui Yan ·

Style transfer is an important problem in natural language processing (NLP). However, progress in language style transfer lags behind that in other domains, such as computer vision, mainly because of the lack of parallel data and principled evaluation metrics. In this paper, we propose to learn style transfer from non-parallel data. We explore two models to achieve this goal; the key idea behind both is to learn separate content representations and style representations using adversarial networks. We also propose novel evaluation metrics that measure two aspects of style transfer: transfer strength and content preservation. We assess our models and the evaluation metrics on two tasks: paper-news title transfer and positive-negative review transfer. Results show that the proposed content preservation metric is highly correlated with human judgments, and that the proposed models generate sentences with higher style transfer strength and a similar content preservation score compared to the auto-encoder baseline.
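The two metrics from the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's exact implementation: it assumes content preservation is cosine similarity between pooled word-embedding representations of the source and transferred sentences (the concatenation of min/mean/max pooling here is an assumption about the exact construction), and that transfer strength is the fraction of outputs a pre-trained style classifier labels with the target style.

```python
import numpy as np

def sentence_embedding(word_vectors):
    """Pool a sentence's word vectors into one fixed-size embedding.
    Concatenating min, mean, and max pooling is an illustrative choice."""
    v = np.asarray(word_vectors, dtype=float)
    return np.concatenate([v.min(axis=0), v.mean(axis=0), v.max(axis=0)])

def content_preservation(source_vectors, target_vectors):
    """Cosine similarity between source and transferred sentence embeddings."""
    s = sentence_embedding(source_vectors)
    t = sentence_embedding(target_vectors)
    return float(np.dot(s, t) / (np.linalg.norm(s) * np.linalg.norm(t)))

def transfer_strength(predicted_styles, target_styles):
    """Fraction of transferred sentences that a style classifier
    (assumed to be trained separately) assigns to the target style."""
    predicted = np.asarray(predicted_styles)
    target = np.asarray(target_styles)
    return float((predicted == target).mean())
```

An unchanged sentence scores 1.0 on content preservation, while a perfect style flip scores 1.0 on transfer strength; the paper's point is that the two trade off against each other.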


Datasets


| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Text Style Transfer | Yelp Review Dataset (Small) | StyleEmbedding | G-Score (BLEU, Accuracy) | 31.31 | #8 |
| Text Style Transfer | Yelp Review Dataset (Small) | MultiDecoder | G-Score (BLEU, Accuracy) | 45.02 | #6 |
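The G-Score reported above combines the two evaluation axes into one number. A common construction, assumed here, is the geometric mean of BLEU (content preservation) and classifier accuracy (transfer strength), both on a 0-100 scale as in the table:

```python
import math

def g_score(bleu, accuracy):
    """Geometric mean of BLEU and style-classifier accuracy.
    Assumes both inputs share the same 0-100 scale."""
    return math.sqrt(bleu * accuracy)
```

The geometric mean penalizes a model that excels on one axis while collapsing on the other, which a plain average would not.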

Results from Other Papers


| Task | Dataset | Model | Metric | Value | Rank | Source |
|------|---------|-------|--------|-------|------|--------|
| Unsupervised Text Style Transfer | GYAFC | StyleEmbed | BLEU | 7.9 | #7 | [Fu et al. 2018] |
| Unsupervised Text Style Transfer | GYAFC | MultiDec | BLEU | 12.3 | #6 | [Fu et al. 2018] |
| Unsupervised Text Style Transfer | Yelp | StyleEmbed | BLEU | 42.3 | #5 | [Fu et al. 2018] |
| Unsupervised Text Style Transfer | Yelp | MultiDec | BLEU | 27.9 | #8 | [Fu et al. 2018] |

Methods


No methods listed for this paper.