Semi-Supervised Formality Style Transfer with Consistency Training

ACL 2022  ·  Ao Liu, An Wang, Naoaki Okazaki

Formality style transfer (FST) is the task of paraphrasing an informal sentence into a formal one without altering its meaning. To address the data-scarcity problem of existing parallel datasets, previous studies tend to adopt a cycle-reconstruction scheme to utilize additional unlabeled data, where the FST model mainly benefits from target-side unlabeled sentences. In this work, we propose a simple yet effective semi-supervised framework that makes better use of source-side unlabeled sentences through consistency training. Specifically, our approach augments pseudo-parallel data obtained from a source-side informal sentence by enforcing the model to generate similar outputs for its perturbed version. Moreover, we empirically examine the effects of various data perturbation methods and propose effective data filtering strategies to improve our framework. Experimental results on the GYAFC benchmark demonstrate that our approach achieves state-of-the-art results, even with less than 40% of the parallel data.
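The core idea described above — pairing a perturbed copy of an unlabeled informal sentence with the model's output on the clean copy to form pseudo-parallel data — can be sketched as follows. This is a minimal illustration, not the authors' code: the word-dropout `perturb` and the `consistency_pairs` helper are hypothetical names, and the paper examines several perturbation methods and adds data filtering on top of this basic loop.

```python
import random


def perturb(sentence, drop_prob=0.1, rng=None):
    """Word-dropout perturbation (one of several possible choices):
    randomly remove tokens from the input sentence."""
    rng = rng or random.Random(0)  # fixed seed here for reproducibility
    tokens = sentence.split()
    kept = [t for t in tokens if rng.random() >= drop_prob]
    # Guard against dropping everything from a short sentence.
    return " ".join(kept) if kept else tokens[0]


def consistency_pairs(model, unlabeled_sources):
    """Build pseudo-parallel training pairs from source-side unlabeled data:
    the model's output on the clean informal sentence serves as the target
    for the perturbed version of that sentence."""
    pairs = []
    for src in unlabeled_sources:
        pseudo_target = model(src)             # pseudo-label from clean input
        pairs.append((perturb(src), pseudo_target))
    return pairs
```

In the full framework these pairs would be mixed with the genuine parallel data (and filtered) so that the model is trained to produce consistent formal outputs under input perturbation.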


Results from the Paper


 Ranked #1 on Formality Style Transfer on GYAFC (using extra training data)

Task                      Dataset  Model                 Metric  Value  Global Rank  Uses Extra Training Data
Formality Style Transfer  GYAFC    Consistency Training  BLEU    81.37  # 1          Yes
