Dual Contrastive Learning: Text Classification via Label-Aware Data Augmentation

21 Jan 2022 · Qianben Chen, Richong Zhang, Yaowei Zheng, Yongyi Mao

Contrastive learning has achieved remarkable success in representation learning via self-supervision in unsupervised settings. However, effectively adapting contrastive learning to supervised learning tasks remains a challenge in practice. In this work, we introduce a dual contrastive learning (DualCL) framework that simultaneously learns the features of input samples and the parameters of classifiers in the same space. Specifically, DualCL regards the classifier parameters as augmented samples associated with different labels and then exploits contrastive learning between the input samples and these augmented samples. Empirical studies on five benchmark text classification datasets and their low-resource versions demonstrate that DualCL improves classification accuracy and learns discriminative representations.
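The core idea in the abstract — treating per-label classifier parameters as label-aware augmented samples and contrasting them with input features — can be made concrete with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the authors' released implementation: it assumes the encoder yields one feature vector z per sample and one label-aware classifier vector per class per sample, and uses a supervised InfoNCE-style contrastive objective; the names dual_contrastive_loss and masked_info_nce and the temperature tau are illustrative.

```python
import torch
import torch.nn.functional as F


def masked_info_nce(logits, pos_mask, self_mask):
    # Supervised InfoNCE-style loss: for each anchor (row), average the
    # log-probability of its positives, excluding the anchor itself from
    # the denominator.
    logits = logits.masked_fill(self_mask, float("-inf"))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)  # avoid division by zero
    return -(log_prob * pos_mask.float()).sum(dim=1).div(pos_count).mean()


def dual_contrastive_loss(z, theta, labels, tau=0.1):
    """Sketch of a dual contrastive objective in the spirit of DualCL.

    z:      (B, d)    input-sample features (e.g., a [CLS] embedding)
    theta:  (B, C, d) per-sample classifier vectors, one per label
    labels: (B,)      gold labels
    tau:    temperature (value here is an assumption, not from the paper)
    """
    B = z.size(0)
    z = F.normalize(z, dim=-1)
    theta = F.normalize(theta, dim=-1)

    # theta_pos[i] = classifier vector of sample i for its own gold label.
    idx = torch.arange(B, device=labels.device)
    theta_pos = theta[idx, labels]                       # (B, d)

    # pos_mask[i, j] = True iff samples i and j share a label (j != i).
    same = labels.unsqueeze(0) == labels.unsqueeze(1)    # (B, B)
    eye = torch.eye(B, dtype=torch.bool, device=z.device)
    pos_mask = same & ~eye

    # Direction 1: anchor the feature z_i, contrast against the gold-label
    # classifier vectors theta_j of other samples in the batch.
    loss_z = masked_info_nce(z @ theta_pos.t() / tau, pos_mask, eye)

    # Direction 2: anchor the classifier vector theta_i, contrast against
    # the features z_j of other samples — the "dual" view.
    loss_t = masked_info_nce(theta_pos @ z.t() / tau, pos_mask, eye)

    return 0.5 * (loss_z + loss_t)
```

In this sketch, features and classifier parameters live in the same space, so classification at inference can simply pick the label k whose classifier vector has the largest inner product with z; the 0.5 weighting of the two directions is likewise an assumption for illustration.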


Results from the Paper


Task                   Dataset                      Model           Metric    Value  Global Rank
Sentiment Analysis     SST-2 Binary classification  RoBERTa+DualCL  Accuracy  94.91  #26
Subjectivity Analysis  SUBJ                         RoBERTa+DualCL  Accuracy  97.34  #1
Text Classification    TREC-6                       RoBERTa+DualCL  Error     2.60   #3