BertGCN: Transductive Text Classification by Combining GCN and BERT

12 May 2021 · Yuxiao Lin, Yuxian Meng, Xiaofei Sun, Qinghong Han, Kun Kuang, Jiwei Li, Fei Wu

In this work, we propose BertGCN, a model that combines large-scale pretraining and transductive learning for text classification. BertGCN constructs a heterogeneous graph over the dataset and represents documents as nodes using BERT representations. By jointly training the BERT and GCN modules within BertGCN, the proposed model is able to leverage the advantages of both worlds: large-scale pretraining, which takes advantage of massive amounts of raw data, and transductive learning, which jointly learns representations for both training data and unlabeled test data by propagating label influence through graph convolution. Experiments show that BertGCN achieves state-of-the-art (SOTA) performance on a wide range of text classification datasets. Code is available at https://github.com/ZeroRin/BertGCN.
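The core of the method is the interpolation of the GCN and BERT predictions, Z = λ·Z_GCN + (1 − λ)·Z_BERT, where the GCN operates on BERT [CLS] features of the document nodes. Below is a minimal PyTorch sketch of that idea, not the authors' code: the class names `BertGCNSketch` and `GCNLayer`, the 256-unit hidden width, λ = 0.7, and the toy fully connected graph are illustrative assumptions. The actual model builds a heterogeneous document-word graph (following TextGCN) and uses a memory bank so that BERT and the GCN can be trained jointly at scale, both of which are omitted here for brevity.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class GCNLayer(nn.Module):
    """One graph-convolution step: H' = act(A_hat @ H @ W), where A_hat is a
    symmetrically normalized adjacency matrix precomputed by the caller."""

    def __init__(self, in_dim, out_dim, act=None):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = act

    def forward(self, a_hat, h):
        h = self.linear(a_hat @ h)
        return self.act(h) if self.act is not None else h


class BertGCNSketch(nn.Module):
    """Interpolates a GCN classifier over BERT document features with a
    plain BERT classifier: Z = lam * Z_gcn + (1 - lam) * Z_bert."""

    def __init__(self, num_classes, lam=0.7, bert_name="roberta-base"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        self.gcn1 = GCNLayer(hidden, 256, act=torch.relu)  # illustrative width
        self.gcn2 = GCNLayer(256, num_classes)
        self.bert_clf = nn.Linear(hidden, num_classes)
        self.lam = lam  # tradeoff between the GCN and BERT branches

    def forward(self, a_hat, input_ids, attention_mask):
        # BERT [CLS] vectors serve as the document-node features.
        cls = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state[:, 0]
        z_gcn = torch.softmax(self.gcn2(a_hat, self.gcn1(a_hat, cls)), dim=-1)
        z_bert = torch.softmax(self.bert_clf(cls), dim=-1)
        return self.lam * z_gcn + (1 - self.lam) * z_bert


# Toy usage: 3 documents on a fully connected, pre-normalized graph.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
docs = ["a positive review", "a negative review", "an unlabeled test document"]
batch = tokenizer(docs, padding=True, return_tensors="pt")
a_hat = torch.full((3, 3), 1.0 / 3)  # stand-in for the normalized graph
model = BertGCNSketch(num_classes=2)
probs = model(a_hat, batch["input_ids"], batch["attention_mask"])
print(probs.shape)  # torch.Size([3, 2])
```

Because unlabeled test documents sit in the same graph as training documents, graph convolution lets label information flow to them during training, which is what makes the setup transductive; λ controls how much of the final prediction comes from that propagated signal versus from BERT alone.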


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Text Classification | 20NEWS | RoBERTaGCN | Accuracy | 89.5 | #2 |
| Text Classification | 20 Newsgroups | RoBERTaGCN | Accuracy | 89.5 | #1 |
| Text Classification | MR | RoBERTaGCN | Accuracy | 89.7 | #3 |
| Text Classification | Ohsumed | RoBERTaGCN | Accuracy | 72.8 | #1 |
| Text Classification | R52 | BertGCN | Accuracy | 96.6 | #1 |
| Text Classification | R8 | RoBERTaGCN | Accuracy | 98.2 | #4 |
