Every Node Counts: Self-Ensembling Graph Convolutional Networks for Semi-Supervised Learning

26 Sep 2018  ·  Yawei Luo, Tao Guan, Junqing Yu, Ping Liu, Yi Yang ·

The graph convolutional network (GCN) provides a powerful means for graph-based semi-supervised tasks. However, as a localized first-order approximation of spectral graph convolution, the classic GCN cannot take full advantage of unlabeled data, especially when an unlabeled node is far from the labeled ones. To capitalize on the information from unlabeled nodes and boost the training of GCN, we propose a novel framework named Self-Ensembling GCN (SEGCN), which marries GCN with Mean Teacher, another powerful model in semi-supervised learning. SEGCN contains a student model and a teacher model. As a student, it not only learns to correctly classify the labeled nodes, but also tries to be consistent with the teacher on unlabeled nodes under more challenging conditions, such as a high dropout rate and graph collapse. As a teacher, it averages the student model weights and generates more accurate predictions to guide the student. In this mutually promoting process, both labeled and unlabeled samples can be fully utilized to backpropagate effective gradients for training GCN. On three article classification benchmarks, i.e., CiteSeer, Cora and PubMed, we validate that the proposed method matches the state of the art in classification accuracy.
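The student/teacher interplay described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: `ema_update` is the Mean Teacher weight average, and `segcn_loss` combines supervised cross-entropy on labeled nodes with a consistency penalty between student and teacher predictions on all nodes (the exact consistency loss and coefficient `lam` are assumptions for illustration).

```python
import numpy as np

def ema_update(teacher_params, student_params, alpha=0.99):
    """Teacher weights are an exponential moving average of student weights."""
    return {k: alpha * teacher_params[k] + (1 - alpha) * student_params[k]
            for k in teacher_params}

def segcn_loss(student_logits, teacher_logits, labels, labeled_mask, lam=1.0):
    """Supervised cross-entropy on labeled nodes plus a consistency
    (mean-squared-error) term between student and teacher predictions
    on every node, labeled or not."""
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p_s, p_t = softmax(student_logits), softmax(teacher_logits)
    # cross-entropy only over the labeled nodes
    ce = -np.log(p_s[labeled_mask, labels[labeled_mask]] + 1e-12).mean()
    # consistency over all nodes lets unlabeled data contribute gradients
    consistency = ((p_s - p_t) ** 2).mean()
    return ce + lam * consistency
```

In training, the student would be updated by gradient descent on this loss (with strong perturbations such as high dropout applied to its inputs), while the teacher is updated only through `ema_update` after each step.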

Task                Dataset                                                  Model  Accuracy      Global Rank
Node Classification CiteSeer with Public Split: fixed 20 nodes per class    SEGCN  73.4 ± 0.7    # 14
Node Classification Cora                                                    SEGCN  83.5% ± 0.4%  # 39
Node Classification Cora: fixed 20 nodes per class                          SEGCN  83.5 ± 0.4    # 4
Node Classification PubMed with Public Split: fixed 20 nodes per class      SEGCN  78.9 ± 0.7    # 23
