Simple and Deep Graph Convolutional Networks

Graph convolutional networks (GCNs) are a powerful deep learning approach for graph-structured data. Recently, GCNs and subsequent variants have shown superior performance in various application areas on real-world datasets. Despite their success, most current GCN models are shallow due to the {\em over-smoothing} problem. In this paper, we study the problem of designing and analyzing deep graph convolutional networks. We propose GCNII, an extension of the vanilla GCN model with two simple yet effective techniques: {\em initial residual} and {\em identity mapping}. We provide theoretical and empirical evidence that the two techniques effectively relieve the problem of over-smoothing. Our experiments show that the deep GCNII model outperforms the state-of-the-art methods on various semi- and full-supervised tasks. Code is available at https://github.com/chennnM/GCNII.
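For concreteness, below is a minimal NumPy sketch of the GCNII propagation rule, H^{(l+1)} = sigma(((1 - alpha_l) P_hat H^{(l)} + alpha_l H^{(0)})((1 - beta_l) I + beta_l W^{(l)})) with beta_l = ln(lambda / l + 1), where P_hat is the renormalized adjacency. The function and variable names (gcnii_layer, alpha, lam, the random toy graph) are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def gcnii_layer(H, H0, A_hat, W, alpha, beta):
    """One GCNII-style propagation step (illustrative sketch).

    H     : current representation H^(l), shape (n, d)
    H0    : initial representation H^(0) from the input projection, shape (n, d)
    A_hat : renormalized adjacency D^{-1/2}(A + I)D^{-1/2}, shape (n, n)
    W     : layer weight matrix W^(l), shape (d, d)
    alpha : initial-residual coefficient
    beta  : identity-mapping coefficient for this layer
    """
    # Initial residual: mix the propagated signal with the initial representation H^(0).
    support = (1.0 - alpha) * (A_hat @ H) + alpha * H0
    # Identity mapping: interpolate the weight matrix toward the identity.
    out = support @ ((1.0 - beta) * np.eye(W.shape[0]) + beta * W)
    return np.maximum(out, 0.0)  # ReLU

# Stacking many layers: beta_l = ln(lambda / l + 1) decays with depth,
# so deeper layers stay close to an identity transform.
rng = np.random.default_rng(0)
n, d, depth, alpha, lam = 5, 4, 16, 0.1, 0.5
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                       # symmetric toy adjacency
A_hat = A + np.eye(n)                        # add self-loops
deg = A_hat.sum(axis=1)
A_hat = A_hat / np.sqrt(np.outer(deg, deg))  # D^{-1/2} (A + I) D^{-1/2}

H0 = rng.standard_normal((n, d))             # stands in for the input MLP output
H = H0
for l in range(1, depth + 1):
    beta = np.log(lam / l + 1.0)
    H = gcnii_layer(H, H0, A_hat, rng.standard_normal((d, d)) * 0.1, alpha, beta)
print(H.shape)  # (5, 4)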

ICML 2020
Task | Dataset | Model | Metric | Value | Global Rank
Node Classification | Citeseer (full-supervised) | GCNII* | Accuracy | 77.13% | #5
Node Classification | CiteSeer (public split: fixed 20 nodes per class) | GCNII | Accuracy | 73.4% | #9
Node Classification | Cora (full-supervised) | GCNII | Accuracy | 88.49% | #1
Node Classification | Cora (public split: fixed 20 nodes per class) | GCNII | Accuracy | 85.5% | #1
Node Property Prediction | ogbn-arxiv | GCNII | Test Accuracy | 0.7274 ± 0.0016 | #35
Node Property Prediction | ogbn-arxiv | GCNII | Validation Accuracy | – | #57
Node Property Prediction | ogbn-arxiv | GCNII | Number of params | 2,148,648 | #7
Node Property Prediction | ogbn-arxiv | GCNII | External data | No | #1
Node Classification | PPI | GCNII* | F1 | 99.56 | #1
Node Classification | Pubmed (full-supervised) | GCNII* | Accuracy | 90.30% | #4
Node Classification | PubMed (public split: fixed 20 nodes per class) | GCNII | Accuracy | 80.2% | #12