AdaGCN: Adaboosting Graph Convolutional Networks into Deep Models

ICLR 2021 · Ke Sun, Zhanxing Zhu, Zhouchen Lin

The design of deep graph models remains underexplored, and the crucial question is how to explore and exploit the knowledge from different hops of neighbors in an efficient way. In this paper, we propose a novel RNN-like deep graph neural network architecture by incorporating AdaBoost into the computation of the network. The proposed model, called AdaGCN (Adaboosting Graph Convolutional Network), efficiently extracts knowledge from high-order neighbors of the current nodes and then integrates knowledge from different hops of neighbors into the network in an AdaBoost way. Unlike other graph neural networks that directly stack many graph convolution layers, AdaGCN shares the same base neural network architecture among all "layers" and is recursively optimized, similar to an RNN. We also theoretically establish the connection between AdaGCN and existing graph convolutional methods, showing the benefits of our proposal. Finally, extensive experiments demonstrate the consistent state-of-the-art prediction performance of AdaGCN on graphs across different label rates, as well as its computational advantage. (Code is available at https://github.com/datake/AdaGCN.)
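The AdaBoost-style combination of hops described above can be sketched as follows. This is an illustrative re-implementation of the idea, not the authors' code: it uses a SAMME-style multi-class AdaBoost update, a plain softmax-regression base classifier in place of the paper's shared base neural network, and the standard symmetric GCN normalization. All function and class names here are our own.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric GCN normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

class SoftmaxBase:
    """Minimal weighted softmax-regression base classifier (stand-in for
    the shared base network; trained by plain gradient descent)."""
    def fit(self, X, y, w, K, steps=200, lr=0.5):
        self.W = np.zeros((X.shape[1], K))
        Y = np.eye(K)[y]
        for _ in range(steps):
            P = self.predict_proba(X)
            # Sample-weighted cross-entropy gradient (w sums to 1)
            self.W -= lr * (X.T @ ((P - Y) * w[:, None]))
        return self

    def predict_proba(self, X):
        Z = X @ self.W
        Z -= Z.max(axis=1, keepdims=True)   # numerical stability
        E = np.exp(Z)
        return E / E.sum(axis=1, keepdims=True)

    def predict(self, X):
        return self.predict_proba(X).argmax(axis=1)

def adagcn_fit(A, X, y, L=3, K=None):
    """Boost one base classifier per hop: layer l sees the l-times
    propagated features A_hat^l X (knowledge from l-hop neighbors)."""
    K = K or int(y.max()) + 1
    A_hat = normalize_adj(A)
    w = np.ones(len(y)) / len(y)            # AdaBoost sample weights
    H, ensemble = X.copy(), []
    for _ in range(L):
        clf = SoftmaxBase().fit(H, y, w, K)
        miss = clf.predict(H) != y
        err = max(w[miss].sum(), 1e-10)
        alpha = np.log((1 - err) / err) + np.log(K - 1)  # SAMME weight
        w *= np.exp(alpha * miss)           # upweight misclassified nodes
        w /= w.sum()
        ensemble.append((alpha, clf))
        H = A_hat @ H                       # one more hop of aggregation
    return A_hat, ensemble

def adagcn_predict(A_hat, X, ensemble, K):
    # Replay the propagation; combine per-hop classifiers by their alphas
    H, scores = X.copy(), np.zeros((X.shape[0], K))
    for alpha, clf in ensemble:
        scores += alpha * clf.predict_proba(H)
        H = A_hat @ H
    return scores.argmax(axis=1)
```

Note that, as in the paper's design, depth is added by propagating features one more hop and re-fitting the same base model, rather than by stacking distinct graph convolution layers.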


Results from the Paper


Task                 | Dataset     | Model                 | Accuracy      | Global Rank
---------------------|-------------|-----------------------|---------------|------------
Node Classification  | Citeseer    | AdaGCN                | 76.22 ± 0.20  | #16
Node Classification  | Cora        | AdaGCN                | 85.46 ± 0.25  | #24
Node Classification  | MS Academic | APPNP (AdaGCN authors)| 92.98 ± 0.07  | #2
Node Classification  | MS Academic | AdaGCN                | 92.87 ± 0.07  | #3
Node Classification  | Pubmed      | AdaGCN                | 79.76 ± 0.27  | #37

Methods