Universal Deep GNNs: Rethinking Residual Connection in GNNs from a Path Decomposition Perspective for Preventing the Over-smoothing

30 May 2022  ·  Jie Chen, Weiqi Liu, Zhizhong Huang, Junbin Gao, Junping Zhang, Jian Pu

The performance of GNNs degrades as they become deeper due to over-smoothing. Among the attempts to prevent over-smoothing, the residual connection is one of the most promising methods because of its simplicity. However, recent studies have shown that GNNs with residual connections only slightly slow down the degeneration, and the reason residual connections fail in GNNs remains unknown. In this paper, we investigate the forward and backward behavior of GNNs with residual connections from a novel path decomposition perspective. We find that the recursive aggregation along the median-length paths in the binomial distribution of residual-connection paths dominates the output representation, resulting in over-smoothing as GNNs go deeper. Entangled propagation and weight matrices cause gradient smoothing and prevent GNNs with residual connections from optimizing toward the identity mapping. Based on these findings, we present a Universal Deep GNNs (UDGNN) framework with cold-start adaptive residual connections (DRIVE) and feedforward modules. Extensive experiments demonstrate the effectiveness of our method, which achieves state-of-the-art results on non-smooth heterophily datasets by simply stacking standard GNNs.
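To make the abstract's claim concrete, here is a minimal pure-Python sketch (not the paper's method) of repeated neighbor averaging on a toy graph, with and without a residual connection. The 4-node cycle graph, the 0.5 mixing weight, and all function names are illustrative assumptions; the sketch only shows that plain aggregation collapses node features to a common value, while a residual connection slows, but does not stop, that collapse.

```python
def propagate(x, adj):
    # One GCN-style layer: replace each node's feature with the
    # mean over its neighborhood (self-loop included in adj).
    return [sum(x[j] for j in adj[i]) / len(adj[i]) for i in range(len(x))]

def run(x, adj, depth, residual=False):
    # Stack `depth` layers; with residual=True, mix the aggregated
    # features with the previous layer's features (0.5/0.5 is an assumption).
    for _ in range(depth):
        h = propagate(x, adj)
        x = [0.5 * a + 0.5 * b for a, b in zip(h, x)] if residual else h
    return x

def spread(x):
    # Feature variance across nodes: near zero means the node
    # representations have become indistinguishable (over-smoothed).
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

# Toy 4-node cycle; adj[i] lists node i's neighbors plus i itself.
adj = {0: [0, 1, 3], 1: [1, 0, 2], 2: [2, 1, 3], 3: [3, 2, 0]}
x0 = [1.0, 0.0, 0.0, 0.0]
```

Running `spread(run(x0, adj, depth))` for growing `depth` shows the variance shrinking toward zero in both settings; the residual variant stays larger at any fixed depth but still vanishes as the network deepens, mirroring the observation that residual connections merely slow the degeneration.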


Results from the Paper


Task                  Dataset     Model         Metric     Value          Global Rank
Node Classification   Actor       UDGNN (GCN)   Accuracy   36.13 ± 1.21   # 29
Node Classification   Chameleon   UDGNN (GCN)   Accuracy   74.53 ± 1.19   # 12
Node Classification   Cornell     UDGNN (GCN)   Accuracy   84.32 ± 7.29   # 23
Node Classification   Squirrel    UDGNN (GCN)   Accuracy   68.13 ± 2.59   # 9
Node Classification   Texas       UDGNN (GCN)   Accuracy   84.60 ± 5.32   # 28
Node Classification   Wisconsin   UDGNN (GCN)   Accuracy   87.64 ± 3.74   # 21