Towards Feature Overcorrelation in Deeper Graph Neural Networks

29 Sep 2021  ·  Wei Jin, Xiaorui Liu, Yao Ma, Charu Aggarwal, Jiliang Tang ·

Graph neural networks (GNNs) have achieved great success in graph representation learning, which has tremendously facilitated various real-world applications. Nevertheless, the performance of GNNs deteriorates significantly as depth increases. Recent research has attributed this phenomenon to the oversmoothing issue, which indicates that the learned node representations become highly indistinguishable. In this paper, we observe a new issue in deeper GNNs, i.e., feature overcorrelation, and perform a thorough study to deepen our understanding of this issue. In particular, we demonstrate the existence of feature overcorrelation in deeper GNNs, reveal potential reasons leading to this issue, and validate that, though related, overcorrelation and oversmoothing exhibit different patterns. Since feature overcorrelation indicates that GNNs encode less information and can harm downstream tasks, it is of great significance to mitigate it. Therefore, we propose DeCorr, a general framework that effectively reduces feature correlation in deeper GNNs. Experimental results on various datasets demonstrate that DeCorr helps train deeper GNNs effectively and is complementary to methods tackling oversmoothing.
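The abstract's notion of feature overcorrelation can be made concrete with a simple diagnostic: the average absolute pairwise Pearson correlation between the feature dimensions of a node-embedding matrix. The sketch below is illustrative only, not the paper's exact metric or DeCorr itself; the graph, normalization, and propagation scheme are assumptions chosen to show how repeated (untrained) neighborhood averaging drives dimensions toward redundancy as depth grows.

```python
import numpy as np

def feature_correlation(X):
    """Average absolute pairwise Pearson correlation between the
    feature dimensions (columns) of a node-embedding matrix X of
    shape (n_nodes, d). Values near 1 mean the learned dimensions
    are highly redundant, i.e. overcorrelated."""
    Xc = X - X.mean(axis=0, keepdims=True)
    Xc = Xc / (np.linalg.norm(Xc, axis=0, keepdims=True) + 1e-12)
    corr = Xc.T @ Xc                       # d x d correlation matrix
    d = corr.shape[1]
    off_diag = np.abs(corr[~np.eye(d, dtype=bool)])
    return off_diag.mean()

# Hypothetical setup: a random undirected graph with self-loops and
# a row-normalized adjacency, i.e. each propagation step averages a
# node's features with its neighbors'.
rng = np.random.default_rng(0)
n, d = 100, 16
A = (rng.random((n, n)) < 0.05)
A = np.maximum(A, A.T).astype(float) + np.eye(n)   # symmetrize, add self-loops
A_hat = A / A.sum(axis=1, keepdims=True)           # row-stochastic propagation

X = rng.standard_normal((n, d))
shallow = feature_correlation(X)       # random features: low correlation
for _ in range(20):                    # simulate a 20-layer propagation stack
    X = A_hat @ X
deep = feature_correlation(X)          # after deep propagation: much higher
```

Running this, `deep` exceeds `shallow` by a wide margin: as the row-stochastic propagation mixes the graph, every column converges toward the same stationary profile, so the columns become nearly linearly dependent. This is the redundancy that a decorrelation penalty, as proposed in the paper, is meant to suppress.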
