Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View

7 Sep 2019  ·  Deli Chen, Yankai Lin, Wei Li, Peng Li, Jie Zhou, Xu Sun

Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based tasks. Despite their success, one severe limitation of GNNs is the over-smoothing issue (indistinguishable representations of nodes in different classes). In this work, we present a systematic and quantitative study on the over-smoothing issue of GNNs. First, we introduce two quantitative metrics, MAD and MADGap, to measure the smoothness and over-smoothness of graph node representations, respectively. Then, we verify that smoothing is intrinsic to GNNs, and that the critical factor leading to over-smoothness is the low information-to-noise ratio of the messages received by the nodes, which is partially determined by the graph topology. Finally, we propose two methods to alleviate the over-smoothing issue from the topological view: (1) MADReg, which adds a MADGap-based regularizer to the training objective; (2) AdaGraph, which optimizes the graph topology based on the model predictions. Extensive experiments on 7 widely used graph datasets with 10 typical GNN models show that the two proposed methods effectively relieve the over-smoothing issue and thus improve the performance of various GNN models.
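To make the two metrics concrete, here is a minimal NumPy sketch of MAD (mean average cosine distance over a selected set of node pairs) and MADGap (remote-pair MAD minus neighbor-pair MAD), following the abstract's description. The mask-based interface and the choice of how to split pairs into "neighbor" and "remote" sets are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def mad(h, mask):
    """Mean Average Distance (MAD) over the node pairs selected by mask.

    h:    (n, d) array of node representations.
    mask: (n, n) 0/1 array; mask[i, j] = 1 keeps pair (i, j).
    """
    norm = np.linalg.norm(h, axis=1, keepdims=True)
    unit = h / np.maximum(norm, 1e-12)       # row-normalize representations
    dist = (1.0 - unit @ unit.T) * mask      # masked pairwise cosine distance
    row_sum = dist.sum(axis=1)
    row_cnt = np.count_nonzero(dist, axis=1)
    valid = row_cnt > 0                      # average only rows with selected pairs
    return float((row_sum[valid] / row_cnt[valid]).mean())

def mad_gap(h, neighbor_mask, remote_mask):
    """MADGap = MAD over remote pairs minus MAD over neighboring pairs.

    A large positive gap means neighbors are smoothed together while distant
    nodes stay distinguishable; a small or negative gap signals over-smoothing.
    How the neighbor/remote split is built (e.g. by hop distance thresholds)
    is an assumption left to the caller in this sketch.
    """
    return mad(h, remote_mask) - mad(h, neighbor_mask)
```

Per the abstract, MADReg would then subtract a weighted MADGap term from the training loss to penalize over-smoothing; the masks and any weighting coefficient would be hyperparameters chosen per dataset.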


Results from the Paper


Task                 Dataset   Model                Metric    Value        Global Rank
Node Classification  Citeseer  GCN + AdaGraph (AG)  Accuracy  69.7%        # 63
Node Classification  Cora      GCN + AdaGraph (AG)  Accuracy  82.3%        # 52
Node Classification  Pubmed    GCN + AdaGraph (AG)  Accuracy  77.4 ± 0.2   # 57
