FDGATII : Fast Dynamic Graph Attention with Initial Residual and Identity Mapping

21 Oct 2021  ·  Gayan K. Kulatilleke, Marius Portmann, Ryan Ko, Shekhar S. Chandra ·

While Graph Neural Networks (GNNs) have gained popularity in multiple domains, graph-structured input remains a major challenge due to (a) over-smoothing, (b) noisy neighbours (heterophily), and (c) the suspended animation problem. To address all these problems simultaneously, we propose FDGATII, a novel graph neural network inspired by the attention mechanism's ability to focus on selective information, supplemented with two feature-preserving mechanisms. FDGATII combines initial residuals and identity mapping with the more expressive dynamic self-attention to handle the noise prevalent in the neighbourhoods of heterophilic datasets. By using sparse dynamic attention, FDGATII is inherently parallelizable in design and efficient in operation, and thus theoretically able to scale to arbitrary graphs with ease. We evaluate FDGATII extensively on 7 datasets and show that it outperforms GAT- and GCN-based benchmarks in accuracy and performance on fully supervised tasks, obtaining state-of-the-art results on the Chameleon and Cornell datasets with zero domain-specific graph pre-processing, and demonstrating its versatility and fairness.
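The abstract names the layer's three ingredients: dynamic self-attention, an initial residual, and an identity mapping. A minimal NumPy sketch of how such a layer could combine them is shown below, assuming GATv2-style dynamic attention (nonlinearity applied before the attention vector) and GCNII-style initial-residual and identity-mapping terms; the function and parameter names, and the default `alpha`/`beta` values, are illustrative and not taken from the paper.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def masked_softmax(scores, mask):
    # Softmax over each row, restricted to graph neighbours (mask == True).
    scores = np.where(mask, scores, -np.inf)
    scores = scores - scores.max(axis=1, keepdims=True)
    e = np.exp(scores) * mask
    return e / e.sum(axis=1, keepdims=True)

def fdgatii_like_layer(H, H0, A, W_att, a, W, alpha=0.1, beta=0.5):
    """One illustrative layer combining:
      - dynamic attention (GATv2 style): LeakyReLU BEFORE the attention vector `a`,
      - initial residual: mix in alpha * H0 (the input features),
      - identity mapping: weight matrix (1-beta)*I + beta*W.
    H: (N, d) current features; H0: (N, d) initial features;
    A: (N, N) adjacency with self-loops; W_att: (d, d'); a: (d',); W: (d, d)."""
    N, d = H.shape
    src = H @ W_att                                           # (N, d')
    scores = leaky_relu(src[:, None, :] + src[None, :, :]) @ a  # (N, N)
    att = masked_softmax(scores, A > 0)
    P = att @ H                                               # attention-weighted aggregation
    mix = (1 - alpha) * P + alpha * H0                        # initial residual
    out = mix @ ((1 - beta) * np.eye(d) + beta * W)           # identity mapping
    return np.maximum(out, 0)                                 # ReLU
```

The initial residual keeps a fraction of the raw input features at every depth, and the identity mapping keeps the layer transform close to the identity; together these are the two "feature preserving mechanisms" that counteract over-smoothing, while the masked dynamic attention downweights noisy heterophilic neighbours.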

| Task | Dataset | Model | Metric | Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Node Classification | Chameleon | FDGATII | Accuracy | 65.1754% | #41 |
| Node Classification | Citeseer (full-supervised) | FDGATII | Accuracy | 75.6434% | #5 |
| Node Classification | Cora (full-supervised) | FDGATII | Accuracy | 87.7867% | #3 |
| Node Classification | Cornell | FDGATII | Accuracy | 82.4324% | #30 |
| Node Classification | Pubmed (full-supervised) | FDGATII | Accuracy | 90.3524% | #3 |
| Node Classification | Texas | FDGATII | Accuracy | 80.5405% | #42 |
| Node Classification | Wisconsin | FDGATII | Accuracy | 86.2745% | #32 |

Methods