Node Classification on Non-Homophilic (Heterophilic) Graphs

28 papers with code • 15 benchmarks • 15 datasets

There exists a non-trivial set of graphs on which graph-aware models underperform their graph-agnostic counterparts, e.g., a 1-layer SGC or a 2-layer GCN can underperform an MLP of the same depth. Although the question is still debated, this performance degradation is widely attributed to heterophily, i.e., the graph having many more inter-class edges than intra-class edges. This task aims to evaluate models designed for non-homophilic (heterophilic) datasets.
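Heterophily is commonly quantified by the edge homophily ratio: the fraction of edges whose endpoints share a label. A minimal sketch (the graph, labels, and function name below are illustrative, not from any specific benchmark):

```python
def edge_homophily(edges, labels):
    """Fraction of edges connecting same-label nodes; low values => heterophily."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: 4 nodes, two classes, half the edges cross class boundaries
labels = [0, 0, 1, 1]
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
print(edge_homophily(edges, labels))  # 0.5
```

Datasets in this task typically have ratios well below those of classic citation benchmarks, which is what stresses purely local aggregation.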


Most implemented papers

Large Scale Learning on Non-Homophilous Graphs: New Benchmarks and Strong Simple Methods

cuai/non-homophily-large-scale NeurIPS 2021

Many widely used datasets for graph machine learning tasks have generally been homophilous, where nodes with similar labels connect to each other.

Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks

PwnerHarry/Stronger_GCN NeurIPS 2019

Recently, neural network based approaches have achieved significant improvement for solving large, complex, graph-structured problems.

Non-Local Graph Neural Networks

divelab/Non-Local-GNN 29 May 2020

Modern graph neural networks (GNNs) learn node embeddings through multilayer local aggregation and achieve great success in applications on assortative graphs.

Adaptive Universal Generalized PageRank Graph Neural Network

jianhao2016/GPRGNN ICLR 2021

We address these issues by introducing a new Generalized PageRank (GPR) GNN architecture that adaptively learns the GPR weights so as to jointly optimize node feature and topological information extraction, regardless of the extent to which the node labels are homophilic or heterophilic.
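The core idea can be sketched as repeated propagation through a normalized adjacency, accumulated with learnable per-hop weights. This is an illustrative NumPy sketch of the Generalized PageRank combination, assuming fixed (rather than learned) weights `gammas`; names here are not from the GPR-GNN codebase:

```python
import numpy as np

def gpr_propagate(A_hat, X, gammas):
    """Combine k-hop propagations of features X with weights gamma_k."""
    Z = gammas[0] * X  # gamma_0 weights the un-propagated features
    H = X
    for gamma in gammas[1:]:
        H = A_hat @ H      # one propagation step over the normalized adjacency
        Z = Z + gamma * H  # weighted accumulation across hops
    return Z

# Toy 2-node graph; in GPR-GNN the weights would be trained, possibly negative,
# which lets the filter adapt to heterophilic label patterns.
A_hat = np.array([[0.0, 1.0], [1.0, 0.0]])
X = np.array([[1.0], [0.0]])
Z = gpr_propagate(A_hat, X, [0.5, 0.3, 0.2])
```

Allowing the weights to take negative values is what gives such filters high-pass behavior, which helps when neighboring labels disagree.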

Beyond Low-frequency Information in Graph Convolutional Networks

bdy9527/FAGCN 4 Jan 2021

For a deeper understanding, we theoretically analyze the roles of low-frequency signals and high-frequency signals on learning node representations, which further explains why FAGCN can perform well on different types of networks.

Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph Convolutional Neural Networks

yujun-yan/heterophily_and_oversmoothing 12 Feb 2021

We are the first to take a unified perspective to jointly explain the oversmoothing and heterophily problems at the node level.

New Benchmarks for Learning on Non-Homophilous Graphs

CUAI/Non-Homophily-Benchmarks 3 Apr 2021

Much data with graph structures satisfy the principle of homophily, meaning that connected nodes tend to be similar with respect to a specific attribute.

Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns

susheels/gnns-and-local-assortativity 11 Jun 2021

We find that the prediction performance of a wide range of GNN models is highly correlated with node-level assortativity.

BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation

ivam-he/BernNet NeurIPS 2021

Many representative graph neural networks, e.g., GPR-GNN and ChebNet, approximate graph convolutions with graph spectral filters.
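A spectral filter in the Bernstein basis is a weighted sum of Bernstein polynomials over the (rescaled) Laplacian eigenvalue range. A minimal sketch of evaluating such a filter at a single eigenvalue, assuming eigenvalues lie in [0, 2] as for a normalized Laplacian (function and variable names are illustrative):

```python
from math import comb

def bernstein_filter(lam, thetas):
    """Evaluate a degree-K Bernstein-basis filter at eigenvalue lam in [0, 2]."""
    K = len(thetas) - 1
    x = lam / 2.0  # map [0, 2] -> [0, 1], the Bernstein domain
    return sum(t * comb(K, k) * (1 - x) ** (K - k) * x ** k
               for k, t in enumerate(thetas))

# With all coefficients equal to 1, the basis sums to 1 at every eigenvalue
# (partition of unity), i.e., an all-pass filter.
print(bernstein_filter(1.3, [1.0, 1.0, 1.0]))  # ≈ 1.0
```

Constraining the coefficients (e.g., to be non-negative) is what makes the learned filter's shape easy to interpret and control.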

Deformable Graph Convolutional Networks

mlvlab/DeformableGCN 29 Dec 2021

To address the two common problems of graph convolution, in this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces and capture short/long-range dependencies between nodes.