Node Classification on Non-Homophilic (Heterophilic) Graphs
28 papers with code • 15 benchmarks • 15 datasets
There exists a non-trivial set of graphs on which graph-aware models underperform their graph-agnostic counterparts, e.g., SGC and GCN underperforming 1-layer and 2-layer MLPs. Although the explanation is still debated, the performance degradation is commonly attributed to heterophily, i.e., the graph containing many more inter-class edges than intra-class edges. This task aims to evaluate models designed for non-homophilic (heterophilic) datasets.
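The degree of heterophily is often summarized by the edge homophily ratio: the fraction of edges that connect same-class nodes. A minimal sketch, using a hypothetical toy edge list and labels (not from any benchmark dataset):

```python
import numpy as np

# Hypothetical toy graph: undirected edges as (u, v) pairs, one label per node.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
labels = np.array([0, 0, 1, 1, 0])

def edge_homophily(edges, labels):
    """Fraction of edges joining same-class nodes (edge homophily ratio).
    Values near 0 indicate heterophily: most edges are inter-class."""
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)

print(edge_homophily(edges, labels))  # 3 of 5 edges are intra-class -> 0.6
```

Homophilous citation graphs such as Cora score well above 0.5 on this ratio, while the heterophilic datasets targeted by this task score far below it.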
Libraries
Use these libraries to find Node Classification on Non-Homophilic (Heterophilic) Graphs models and implementations
Most implemented papers
Large Scale Learning on Non-Homophilous Graphs: New Benchmarks and Strong Simple Methods
Many widely used datasets for graph machine learning tasks have generally been homophilous, where nodes with similar labels connect to each other.
Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks
Recently, neural network based approaches have achieved significant improvement for solving large, complex, graph-structured problems.
Non-Local Graph Neural Networks
Modern graph neural networks (GNNs) learn node embeddings through multilayer local aggregation and achieve great success in applications on assortative graphs.
Adaptive Universal Generalized PageRank Graph Neural Network
We address these issues by introducing a new Generalized PageRank (GPR) GNN architecture that adaptively learns the GPR weights so as to jointly optimize node feature and topological information extraction, regardless of the extent to which the node labels are homophilic or heterophilic.
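The core of Generalized PageRank propagation is a weighted sum of powers of the normalized adjacency applied to the node features. A minimal sketch with fixed weights (in GPR-GNN the gamma coefficients are trainable; the adjacency and features below are hypothetical toy values):

```python
import numpy as np

# Toy adjacency with self-loops, symmetrically normalized:
# A_hat = D^{-1/2} (A + I) D^{-1/2}.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
A_tilde = A + np.eye(3)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # hypothetical node features

def gpr_propagate(A_hat, X, gammas):
    """Generalized PageRank propagation: sum_k gammas[k] * A_hat^k @ X.
    Learnable gammas let the model weight near and far hops adaptively,
    which is what makes the scheme usable under heterophily."""
    H, out = X.copy(), gammas[0] * X
    for g in gammas[1:]:
        H = A_hat @ H          # one more propagation step
        out = out + g * H      # accumulate the weighted hop
    return out

Z = gpr_propagate(A_hat, X, gammas=[0.5, 0.3, 0.2])
print(Z.shape)  # (3, 2)
```

With a single coefficient `gammas=[1.0]` the propagation reduces to the raw features, i.e., a graph-agnostic model.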
Beyond Low-frequency Information in Graph Convolutional Networks
For a deeper understanding, we theoretically analyze the roles of low-frequency signals and high-frequency signals on learning node representations, which further explains why FAGCN can perform well on different types of networks.
Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph Convolutional Neural Networks
We are the first to take a unified perspective to jointly explain the oversmoothing and heterophily problems at the node level.
New Benchmarks for Learning on Non-Homophilous Graphs
Much data with graph structures satisfy the principle of homophily, meaning that connected nodes tend to be similar with respect to a specific attribute.
Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns
We find that the prediction performance of a wide range of GNN models is highly correlated with the node level assortativity.
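Node-level assortativity can be proxied by a local homophily score: the fraction of a node's neighbors that share its label. A minimal sketch on a hypothetical toy graph (the adjacency lists and labels are illustrative only):

```python
# Hypothetical neighbor lists and labels; node-level homophily is the fraction
# of a node's neighbors sharing its label, a simple local assortativity proxy.
neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
labels = {0: 0, 1: 0, 2: 1, 3: 1}

def node_homophily(v):
    """Fraction of v's neighbors with the same label as v."""
    nbrs = neighbors[v]
    return sum(labels[u] == labels[v] for u in nbrs) / len(nbrs)

print([node_homophily(v) for v in neighbors])  # [0.5, 1.0, 0.5, 1.0]
```

The paper's observation is that standard GNNs predict well for nodes with high local scores and poorly for the rest, which motivates rewiring the graph to raise assortativity.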
BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation
Many representative graph neural networks, e. g., GPR-GNN and ChebNet, approximate graph convolutions with graph spectral filters.
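A Bernstein-basis spectral filter expresses the filter polynomial in the Bernstein basis over the Laplacian's spectrum [0, 2], so non-negative coefficients yield a valid non-negative frequency response. A simplified dense-matrix sketch of this construction (the toy Laplacian and coefficients are hypothetical; real implementations use sparse recurrences):

```python
import numpy as np
from math import comb
from numpy.linalg import matrix_power

def bernstein_filter(L, thetas):
    """Bernstein-basis spectral filter (BernNet-style, simplified):
    p(L) = sum_k thetas[k] * C(K,k)/2^K * (2I - L)^(K-k) @ L^k,
    where L is a normalized Laplacian with eigenvalues in [0, 2]."""
    K = len(thetas) - 1
    I = np.eye(L.shape[0])
    out = np.zeros_like(L)
    for k, theta in enumerate(thetas):
        out += (theta * comb(K, k) / 2**K
                * matrix_power(2 * I - L, K - k) @ matrix_power(L, k))
    return out

# Toy normalized Laplacian of a 2-node path graph: eigenvalues 0 and 2.
L = np.array([[1.0, -1.0], [-1.0, 1.0]])
# All-ones coefficients reproduce the identity filter p(lambda) = 1,
# since the Bernstein basis sums to ((2I - L) + L)^K / 2^K = I.
print(np.allclose(bernstein_filter(L, [1.0, 1.0, 1.0]), np.eye(2)))  # True
```

Learning the `thetas` lets the model realize low-pass, high-pass, or band-pass responses, which is what makes such filters flexible across homophilic and heterophilic graphs.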
Deformable Graph Convolutional Networks
To address the two common problems of graph convolution, in this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces and capture short/long-range dependencies between nodes.