Graph Classification by Mixture of Diverse Experts

29 Mar 2021 · Fenyu Hu, Liping Wang, Shu Wu, Liang Wang, Tieniu Tan

Graph classification is a challenging research problem with applications across a broad range of domains. In these applications, it is very common for the class distribution to be imbalanced. Recently, Graph Neural Network (GNN) models have achieved superior performance on various real-world datasets. Despite their success, most current GNN models largely overlook the important setting of imbalanced class distribution, which typically results in prediction bias towards the majority classes. To alleviate this prediction bias, we propose to leverage the semantic structure of the dataset based on the distribution of node embeddings. Specifically, we present GraphDIVE, a general framework that leverages a mixture of diverse experts (i.e., graph classifiers) for imbalanced graph classification. Following a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets, and each expert network is then trained on its corresponding subset. Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
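
The paper's exact architecture and training objective are given in the full text; the sketch below only illustrates the core mixture-of-experts idea the abstract describes: a gating network softly assigns each graph (represented here by a precomputed graph-level embedding) to several expert classifiers, and the experts' predictions are combined using the gate weights. All names (`MixtureOfGraphExperts`, `embed_dim`, the expert architecture) are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixtureOfGraphExperts(nn.Module):
    """Illustrative gated mixture of graph classifiers over graph embeddings.

    The gating network produces a soft assignment of each graph to the experts;
    each expert is a small classifier whose logits are weighted by its gate value.
    """

    def __init__(self, embed_dim: int, num_classes: int, num_experts: int = 4):
        super().__init__()
        # Gating network: soft partition of the dataset into num_experts subsets.
        self.gate = nn.Linear(embed_dim, num_experts)
        # Each expert is an independent graph classifier head.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(embed_dim, embed_dim),
                nn.ReLU(),
                nn.Linear(embed_dim, num_classes),
            )
            for _ in range(num_experts)
        )

    def forward(self, graph_embeddings: torch.Tensor) -> torch.Tensor:
        # graph_embeddings: (batch, embed_dim), e.g. from a GNN readout layer.
        gate_weights = F.softmax(self.gate(graph_embeddings), dim=-1)      # (batch, num_experts)
        expert_logits = torch.stack(
            [expert(graph_embeddings) for expert in self.experts], dim=1
        )                                                                  # (batch, num_experts, num_classes)
        # Combine expert predictions, weighted by the gate.
        return torch.einsum("be,bec->bc", gate_weights, expert_logits)


if __name__ == "__main__":
    model = MixtureOfGraphExperts(embed_dim=64, num_classes=2, num_experts=4)
    dummy_embeddings = torch.randn(8, 64)   # 8 graphs, 64-dim embeddings
    logits = model(dummy_embeddings)
    print(logits.shape)                     # torch.Size([8, 2])
```

In an imbalanced setting, the intuition behind such a design is that the gate can learn to route minority-class graphs to dedicated experts, so that majority classes do not dominate every classifier's training signal.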
