When Contrastive Learning Meets Active Learning: A Novel Graph Active Learning Paradigm with Self-Supervision

30 Oct 2020 · Yanqiao Zhu, Weizhi Xu, Qiang Liu, Shu Wu

This paper studies active learning (AL) on graphs, whose purpose is to discover the most informative nodes to maximize the performance of graph neural networks (GNNs). Most previous graph AL methods focus on learning node representations from a carefully selected labeled dataset while neglecting the large amount of unlabeled data. Motivated by the success of contrastive learning (CL), we propose a novel paradigm that seamlessly integrates graph AL with CL. While the model leverages abundant unlabeled data in a self-supervised manner, the nodes selected by AL further provide semantic information that better guides representation learning. Moreover, previous work measures the informativeness of nodes without considering the neighborhood propagation scheme of GNNs, so noisy nodes may be selected. We argue that, due to the smoothing nature of GNNs, the central nodes of homophilous subgraphs should benefit model training the most. To this end, we present a minimax selection scheme that explicitly harnesses neighborhood information and discovers homophilous subgraphs to facilitate active selection. Comprehensive, confounding-free experiments on five public datasets demonstrate the superiority of our method over state-of-the-art approaches.
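To make the idea of pairing contrastive pre-training with neighborhood-aware active selection more concrete, the sketch below shows one possible combination. It assumes a generic two-layer GCN encoder from PyTorch Geometric, random edge dropping as the graph augmentation, an InfoNCE objective, and a "maximize the minimum similarity to neighbors" score as an illustrative stand-in for the paper's minimax criterion; all of these components are assumptions for illustration, not the authors' exact implementation.

```python
# Illustrative sketch: graph contrastive pre-training plus a neighborhood-aware
# active selection step. Components are generic stand-ins, not the paper's method.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class GCNEncoder(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, out_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


def drop_edges(edge_index, p=0.2):
    # Random edge dropping as a simple graph augmentation.
    mask = torch.rand(edge_index.size(1)) > p
    return edge_index[:, mask]


def info_nce(z1, z2, tau=0.5):
    # Symmetric InfoNCE: the positive for node i in view 1 is node i in view 2.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))


def contrastive_step(encoder, optimizer, x, edge_index):
    # One self-supervised update on the unlabeled graph using two augmented views.
    encoder.train()
    optimizer.zero_grad()
    z1 = encoder(x, drop_edges(edge_index))
    z2 = encoder(x, drop_edges(edge_index))
    loss = info_nce(z1, z2)
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def select_nodes(encoder, x, edge_index, unlabeled, budget):
    # Neighborhood-aware scoring: prefer nodes whose embedding stays close to ALL of
    # their neighbors (a proxy for "central node of a homophilous subgraph").
    # This max-of-min-similarity rule only illustrates the minimax idea.
    encoder.eval()
    z = F.normalize(encoder(x, edge_index), dim=1)
    src, dst = edge_index
    scores = torch.full((x.size(0),), float("-inf"))
    for v in unlabeled.tolist():
        nbrs = dst[src == v]
        if nbrs.numel() == 0:
            continue
        scores[v] = (z[v] @ z[nbrs].t()).min()
    return scores.topk(budget).indices
```

In a full pipeline, the two pieces would alternate: pre-train the encoder contrastively on all nodes, query labels for the nodes returned by the selection step, and then continue training with the labeled set providing additional semantic supervision.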

