Variational Graph Recurrent Neural Networks

Representation learning over graph-structured data has been studied mostly in static graph settings, while efforts to model dynamic graphs are still scant. In this paper, we develop a novel hierarchical variational model that introduces additional latent random variables to jointly model the hidden states of a graph recurrent neural network (GRNN), capturing both topology and node-attribute changes in dynamic graphs. We argue that the use of high-level latent random variables in this variational GRNN (VGRNN) can better capture the potential variability observed in dynamic graphs as well as the uncertainty of node latent representations. With semi-implicit variational inference developed for this new VGRNN architecture (SI-VGRNN), we show that flexible non-Gaussian latent representations can further help dynamic graph analytic tasks. Our experiments with multiple real-world dynamic graph datasets demonstrate that SI-VGRNN and VGRNN consistently outperform existing baselines and state-of-the-art methods by a significant margin in dynamic link prediction.
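The abstract describes one recurrent step: a prior over node latents conditioned on the previous hidden state, an approximate posterior from the current graph snapshot, and a recurrent update that mixes the sampled latents back into the hidden state. The sketch below is a minimal NumPy illustration of that flow, not the paper's exact design: all weight names (`W_prior_mu`, `W_enc`, `W_rec`, ...), dimensions, the single-layer GCN encoder, and the plain `tanh` stand-in for the graph GRU cell are illustrative assumptions; the actual model also optimizes a full variational (ELBO) objective, omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    # One graph-convolution layer with self-loops and symmetric normalization.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W)

def vgrnn_step(A_t, X_t, h_prev, params):
    # Prior over node latents z_t, conditioned on the recurrent state h_{t-1}.
    mu_p = h_prev @ params["W_prior_mu"]
    logvar_p = h_prev @ params["W_prior_lv"]
    # Approximate posterior from the current snapshot (GCN encoder).
    enc = gcn_layer(A_t, np.concatenate([X_t, h_prev], axis=1), params["W_enc"])
    mu_q = enc @ params["W_post_mu"]
    logvar_q = enc @ params["W_post_lv"]
    # Reparameterized sample of the node latents.
    eps = rng.standard_normal(mu_q.shape)
    z_t = mu_q + np.exp(0.5 * logvar_q) * eps
    # Recurrent update mixing z_t with the snapshot (tanh stand-in for a graph GRU).
    h_t = gcn_layer(A_t, np.concatenate([X_t, z_t], axis=1), params["W_rec"])
    # Inner-product decoder: edge logits from node latents.
    A_logits = z_t @ z_t.T
    return h_t, z_t, (mu_q, logvar_q), (mu_p, logvar_p), A_logits

# Illustrative dimensions: n nodes, feature/hidden/latent/encoder sizes f, h, d, e.
n, f, h, d, e = 5, 4, 8, 3, 8
params = {
    "W_prior_mu": 0.1 * rng.standard_normal((h, d)),
    "W_prior_lv": 0.1 * rng.standard_normal((h, d)),
    "W_enc":      0.1 * rng.standard_normal((f + h, e)),
    "W_post_mu":  0.1 * rng.standard_normal((e, d)),
    "W_post_lv":  0.1 * rng.standard_normal((e, d)),
    "W_rec":      0.1 * rng.standard_normal((f + d, h)),
}
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T          # symmetric snapshot, no self-loops
X = rng.standard_normal((n, f))
h0 = np.zeros((n, h))
h1, z1, post, prior, logits = vgrnn_step(A, X, h0, params)
```

Chaining `vgrnn_step` over snapshots carries temporal information through `h_t`, while the per-step prior/posterior pair is what would feed a KL term in the variational objective.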

NeurIPS 2019
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Dynamic Link Prediction | DBLP Temporal | VGRNN | AUC | 85.95 | #2 |
| Dynamic Link Prediction | DBLP Temporal | VGRNN | AP | 87.77 | #3 |
| Dynamic Link Prediction | DBLP Temporal | SI-VGRNN | AUC | 85.45 | #3 |
| Dynamic Link Prediction | DBLP Temporal | SI-VGRNN | AP | 88.36 | #2 |
| Dynamic Link Prediction | Enron Emails | VGRNN | AUC | 93.29 | #2 |
| Dynamic Link Prediction | Enron Emails | VGRNN | AP | 93.10 | #3 |
| Dynamic Link Prediction | Enron Emails | SI-VGRNN | AUC | 94.44 | #1 |
| Dynamic Link Prediction | Enron Emails | SI-VGRNN | AP | 93.93 | #2 |
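The AUC and AP numbers above score predicted edge probabilities against held-out edges and non-edges at future snapshots. As a reminder of what the two metrics compute, here is a minimal NumPy sketch (assuming binary labels, at least one positive and one negative, and no tied scores; real evaluations typically use scikit-learn's `roc_auc_score` and `average_precision_score`):

```python
import numpy as np

def auc_score(labels, scores):
    # AUC via the rank-sum (Mann-Whitney U) formulation.
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def ap_score(labels, scores):
    # Average precision: precision averaged at each true-positive hit,
    # walking down the score-sorted ranking.
    order = np.argsort(-scores)
    hits = labels[order] == 1
    precisions = np.cumsum(hits) / np.arange(1, len(labels) + 1)
    return precisions[hits].mean()

labels = np.array([1, 0, 1, 0])
scores = np.array([0.9, 0.8, 0.7, 0.1])   # model's edge probabilities
auc = auc_score(labels, scores)   # 0.75
ap = ap_score(labels, scores)     # (1 + 2/3) / 2 ≈ 0.833
```

AUC measures how often a random true edge outranks a random non-edge, while AP weights the ranking toward its top, so the two can disagree on which model is better, as they do for DBLP Temporal above.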
