DyTSCL: Dynamic graph representation via tempo-structural contrastive learning

Journal article · 2023 · Jianian Li, Peng Bao, Rong Yan, Huawei Shen

With the massive growth of graph-structured data, extensive research has focused on graph representation learning. Recently, graph representation learning frameworks have made great efforts toward dynamic graph learning. Although dynamic graph methods have achieved impressive results, they require labeled data for model training. Contrastive learning completes model training without human annotation and has proven highly competitive in visual representation learning and natural language processing. In this paper, we propose DyTSCL, a novel framework for Dynamic graph representation via Tempo-Structural Contrastive Learning, which trains the model on the pretext task of distinguishing three types of subgraphs: the Tempo-Structural subgraph, the Non-Temporal subgraph, and the Non-Structural subgraph. Moreover, we propose a Tempo-Structural encoder that aggregates temporal and structural information. Finally, a Tempo-Structural contrastive learning module is proposed to maximize the consistency between nodes and subgraphs from the temporal and structural perspectives, respectively. To demonstrate the effectiveness of DyTSCL, we evaluate it on the Wikipedia, Reddit, and MOOC datasets; the results show that DyTSCL significantly outperforms existing approaches.
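The abstract describes training by contrasting a node against its Tempo-Structural subgraph while treating the Non-Temporal and Non-Structural subgraphs as negatives. As a rough illustration only, the sketch below shows a generic InfoNCE-style node-versus-subgraph contrastive loss in PyTorch; the batching scheme, normalization, temperature, and the exact loss form are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of a node-subgraph contrastive (InfoNCE-style) objective.
# The loss form, pooling, and temperature are assumptions used only to
# illustrate "maximizing consistency between node and subgraph
# representations"; they are not the paper's exact formulation.
import torch
import torch.nn.functional as F


def node_subgraph_infonce(node_emb: torch.Tensor,
                          subgraph_emb: torch.Tensor,
                          temperature: float = 0.2) -> torch.Tensor:
    """node_emb, subgraph_emb: (batch, dim) tensors.

    Row i of subgraph_emb is the positive for node i (e.g. its
    Tempo-Structural subgraph); all other rows act as negatives
    (e.g. Non-Temporal / Non-Structural subgraphs of other nodes).
    """
    z_n = F.normalize(node_emb, dim=-1)
    z_s = F.normalize(subgraph_emb, dim=-1)
    logits = z_n @ z_s.t() / temperature              # (batch, batch) similarities
    labels = torch.arange(z_n.size(0), device=z_n.device)
    return F.cross_entropy(logits, labels)            # diagonal entries are positives


if __name__ == "__main__":
    # Random stand-in embeddings; in practice these would come from the
    # node encoder and a subgraph readout (both hypothetical here).
    nodes = torch.randn(32, 128)
    subgraphs = torch.randn(32, 128)
    print(node_subgraph_infonce(nodes, subgraphs).item())
```

In this common setup, in-batch examples serve as negatives, so no explicit negative sampling is needed; a separate term per subgraph view (temporal vs. structural) could be summed to mirror the two perspectives mentioned in the abstract.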

