1 code implementation • 24 Mar 2024 • Ziwen Zhao, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang
Graph self-supervised learning is now a go-to method for pre-training graph foundation models, including graph neural networks, graph transformers, and more recent large language model (LLM)-based graph models.
1 code implementation • 6 Feb 2024 • Ziwen Zhao, Yuhua Li, Yixiong Zou, Jiliang Tang, Ruixuan Li
Inspired by these understandings, we explore non-discrete edge masks, which are sampled from a continuous and dispersive probability distribution instead of the discrete Bernoulli distribution.
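The contrast between discrete Bernoulli edge masking and the non-discrete masks described above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the uniform(0, 1) distribution is assumed here as one example of a continuous, dispersive choice.

```python
import numpy as np

rng = np.random.default_rng(0)
num_edges = 10

# Standard discrete masking: each edge is kept or dropped via Bernoulli(p),
# so masks take only the values 0 or 1.
p = 0.5
discrete_mask = rng.binomial(1, p, size=num_edges).astype(float)

# Non-discrete alternative: sample soft edge masks from a continuous,
# dispersive distribution (uniform(0, 1) chosen here for illustration),
# so edges are continuously down-weighted rather than hard-dropped.
continuous_mask = rng.uniform(0.0, 1.0, size=num_edges)

# Applying the soft mask scales edge weights instead of removing edges.
edge_weights = np.ones(num_edges)
masked_weights = edge_weights * continuous_mask
```

The key difference is that the continuous mask spreads its mass over [0, 1] rather than concentrating on the two endpoints, which is what "dispersive" refers to above.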
1 code implementation • 8 May 2023 • Han Chen, Ziwen Zhao, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang
Graph Contrastive Learning (GCL) is an effective way to learn generalized graph representations in a self-supervised manner, and interest in it has grown rapidly in recent years.
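The self-supervised objective underlying GCL can be sketched with a minimal NT-Xent-style contrastive loss between two augmented views. This is a generic illustration of the technique, not the specific method of the paper above; the function name and temperature value are assumptions.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """Contrastive loss between embeddings of two augmented views.

    Rows of z1 and z2 are embeddings of the same nodes/graphs under two
    augmentations; matching rows form positive pairs, all others negatives.
    """
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # cross-view similarity matrix, scaled by temperature

    # Row-wise log-softmax (max-subtraction for numerical stability).
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Positive pairs sit on the diagonal; maximize their log-probability.
    return -np.mean(np.diag(log_prob))
```

In practice the two views come from graph augmentations (e.g. edge dropping or feature masking) fed through a shared encoder; the loss pulls the two embeddings of each node together while pushing apart embeddings of different nodes.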