Search Results for author: Qianlong Wen

Found 5 papers, 2 papers with code

Self-Supervised Graph Structure Refinement for Graph Neural Networks

1 code implementation12 Nov 2022 Jianan Zhao, Qianlong Wen, Mingxuan Ju, Chuxu Zhang, Yanfang Ye

Specifically, the pre-training phase aims to comprehensively estimate the underlying graph structure via a multi-view contrastive learning framework with both intra- and inter-view link prediction tasks.

Contrastive Learning Graph structure learning +1
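The multi-view contrastive objective mentioned in the excerpt can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic symmetric InfoNCE loss between two views of node embeddings, where matching rows across views are treated as positive pairs (the function name `info_nce` and the toy data are assumptions for illustration):

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Symmetric InfoNCE-style contrastive loss between two views.

    z1, z2: (n, d) embedding matrices; row i of each view forms a
    positive pair, all other cross-view rows act as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # (n, n) cross-view similarity matrix
    # row-wise log-softmax; the diagonal holds the positive pairs
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
anchor = rng.normal(size=(8, 16))
view1 = anchor + 0.01 * rng.normal(size=(8, 16))  # two perturbed views
view2 = anchor + 0.01 * rng.normal(size=(8, 16))
aligned = info_nce(view1, view2)
# mismatching the pairs should raise the loss
shuffled = info_nce(view1, view2[rng.permutation(8)])
```

Correctly aligned views yield a lower loss than shuffled pairings, which is the signal a contrastive pre-training phase optimizes.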

Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization

1 code implementation5 Oct 2022 Mingxuan Ju, Tong Zhao, Qianlong Wen, Wenhao Yu, Neil Shah, Yanfang Ye, Chuxu Zhang

Besides, we observe that learning from multiple philosophies enhances not only the task generalization but also the single task performances, demonstrating that PARETOGNN achieves better task generalization via the disjoint yet complementary knowledge learned from different philosophies.

Link Prediction Node Classification +4

Diving into Unified Data-Model Sparsity for Class-Imbalanced Graph Representation Learning

no code implementations1 Oct 2022 Chunhui Zhang, Chao Huang, Yijun Tian, Qianlong Wen, Zhongyu Ouyang, Youhuan Li, Yanfang Ye, Chuxu Zhang

The effectiveness is further guaranteed and proven by the gradient distance between the subset and the full set; (ii) empirically, we discover that, during the learning process of a GNN, some samples in the training dataset are informative for providing gradients to update model parameters.

Contrastive Learning Graph Representation Learning
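The gradient-distance criterion in the excerpt can be sketched in a toy setting. This is an assumption-laden illustration, not the paper's method: it uses per-sample logistic-regression gradients on synthetic data (rather than a GNN) and compares a subset's mean gradient to the full set's:

```python
import numpy as np

def per_sample_grads(X, y, w):
    """Per-sample gradients of the logistic loss w.r.t. weights w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))  # sigmoid predictions
    return (p - y)[:, None] * X       # (n, d), one gradient row per sample

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = (X @ w_true > 0).astype(float)
w = np.zeros(5)

g = per_sample_grads(X, y, w)
full = g.mean(axis=0)  # gradient of the full training set

# rank samples by gradient norm: a rough proxy for "informative" samples
order = np.argsort(-np.linalg.norm(g, axis=1))
subset = g[order[:50]].mean(axis=0)

# the distance between subset and full-set gradients is the quantity
# the excerpt says bounds the effectiveness of training on the subset
dist = np.linalg.norm(subset - full)
```

A subset whose mean gradient stays close to the full set's (small `dist`) updates the model in nearly the same direction at a fraction of the cost.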

Graph Contrastive Learning with Cross-view Reconstruction

no code implementations16 Sep 2022 Qianlong Wen, Zhongyu Ouyang, Chunhui Zhang, Yiyue Qian, Yanfang Ye, Chuxu Zhang

In light of this, we introduce the Graph Contrastive Learning with Cross-View Reconstruction (GraphCV), which follows the information bottleneck principle to learn minimal yet sufficient representation from graph data.

Contrastive Learning Disentanglement +3
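The cross-view reconstruction idea can be pictured with a tiny linear sketch. This is purely illustrative and not GraphCV itself (the linear encoder/decoder `W_enc`/`W_dec` are hypothetical stand-ins): the representation of one augmented view is asked to reconstruct the features of the other view, which pushes it to retain information shared across views:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(32, 8))                # original node features
view_a = x + 0.1 * rng.normal(size=x.shape) # two augmented views
view_b = x + 0.1 * rng.normal(size=x.shape)

W_enc = 0.1 * rng.normal(size=(8, 4))  # hypothetical linear encoder
W_dec = 0.1 * rng.normal(size=(4, 8))  # hypothetical linear decoder

z_a = view_a @ W_enc  # compressed representation of view A
# cross-view reconstruction: decode view A's representation toward view B
recon_loss = np.mean((z_a @ W_dec - view_b) ** 2)
```

Minimizing such a loss alongside a contrastive term approximates the "minimal yet sufficient" trade-off: the bottleneck dimension compresses, while cross-view reconstruction preserves shared content.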

Gophormer: Ego-Graph Transformer for Node Classification

no code implementations25 Oct 2021 Jianan Zhao, Chaozhuo Li, Qianlong Wen, Yiqi Wang, Yuming Liu, Hao Sun, Xing Xie, Yanfang Ye

Existing graph transformer models typically adopt a fully-connected attention mechanism over the whole input graph, and thus suffer from severe scalability issues and are intractable to train in data-insufficient cases.

Classification Data Augmentation +3
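The scalability fix suggested by the title, restricting attention to sampled ego-graphs instead of the whole graph, can be sketched as a simple capped breadth-first sampler. This is a generic illustration under assumed names (`sample_ego_graph`, the toy adjacency dict), not Gophormer's actual sampler:

```python
from collections import deque
import random

def sample_ego_graph(adj, root, depth=2, fanout=3, seed=0):
    """Sample a small ego-graph around `root`: breadth-first search up
    to `depth` hops, keeping at most `fanout` randomly chosen
    neighbors per expanded node. A transformer can then attend only
    within this node set instead of over the full graph."""
    rng = random.Random(seed)
    nodes, frontier = {root}, deque([(root, 0)])
    while frontier:
        node, d = frontier.popleft()
        if d == depth:
            continue  # stop expanding past the hop limit
        neighbors = list(adj.get(node, []))
        rng.shuffle(neighbors)
        for nb in neighbors[:fanout]:
            if nb not in nodes:
                nodes.add(nb)
                frontier.append((nb, d + 1))
    return nodes

# toy graph: node 0 has four neighbors, some of which branch further
adj = {0: [1, 2, 3, 4], 1: [0, 5, 6], 2: [0, 7], 5: [1, 8]}
ego = sample_ego_graph(adj, root=0, depth=2, fanout=2)
```

With `depth=2` and `fanout=2` the sampled set holds at most 1 + 2 + 4 = 7 nodes, so attention cost stays bounded regardless of the full graph's size.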
