1 code implementation • 1 Dec 2024 • Kay Liu, Jiahao Ding, MohamadAli Torkamani, Philip S. Yu
While Transformers have revolutionized machine learning across data modalities, existing Transformers for temporal graphs face limitations in (1) restricted receptive fields, (2) the overhead of subgraph extraction, and (3) suboptimal generalization beyond link prediction.
no code implementations • 21 Oct 2024 • Haoyan Xu, Kay Liu, Zhengtao Yao, Philip S. Yu, Kaize Ding, Yue Zhao
Graph open-set learning (GOL) and out-of-distribution (OOD) detection address the challenge of unseen classes at test time by training models that accurately classify known, in-distribution (ID) classes while identifying and handling previously unseen classes during inference.
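A generic way to realize this behavior is maximum-softmax-probability (MSP) thresholding: keep the predicted ID class when the classifier is confident and flag the node as OOD otherwise. The sketch below is a minimal, hypothetical illustration of that idea (the threshold value and all names are assumptions), not the method proposed in the paper.

```python
# Minimal, generic illustration of OOD detection via maximum softmax
# probability (MSP) thresholding -- not the paper's method.
# All names (logits, THRESHOLD) are hypothetical.
import torch
import torch.nn.functional as F

THRESHOLD = 0.7  # confidence cutoff; tuned on validation data in practice

def classify_with_ood(logits: torch.Tensor) -> torch.Tensor:
    """Return the predicted ID class per node, or -1 for suspected OOD nodes."""
    probs = F.softmax(logits, dim=-1)   # (num_nodes, num_id_classes)
    conf, pred = probs.max(dim=-1)      # maximum softmax probability per node
    pred[conf < THRESHOLD] = -1         # low confidence -> flag as OOD
    return pred

# Example: 4 nodes, 3 known (ID) classes
logits = torch.tensor([[4.0, 0.1, 0.2],
                       [0.3, 0.4, 0.2],   # flat distribution -> likely OOD
                       [0.1, 3.5, 0.2],
                       [0.2, 0.1, 0.3]])
print(classify_with_ood(logits))  # tensor([ 0, -1,  1, -1])
```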
1 code implementation • 12 Oct 2024 • Fangxin Wang, Kay Liu, Sourav Medya, Philip S. Yu
Graph self-training is a semi-supervised learning method that iteratively selects a set of unlabeled data to retrain the underlying graph neural network (GNN) model and improve its prediction performance.
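For concreteness, the following is a minimal sketch of the generic graph self-training loop: train a GNN on the labeled nodes, pseudo-label the unlabeled nodes it is most confident about, add them to the training set, and retrain. It assumes a PyTorch Geometric `Data` object with `x`, `edge_index`, `y`, and `train_mask` (e.g., a Planetoid split); the architecture, confidence threshold, and round count are illustrative choices, not the selection strategy studied in the paper.

```python
# A minimal sketch of generic graph self-training -- assumptions, not the
# exact procedure in the paper. Requires torch and torch_geometric.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

def self_train(data: Data, num_classes: int, rounds: int = 3, conf_thr: float = 0.9):
    labels = data.y.clone()
    train_mask = data.train_mask.clone()
    for _ in range(rounds):
        model = GCN(data.num_features, 16, num_classes)  # retrain from scratch each round
        opt = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
        model.train()
        for _ in range(200):
            opt.zero_grad()
            out = model(data.x, data.edge_index)
            loss = F.cross_entropy(out[train_mask], labels[train_mask])
            loss.backward()
            opt.step()
        model.eval()
        with torch.no_grad():
            probs = F.softmax(model(data.x, data.edge_index), dim=-1)
        conf, pseudo = probs.max(dim=-1)
        new = (~train_mask) & (conf > conf_thr)   # confident unlabeled nodes
        labels[new] = pseudo[new]                 # assign pseudo-labels
        train_mask |= new                         # grow the training set
    return model, train_mask
```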
1 code implementation • 8 Oct 2024 • Yuhang Yao, Yuan Li, Xinyi Fan, Junhao Li, Kay Liu, Weizhao Jin, Srivatsan Ravi, Philip S. Yu, Carlee Joe-Wong
Federated graph learning is an emerging field with significant practical challenges.
1 code implementation • 3 Jun 2024 • Wenjing Chang, Kay Liu, Philip S. Yu, Jianjun Yu
Graph anomaly detection (GAD) is increasingly crucial in various applications, ranging from financial fraud detection to fake news detection.
no code implementations • 11 Mar 2024 • Fangxin Wang, Yuqing Liu, Kay Liu, Yibo Wang, Sourav Medya, Philip S. Yu
Identifying, quantifying, and utilizing uncertainty are therefore essential for improving both the model's downstream-task performance and the reliability of GNN predictions.
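One common (though by no means the only) recipe for quantifying such uncertainty is Monte Carlo dropout: run several stochastic forward passes and measure the entropy of the averaged prediction. The sketch below illustrates that recipe under the assumption that `model` contains dropout layers and takes `(x, edge_index)`; it is not the specific approach discussed in the paper.

```python
# Generic illustration of predictive uncertainty via Monte Carlo dropout --
# one common recipe, not the approach surveyed in the paper.
import torch
import torch.nn.functional as F

@torch.no_grad()
def mc_dropout_uncertainty(model, x, edge_index, num_samples: int = 20):
    """Return the mean prediction and predictive entropy over stochastic passes."""
    model.train()  # keep dropout active at inference time
    probs = torch.stack([
        F.softmax(model(x, edge_index), dim=-1) for _ in range(num_samples)
    ])                                            # (num_samples, nodes, classes)
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy                    # high entropy = less reliable
```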
no code implementations • 24 Feb 2024 • Qian Ma, Hongliang Chi, Hengrui Zhang, Kay Liu, Zhiwei Zhang, Lu Cheng, Suhang Wang, Philip S. Yu, Yao Ma
The rise of self-supervised learning, which operates without the need for labeled data, has garnered significant interest within the graph learning community.
no code implementations • 14 Feb 2024 • Chen Wang, Fangxin Wang, Ruocheng Guo, Yueqing Liang, Kay Liu, Philip S. Yu
Recognizing the critical role of confidence in aligning training objectives with evaluation metrics, we propose CPFT, a versatile framework that enhances recommendation confidence by integrating Conformal Prediction (CP)-based losses with CE loss during fine-tuning.
1 code implementation • 24 Jan 2024 • Wenjing Chang, Kay Liu, Kaize Ding, Philip S. Yu, Jianjun Yu
First, by coupling the node classification task, MITIGATE gains the ability to detect out-of-distribution nodes without known anomalies.
1 code implementation • 29 Dec 2023 • Kay Liu, Hengrui Zhang, Ziqing Hu, Fangxin Wang, Philip S. Yu
To bridge this gap, we introduce GODM, a novel data augmentation method that mitigates class imbalance in supervised Graph Outlier detection via latent Diffusion Models.
2 code implementations • 21 Jun 2022 • Kay Liu, Yingtong Dou, Yue Zhao, Xueying Ding, Xiyang Hu, Ruitong Zhang, Kaize Ding, Canyu Chen, Hao Peng, Kai Shu, Lichao Sun, Jundong Li, George H. Chen, Zhihao Jia, Philip S. Yu
To bridge this gap, we present, to the best of our knowledge, the first comprehensive benchmark for unsupervised outlier node detection on static attributed graphs, called BOND, with the following highlights.
1 code implementation • 26 Apr 2022 • Kay Liu, Yingtong Dou, Xueying Ding, Xiyang Hu, Ruitong Zhang, Hao Peng, Lichao Sun, Philip S. Yu
PyGOD is an open-source Python library for detecting outliers in graph data.
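A minimal usage sketch is shown below, assuming the PyGOD 1.x detector interface (construct a detector, `fit` it on a PyTorch Geometric `Data` object, then `predict` node-level outlier labels); consult https://docs.pygod.org for the exact signatures and defaults.

```python
# Minimal usage sketch, assuming the PyGOD 1.x detector API (fit/predict on a
# PyTorch Geometric Data object); see https://docs.pygod.org for exact details.
import torch
from torch_geometric.data import Data
from pygod.detector import DOMINANT  # one of the detectors shipped with PyGOD

# Toy attributed graph: 4 nodes with 8-dim features and a few edges.
data = Data(
    x=torch.randn(4, 8),
    edge_index=torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]]),
)

detector = DOMINANT()            # autoencoder-based detector, library defaults
detector.fit(data)               # unsupervised training on the input graph
labels = detector.predict(data)  # 0 = inlier, 1 = outlier for each node
print(labels)
```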