1 code implementation • 27 Feb 2023 • Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods, namely Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation-based Graph-based Knowledge Distillation (SKD).
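All three families build on the same teacher-student objective. Below is a minimal sketch of that soft-label distillation loss, assuming PyTorch and a temperature hyperparameter T; this is an illustration of the generic pattern, not the survey's code.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # T^2 rescales gradients so this term matches the hard-label loss magnitude.
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)

# Usage: combine with the ordinary task loss on hard labels, e.g.
# loss = F.cross_entropy(student_logits, labels) + alpha * distillation_loss(student_logits, teacher_logits)
```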
no code implementations • 1 Aug 2022 • Huixuan Chi, Hao Xu, Hao Fu, Mengya Liu, Mengdi Zhang, Yuji Yang, Qinfen Hao, Wei Wu
In particular: 1) existing methods do not explicitly encode and capture the evolution of short-term preference as sequential methods do; 2) simply using the last few interactions is not enough to model the changing trend.
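A hedged illustration of the contrast drawn above, assuming PyTorch (the class and names here are hypothetical, not the paper's code): mean-pooling the last few interactions discards their ordering, while a sequential encoder keeps it and can therefore represent the changing trend.

```python
import torch.nn as nn

class ShortTermEncoder(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, recent_item_embs):  # (batch, k, dim): last k interactions
        # Mean pooling treats the k interactions as a set; ordering is lost.
        pooled = recent_item_embs.mean(dim=1)
        # A recurrent pass preserves the order, encoding preference drift.
        _, h_n = self.gru(recent_item_embs)
        sequential = h_n.squeeze(0)  # (batch, dim)
        return pooled, sequential
```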
no code implementations • 25 Jul 2022 • Jing Liu, Tongya Zheng, Qinfen Hao
To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) knowledge distillation framework on heterogeneous graphs, which can significantly boost prediction performance regardless of the model architecture of HGNNs.
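A hedged sketch of the general pattern a high-order relational distillation framework builds on; the function names and exact loss terms are illustrative assumptions, not HIRE's actual formulation. First-order distillation matches per-node soft labels; high-order distillation aligns the pairwise correlation structure of node embeddings rather than individual predictions.

```python
import torch.nn.functional as F

def first_order_kd(student_logits, teacher_logits, T=2.0):
    # Node-level soft-label matching, standard KD form.
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

def high_order_kd(student_embs, teacher_embs):
    # Relation-level matching: align pairwise embedding correlations
    # so the student mimics the teacher's relational structure.
    def correlation(x):
        x = F.normalize(x, dim=-1)
        return x @ x.t()
    return F.mse_loss(correlation(student_embs), correlation(teacher_embs))
```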
4 code implementations • 18 May 2021 • Huixuan Chi, Yuying Wang, Qinfen Hao, Hong Xia
Graph Convolutional Networks (GCNs) and subsequent variants have been proposed to solve learning tasks on graphs, especially node classification (see the sketch after this entry).
Ranked #22 on Node Property Prediction on ogbn-mag
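A minimal sketch of the standard GCN propagation rule from Kipf & Welling, H' = σ(ÂHW), not the paper's specific variant; it assumes a dense symmetrically normalized adjacency for brevity.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # adj_norm = D^{-1/2} (A + I) D^{-1/2}: normalized adjacency with
        # self-loops; each node aggregates its neighbors' transformed features.
        return torch.relu(self.linear(adj_norm @ x))
```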