Search Results for author: Qinfen Hao

Found 4 papers, 2 papers with code

Graph-based Knowledge Distillation: A survey and experimental evaluation

1 code implementation • 27 Feb 2023 • Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao

It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods, namely Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation based Graph-based Knowledge Distillation (SKD).

Self-Knowledge Distillation

Long Short-Term Preference Modeling for Continuous-Time Sequential Recommendation

no code implementations • 1 Aug 2022 • Huixuan Chi, Hao Xu, Hao Fu, Mengya Liu, Mengdi Zhang, Yuji Yang, Qinfen Hao, Wei Wu

In particular: 1) existing methods do not explicitly encode and capture the evolution of short-term preference as sequential methods do; 2) simply using the last few interactions is not enough to model the changing trend.

Sequential Recommendation

HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks

no code implementations • 25 Jul 2022 • Jing Liu, Tongya Zheng, Qinfen Hao

To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) knowledge distillation framework on heterogeneous graphs, which can significantly boost the prediction performance regardless of model architectures of HGNNs.

Knowledge Distillation • Vocal Bursts Intensity Prediction

Residual Network and Embedding Usage: New Tricks of Node Classification with Graph Convolutional Networks

4 code implementations • 18 May 2021 • Huixuan Chi, Yuying Wang, Qinfen Hao, Hong Xia

Graph Convolutional Networks (GCNs) and subsequent variants have been proposed to solve tasks on graphs, especially node classification tasks.

Node Classification • Node Property Prediction
