Search Results for author: Jiongyu Guo

Found 2 papers, 1 paper with code

Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision

no code implementations • 25 Oct 2022 • Jiongyu Guo, Defang Chen, Can Wang

Alignahead++ transfers structure and feature information from a student layer to the previous layer of another, simultaneously trained student model in an alternating training procedure.
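The abstract snippet above describes online, teacher-free distillation between two peer students with a cross-layer offset. A minimal sketch of that idea follows; the layer features, the MSE alignment loss, and the function names are illustrative assumptions, not the paper's actual implementation (which operates on GNN node embeddings with separate structure and feature terms):

```python
def mse(a, b):
    # Mean squared error between two equal-length feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def cross_layer_alignment_loss(feats_a, feats_b):
    # Hypothetical sketch of cross-layer alignment between two
    # simultaneously trained students: each layer l of one student is
    # aligned with layer l-1 of the other, so knowledge propagates
    # across layers without any pre-trained teacher.
    loss = 0.0
    for l in range(1, len(feats_a)):
        loss += mse(feats_a[l], feats_b[l - 1])  # student A's layer l vs. B's layer l-1
        loss += mse(feats_b[l], feats_a[l - 1])  # student B's layer l vs. A's layer l-1
    return loss

# Toy 3-layer feature stacks (2-dim features per layer).
feats_a = [[1.0, 2.0], [1.5, 2.5], [2.0, 3.0]]
feats_b = [[1.0, 2.0], [1.5, 2.5], [2.0, 3.0]]
print(cross_layer_alignment_loss(feats_a, feats_b))
```

In an alternating training procedure, each step would update one student to minimize its share of this loss while the other is held fixed, then swap roles.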

Knowledge Distillation • Model Compression

Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks

1 code implementation • 5 May 2022 • Jiongyu Guo, Defang Chen, Can Wang

Existing knowledge distillation methods on graph neural networks (GNNs) are mostly offline, where the student model extracts knowledge from a powerful pre-trained teacher model to improve its performance.

Knowledge Distillation
