Search Results for author: Yiling Yuan

Found 2 papers, 1 papers with code

RELIANT: Fair Knowledge Distillation for Graph Neural Networks

1 code implementation • 3 Jan 2023 • Yushun Dong, Binchi Zhang, Yiling Yuan, Na Zou, Qi Wang, Jundong Li

Knowledge Distillation (KD) is a common solution for compressing GNNs, where a lightweight model (i.e., the student model) is encouraged to mimic the behavior of a computationally expensive GNN (i.e., the teacher GNN model).

Tags: Fairness, Graph Learning, +1
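The student-mimics-teacher objective described in the abstract can be sketched with the standard temperature-softened distillation loss. This is a generic KD sketch for illustration only, not the fairness-aware RELIANT objective from the paper; the function names and temperature value are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs — the
    # classic distillation objective. Scaled by T^2 so gradients keep
    # a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher GNN
    q = softmax(student_logits, T)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

# When the student matches the teacher exactly, the loss is zero.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 0.0
```

Minimizing this loss over node predictions is what drives the lightweight student to reproduce the expensive teacher's behavior.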

Traffic-Aware Transmission Mode Selection in D2D-enabled Cellular Networks with Token System

no code implementations • 2 Mar 2017 • Yiling Yuan, Tao Yang, Hui Feng, Bo Hu, Jianqiu Zhang, Bin Wang, Qiyong Lu

We consider a D2D-enabled cellular network where user equipments (UEs) owned by rational users are incentivized to form D2D pairs using tokens.
