1 code implementation • 8 Nov 2024 • Kai Zhao, Xuhao Li, Qiyu Kang, Feng Ji, Qinxu Ding, Yanan Zhao, Wenfei Liang, Wee Peng Tay
We introduce the Distributed-order fRActional Graph Operating Network (DRAGON), a novel continuous Graph Neural Network (GNN) framework that incorporates distributed-order fractional calculus.
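As context for the entry above, here is a minimal numerical sketch of distributed-order fractional dynamics on a graph: a weighted mixture of Grünwald–Letnikov fractional derivatives drives a simple Laplacian diffusion. The order grid, weights, step size, and diffusion right-hand side are illustrative assumptions; this is not the DRAGON implementation.

```python
# Hedged sketch (not the DRAGON implementation): distributed-order fractional
# graph diffusion, D^w x(t) = -L x(t), where D^w averages Grunwald-Letnikov (GL)
# fractional derivatives D^alpha over a weight distribution w(alpha).
import numpy as np

def gl_coeffs(alpha, n):
    """GL coefficients c_k = (-1)^k * binom(alpha, k), for k = 0..n."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for k in range(1, n + 1):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    return c

def distributed_order_diffusion(L, x0, alphas, weights, h=0.05, steps=100):
    """Explicit GL scheme for sum_j w_j * D^{alpha_j} x = -L x (graph diffusion)."""
    coeffs = [gl_coeffs(a, steps) for a in alphas]               # per-order GL weights
    scale = np.array([w * h ** (-a) for w, a in zip(weights, alphas)])
    xs = [x0]
    for n in range(1, steps + 1):
        f = -L @ xs[-1]                                          # right-hand side at t_{n-1}
        # History ("memory") term: sum over orders and past states, excluding x_n.
        hist = sum(s * sum(c[k] * xs[n - k] for k in range(1, n + 1))
                   for s, c in zip(scale, coeffs))
        xs.append((f - hist) / scale.sum())                      # solve for x_n
    return np.stack(xs)

# Toy example: 4-node path graph, one-hot initial feature, two fractional orders.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
L = np.diag(A.sum(1)) - A
x = distributed_order_diffusion(L, np.array([1., 0., 0., 0.]),
                                alphas=[0.6, 0.9], weights=[0.5, 0.5])
print(x[-1])  # diffused node features after integration
```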
no code implementations • 6 Sep 2024 • Yanan Zhao, Xingchao Jian, Feng Ji, Wee Peng Tay, Antonio Ortega
We introduce a novel uncertainty principle for generalized graph signals that extends classical time-frequency and graph uncertainty principles into a unified framework.
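As background for the entry above, these are the classical quantities such a unified framework must cover: the Heisenberg time-frequency uncertainty principle and the vertex/spectral spreads commonly used in graph signal processing. They are standard definitions, not the paper's generalized formulation.

```latex
% Background sketch (standard definitions, not the paper's generalized framework).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

For $f \in L^2(\mathbb{R})$ with $\|f\|_2 = 1$ and angular-frequency Fourier
transform $\hat{f}$, the Heisenberg time-frequency uncertainty principle reads
\[
  \sigma_t^2 \, \sigma_\omega^2 \;\ge\; \tfrac{1}{4},
  \qquad
  \sigma_t^2 = \int_{\mathbb{R}} (t - \bar{t})^2 \, |f(t)|^2 \, dt,
  \quad
  \sigma_\omega^2 = \frac{1}{2\pi}\int_{\mathbb{R}} (\omega - \bar{\omega})^2 \, |\hat{f}(\omega)|^2 \, d\omega .
\]
On a graph with Laplacian $L$ and geodesic distance $d(\cdot,\cdot)$, a unit-norm
signal $x$ is commonly assigned a vertex spread about a node $u_0$ and a spectral
spread,
\[
  \Delta_g^2(x) = \sum_{v} d(u_0, v)^2 \, x(v)^2 ,
  \qquad
  \Delta_s^2(x) = x^\top L x ,
\]
and graph uncertainty principles characterize how small both can be simultaneously.

\end{document}
```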
no code implementations • 25 May 2024 • Wenfei Liang, Yanan Zhao, Rui She, Yiming Li, Wee Peng Tay
Personalized subgraph Federated Learning (FL) is a task that customizes Graph Neural Networks (GNNs) to individual client needs, accommodating diverse data distributions.
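As an illustration of the personalization idea described above, here is a minimal sketch of one generic pattern for personalized FL, assuming a shared backbone averaged across clients (FedAvg-style) while client-specific heads stay local; it is not the method proposed in the paper.

```python
# Hedged sketch of one generic personalization pattern for subgraph FL (not the
# paper's method): clients average a shared "backbone" and keep local "heads".
import numpy as np

def fedavg(client_params, sizes, shared_keys):
    """Size-weighted average of the shared parameters; personalized keys are untouched."""
    total = float(sum(sizes))
    return {k: sum(s / total * p[k] for p, s in zip(client_params, sizes))
            for k in shared_keys}

# Toy parameters: 3 clients, a shared backbone weight and a personal head.
rng = np.random.default_rng(0)
clients = [{"backbone.W": rng.normal(size=(4, 4)),
            "head.W": rng.normal(size=(4, 2))} for _ in range(3)]
subgraph_sizes = [120, 80, 200]                      # nodes held by each client

global_shared = fedavg(clients, subgraph_sizes, shared_keys=["backbone.W"])
for c in clients:                                    # broadcast the shared part only;
    c.update(global_shared)                          # each head stays personalized
print(clients[0]["backbone.W"][0, :3])
```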
1 code implementation • 20 Feb 2024 • Yanan Zhao, Yuelong Li, Haichuan Zhang, Vishal Monga, Yonina C. Eldar
Through extensive experimental studies, we verify that our approach achieves competitive performance with state-of-the-art unrolled layer-specific learning and significantly improves over the traditional half-quadratic splitting (HQS) algorithm.
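For reference, here is a minimal sketch of the traditional (non-unrolled) HQS baseline mentioned above, applied to a generic sparse recovery problem; the operator, penalty, and parameter choices are illustrative assumptions rather than the paper's image restoration setup.

```python
# Hedged sketch of classical half-quadratic splitting (HQS) for
# min_x 0.5*||A x - y||^2 + lam*||x||_1, with an auxiliary variable z ~ x.
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def hqs(A, y, lam=0.1, mu=1.0, iters=100):
    n = A.shape[1]
    x = np.zeros(n)
    AtA, Aty = A.T @ A, A.T @ y
    for _ in range(iters):
        z = soft_threshold(x, lam / mu)                          # proximal (prior) step
        x = np.linalg.solve(AtA + mu * np.eye(n), Aty + mu * z)  # quadratic (data) step
    return x

# Toy example with a sparse ground truth and a random measurement operator.
rng = np.random.default_rng(0)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 8, replace=False)] = rng.normal(size=8)
y = A @ x_true + 0.01 * rng.normal(size=60)
x_hat = hqs(A, y, lam=0.05, mu=1.0)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # relative error
```

Unrolled approaches typically replace such fixed penalty parameters and hand-crafted proximal steps with learned, layer-specific components, which is the comparison referenced in the snippet above.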
no code implementations • 9 Jan 2024 • Qiyu Kang, Kai Zhao, Yang Song, Yihang Xie, Yanan Zhao, Sijie Wang, Rui She, Wee Peng Tay
In this work, we rigorously investigate the robustness of graph neural fractional-order differential equation (FDE) models.
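As a generic illustration of the kind of probe a robustness study relies on, here is a minimal sketch that measures how much a simple graph model's output changes under small input-feature perturbations; the one-step propagation model is an illustrative stand-in, not a graph neural FDE model or the paper's analysis.

```python
# Hedged sketch of a generic robustness probe (not the paper's analysis):
# compare a graph model's output on clean inputs against perturbed inputs.
import numpy as np

def propagate(A, X):
    """Symmetric-normalized one-step propagation, a minimal graph-model stand-in."""
    d = A.sum(1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return np.tanh(D_inv_sqrt @ A @ D_inv_sqrt @ X)

def output_sensitivity(A, X, eps=0.1, trials=20, seed=0):
    """Mean relative output change under Gaussian feature perturbations of size eps."""
    rng = np.random.default_rng(seed)
    clean = propagate(A, X)
    ratios = []
    for _ in range(trials):
        X_pert = X + eps * rng.normal(size=X.shape)
        ratios.append(np.linalg.norm(propagate(A, X_pert) - clean) / np.linalg.norm(clean))
    return float(np.mean(ratios))

A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
X = np.eye(4)
print(output_sensitivity(A, X, eps=0.1))
```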
no code implementations • 22 Apr 2023 • Gong Chen, Yanan Zhao, Yi Wang, Kim-Hui Yap
Recently, synthetic aperture radar (SAR) image change detection has become an interesting yet challenging research direction due to the presence of speckle noise.
1 code implementation • 22 Feb 2022 • Jingyi Xu, Zirui Li, Li Gao, Junyi Ma, Qi Liu, Yanan Zhao
In this work, different exploration methods for DRL, including adding action-space noise and parameter-space noise, are compared against each other during the transfer learning process, as sketched below.
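For context, here is a minimal sketch contrasting the two exploration styles compared above, assuming a toy deterministic linear policy: Gaussian noise added to the action versus Gaussian noise added to the policy parameters before acting. This is a generic illustration, not the paper's setup.

```python
# Hedged sketch of the two exploration styles (generic, not the paper's setup):
# action-space noise vs. parameter-space noise for a toy linear policy a = W s.
import numpy as np

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(2, 4))        # toy deterministic linear policy weights

def act_action_noise(state, sigma=0.2):
    """Action-space exploration: perturb the policy output."""
    return W @ state + sigma * rng.normal(size=2)

def act_parameter_noise(state, sigma=0.05):
    """Parameter-space exploration: perturb the policy weights, then act greedily."""
    W_noisy = W + sigma * rng.normal(size=W.shape)
    return W_noisy @ state

state = rng.normal(size=4)
print("action-space noise:   ", act_action_noise(state))
print("parameter-space noise:", act_parameter_noise(state))
```

A practical distinction is that action-space noise is drawn independently at every step, whereas parameter-space noise perturbs the policy itself, producing state-dependent and more temporally consistent exploration behavior.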