no code implementations • 10 Jan 2025 • Wei Ruan, Yanjun Lyu, Jing Zhang, Jiazhang Cai, Peng Shu, Yang Ge, Yao Lu, Shang Gao, Yue Wang, Peilong Wang, Lin Zhao, Tao Wang, Yufang Liu, Luyang Fang, Ziyu Liu, Zhengliang Liu, Yiwei Li, Zihao Wu, JunHao Chen, Hanqi Jiang, Yi Pan, Zhenyuan Yang, Jingyuan Chen, Shizhe Liang, Wei Zhang, Terry Ma, Yuan Dou, Jianli Zhang, Xinyu Gong, Qi Gan, Yusong Zou, Zebang Chen, Yuanxin Qian, Shuo Yu, Jin Lu, Kenan Song, Xianqiao Wang, Andrea Sikora, Gang Li, Xiang Li, Quanzheng Li, Yingfeng Wang, Lu Zhang, Yohannes Abate, Lifang He, Wenxuan Zhong, Rongjie Liu, Chao Huang, Wei Liu, Ye Shen, Ping Ma, Hongtu Zhu, Yajun Yan, Dajiang Zhu, Tianming Liu
With the rapid advancements in large language model (LLM) technology and the emergence of bioinformatics-specific language models (BioLMs), there is a growing need for a comprehensive analysis of the current landscape, computational characteristics, and diverse applications of these models.
1 code implementation • 2 Jan 2025 • Shuo Yu, Shan Jin, Ming Li, Tabinda Sarwar, Feng Xia
Furthermore, by employing the transformer framework, ALERT adaptively integrates both short- and long-range dependencies between brain ROIs, enabling a holistic understanding of multi-level communication across the entire brain.
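As a rough illustration of the idea in this excerpt (not ALERT itself), a transformer layer lets every ROI attend to every other ROI, so short- and long-range dependencies are mixed in a single step; the ROI count and feature size below are hypothetical, not taken from the paper.

```python
# Minimal sketch: self-attention over brain ROI tokens. Shapes are
# illustrative placeholders, not the dimensions used by ALERT.
import torch
import torch.nn as nn

num_rois, feat_dim = 90, 64                         # hypothetical sizes
roi_features = torch.randn(1, num_rois, feat_dim)   # (batch, ROIs, features)

# One transformer encoder layer: attention connects every ROI pair, so
# short- and long-range dependencies are captured simultaneously.
layer = nn.TransformerEncoderLayer(d_model=feat_dim, nhead=4, batch_first=True)
mixed = layer(roi_features)                         # (1, num_rois, feat_dim)
print(mixed.shape)
```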
no code implementations • 2 Jan 2025 • Shuo Yu, Yingbo Wang, Ruolin Li, Guchun Liu, Yanming Shen, Shaoxiong Ji, Bowen Li, Fengling Han, Xiuzhen Zhang, Feng Xia
In this paper, we present a comprehensive review of methodologies for applying LLMs to graphs, termed LLM4graph.
1 code implementation • 9 Sep 2024 • Jie Ouyang, Yucong Luo, Mingyue Cheng, Daoyu Wang, Shuo Yu, Qi Liu, Enhong Chen
This paper presents the solution of our team APEX in the Meta KDD CUP 2024: CRAG Comprehensive RAG Benchmark Challenge.
2 code implementations • 3 Sep 2024 • Shuo Yu, Mingyue Cheng, Jiqian Yang, Jie Ouyang, Yucong Luo, Chenyi Lei, Qi Liu, Enhong Chen
Retrieval-augmented generation (RAG) is increasingly recognized as an effective approach for mitigating the hallucination of large language models (LLMs) through the integration of external knowledge.
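A minimal, generic sketch of the RAG loop this excerpt describes (not the paper's specific system): retrieve the passages most similar to the query, then ground the LLM's prompt on them. Here `embed` is a toy stand-in for a real sentence encoder.

```python
# Generic RAG sketch: retrieve top-k passages by embedding similarity,
# then condition the LLM prompt on the retrieved context.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding; in practice, use a trained sentence encoder.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

corpus = ["Doc about graphs.", "Doc about RAG.", "Doc about LLMs."]
doc_vecs = np.stack([embed(d) for d in corpus])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vecs @ embed(query)            # cosine similarity (unit vectors)
    return [corpus[i] for i in np.argsort(-scores)[:k]]

query = "How does RAG reduce hallucination?"
prompt = "Context:\n" + "\n".join(retrieve(query)) + f"\n\nQuestion: {query}"
print(prompt)  # feed this grounded prompt to the LLM instead of the bare query
```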
no code implementations • 13 Jul 2024 • Ahsan Shehzad, Feng Xia, Shagufta Abid, Ciyuan Peng, Shuo Yu, Dongyu Zhang, Karin Verspoor
Graph transformers are a recent advancement in machine learning, offering a new class of neural network models for graph-structured data.
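A minimal sketch of the core idea behind this model class: standard attention augmented with a structural bias from the adjacency matrix, so attention scores respect the graph. Real graph transformers use richer structural encodings; all shapes and the bias value here are illustrative.

```python
# Sketch of graph-transformer attention: dot-product attention plus an
# additive bias on scores between adjacent nodes (toy, untrained).
import torch

n, d = 5, 8
x = torch.randn(n, d)                       # node features
adj = (torch.rand(n, n) > 0.5).float()
adj = torch.maximum(adj, adj.T)             # symmetrize the random adjacency

edge_bias = 1.0                             # hypothetical learned scalar bias
scores = (x @ x.T) / d**0.5 + edge_bias * adj
attn = torch.softmax(scores, dim=-1)        # attention favors graph neighbors
out = attn @ x                              # structure-aware node updates
print(out.shape)
```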
no code implementations • 7 Jul 2024 • Shaoyuan Chen, Linlin You, Rui Liu, Shuo Yu, Ahmed M. Abdelmoniem
Compared with solutions based on centralized data centers, updating large models in the Internet of Things (IoT) faces the challenge of coordinating knowledge from distributed clients that hold private and heterogeneous data.
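For context on the setting the excerpt describes, here is a generic federated-averaging step, the textbook way to coordinate distributed clients without sharing their raw data; this is an illustration of the problem setting, not the paper's proposed method, and the model and data are toys.

```python
# Generic federated averaging sketch: each client takes a local step on its
# private data; the server averages the resulting models.
import numpy as np

global_model = np.zeros(4)                       # toy model parameters

def client_update(model, private_data, lr=0.1):
    # Toy local step: move the model toward the client's data mean
    # (gradient step on a simple least-squares objective).
    grad = model - private_data.mean(axis=0)
    return model - lr * grad

# Heterogeneous private datasets: each client's data has a different mean.
clients = [np.random.randn(10, 4) + i for i in range(3)]
local_models = [client_update(global_model, data) for data in clients]
global_model = np.mean(local_models, axis=0)     # server-side aggregation
print(global_model)
```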
1 code implementation • 20 Jun 2024 • Erkang Jing, Yezheng Liu, Yidong Chai, Shuo Yu, Longshun Liu, Yuanchun Jiang, Yang Wang
Ablation studies and case studies further validate the effectiveness of our HDBN.
no code implementations • 7 Jun 2024 • Zhen Cai, Tao Tang, Shuo Yu, Yunpeng Xiao, Feng Xia
We design a blockchain-based traceability mechanism, ensuring data privacy during data sharing and model updates.
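As a toy illustration of the traceability idea (not the paper's blockchain design), records can form a hash chain: each entry commits to the hash of its predecessor, so tampering with any past data-sharing or model-update record invalidates every later hash.

```python
# Minimal hash-chain sketch of traceable data-sharing / model-update records.
import hashlib, json, time

chain = []

def append_record(payload: dict) -> dict:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)
    return block

append_record({"event": "model_update", "client": "node-7"})
append_record({"event": "data_share", "client": "node-3"})

# Verify integrity: every block must reference its predecessor's hash.
print(all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain))))
```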
no code implementations • 7 Jun 2024 • Xu Yuan, Na Zhou, Shuo Yu, Huafei Huang, Zhikui Chen, Feng Xia
Such patterns can be modeled by higher-order network structures, thus benefiting anomaly detection on attributed networks.
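A toy sketch of how a higher-order structure can feed anomaly detection (illustrative only, not the paper's method): use each node's triangle participation as a structural feature and flag strong outliers.

```python
# Sketch: triangle participation as a higher-order feature for flagging
# structurally unusual nodes (toy z-score rule, not the paper's detector).
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
tri = nx.triangles(G)                       # triangles each node belongs to
counts = np.array([tri[n] for n in G.nodes])

# Flag nodes whose triangle participation deviates strongly from the mean;
# in attributed networks this would be combined with node attributes.
z = (counts - counts.mean()) / counts.std()
anomalies = [n for n, score in zip(G.nodes, z) if abs(score) > 2]
print(anomalies)
```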
no code implementations • 7 Jun 2024 • Shuo Yu, Feng Xia, Yueru Wang, Shihao Li, Falih Febrinanto, Madhu Chetty
To address the current limitations, we propose a deep graph learning model, called PANDORA, which predicts COVID-19 infection risks by considering all essential factors and integrating them into a geographical network.
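A heavily simplified sketch of the geographical-network idea (not PANDORA itself): regions are nodes, edges link neighboring regions, node features hold hypothetical risk factors, and one normalized propagation step (the building block of graph convolution) mixes each region's factors with its neighbors'.

```python
# Toy geographical network: 4 regions with made-up adjacency and factors.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # region adjacency (hypothetical)
X = np.random.rand(4, 3)                    # per-region risk factors (toy)

# One normalized propagation step: each region averages over itself and
# its neighbors, the core operation of a graph convolution layer.
A_hat = A + np.eye(4)
D_inv = np.diag(1.0 / A_hat.sum(axis=1))
risk_embedding = D_inv @ A_hat @ X
print(risk_embedding)
```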
no code implementations • 7 Jun 2024 • Shuo Yu, Fayez Alqahtani, Amr Tolba, Ivan Lee, Tao Jia, Feng Xia
The simulation results indicate that CORE reveals underlying patterns of scientific collaboration: senior scholars maintain broad collaborative relationships and fixed collaboration patterns, which serve as the underlying mechanisms of team assembly.
no code implementations • 27 May 2024 • Renqiang Luo, Huafei Huang, Shuo Yu, Zhuoyang Han, Estrid He, Xiuzhen Zhang, Feng Xia
We explore the correlation between sensitive features and spectrum in GNNs, using theoretical analysis to delineate the similarity between original sensitive features and those after convolution under different spectra.
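The following toy computation illustrates the phenomenon the excerpt studies, without reproducing the paper's theoretical analysis: how similar a sensitive-feature vector stays to itself after graph convolution depends on the spectral filter applied (the graph and feature values are made up).

```python
# Toy check: similarity of a sensitive feature to its filtered version
# under a low-pass vs. a high-pass graph filter.
import numpy as np

# 4-node path graph: 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A              # graph Laplacian
s = np.array([1.0, 1.0, -1.0, -1.0])        # hypothetical sensitive feature

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

low_pass = (np.eye(4) - 0.3 * L) @ s        # smoothing filter
high_pass = (0.3 * L) @ s                   # difference filter
# The two filters preserve the sensitive feature to different degrees,
# which is the spectrum-dependence the excerpt refers to.
print(cos(s, low_pass), cos(s, high_pass))
```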
1 code implementation • 26 Apr 2024 • Renqiang Luo, Huafei Huang, Shuo Yu, Xiuzhen Zhang, Feng Xia
The design of Graph Transformers (GTs) generally neglects considerations for fairness, resulting in biased outcomes against certain sensitive subgroups.
no code implementations • 6 Feb 2024 • Huiling Tu, Shuo Yu, Vidya Saikrishna, Feng Xia, Karin Verspoor
Knowledge graphs (KGs) have garnered significant attention for their vast potential across diverse domains.
no code implementations • 25 Jan 2024 • Lei Liu, Shuo Yu, Runze Wang, Zhenxun Ma, Yanming Shen
We tackle the data mismatch by proposing: 1) STG-Tokenizer: This spatial-temporal graph tokenizer transforms intricate graph data into concise tokens capturing both spatial and temporal relationships; 2) STG-Adapter: This minimalistic adapter, consisting of linear encoding and decoding layers, bridges the gap between tokenized data and LLM comprehension.
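A minimal sketch of the two-part pipeline this excerpt outlines, with all shapes hypothetical: flatten a spatial-temporal graph signal into per-step tokens, then map the tokens into an LLM's embedding space with a linear encoder and back out with a linear decoder.

```python
# Sketch of the tokenizer + linear-adapter pattern (shapes are illustrative).
import torch
import torch.nn as nn

num_nodes, num_steps, feat = 10, 12, 4      # hypothetical graph dimensions
llm_dim = 64                                # hypothetical LLM hidden size
signal = torch.randn(num_steps, num_nodes, feat)

# "Tokenizer": one token per time step, concatenating all node features so a
# token carries spatial (across nodes) and temporal (its position) context.
tokens = signal.reshape(num_steps, num_nodes * feat)

encoder = nn.Linear(num_nodes * feat, llm_dim)   # adapter: encode for the LLM
decoder = nn.Linear(llm_dim, num_nodes * feat)   # adapter: decode predictions

llm_inputs = encoder(tokens)                # (num_steps, llm_dim) -> to the LLM
reconstructed = decoder(llm_inputs)         # LLM-space vectors mapped back
print(llm_inputs.shape, reconstructed.shape)
```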
no code implementations • 12 Jun 2023 • Feng Xia, Xin Chen, Shuo Yu, Mingliang Hou, Mujie Liu, Linlin You
To address this issue, we propose a coupled attention-based neural network framework (CAN) for anomaly detection in multivariate time series data featuring dynamic variable relationships.
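An illustrative sketch of the general pattern (not CAN itself): attention across the variables of a multivariate time series models their relationships, and points the attention reconstructs poorly are treated as suspicious; the dimensions and scoring rule are made up.

```python
# Sketch: inter-variable attention with reconstruction error as an
# anomaly score (toy, untrained; not the paper's architecture).
import torch
import torch.nn as nn

num_vars, window = 6, 50                    # hypothetical series dimensions
series = torch.randn(window, num_vars)

# Treat each variable's window as a token so attention models the
# (possibly dynamic) relationships between variables.
attn = nn.MultiheadAttention(embed_dim=window, num_heads=1, batch_first=True)
tokens = series.T.unsqueeze(0)              # (1, num_vars, window)
recon, weights = attn(tokens, tokens, tokens)

# Variables poorly reconstructed from inter-variable attention score high.
score = (recon - tokens).abs().mean(dim=-1)
print(score)                                # per-variable anomaly scores
```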
1 code implementation • 25 May 2023 • Shuo Yu, Hongyan Xue, Xiang Ao, Feiyang Pan, Jia He, Dandan Tu, Qing He
In practice, a set of formulaic alphas is often used in combination for better modeling precision, so we need to find synergistic sets of formulaic alphas that work well together.
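A toy demonstration of why alphas are evaluated as a set rather than one by one: individually weak signals can be stronger in combination. The signal construction and weights below are hypothetical, not from the paper.

```python
# Toy synergy check: the information coefficient (IC) of two weak alphas
# vs. their equal-weight combination.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_normal(500)          # toy future returns

# Two weak "alphas": each correlates only mildly with the returns.
alpha1 = 0.2 * returns + rng.standard_normal(500)
alpha2 = 0.2 * returns + rng.standard_normal(500)

def ic(signal, target):
    return np.corrcoef(signal, target)[0, 1]   # information coefficient

combined = alpha1 + alpha2                  # equal-weight combination
# The combined IC is typically higher than either single-alpha IC, which
# is the kind of synergy an alpha-set search is after.
print(ic(alpha1, returns), ic(alpha2, returns), ic(combined, returns))
```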
no code implementations • 2 Feb 2023 • Shuo Yu, Ciyuan Peng, Yingbo Wang, Ahsan Shehzad, Feng Xia, Edwin R. Hancock
However, leveraging quantum theory to enhance graph learning is still in its infancy.
no code implementations • 6 Apr 2022 • Feng Xia, Shuo Yu, Chengfei Liu, Ivan Lee
In the first procedure, we propose to lower the network scale by optimizing the network structure with maximal k-edge-connected subgraphs.
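As a sketch of the subgraph notion involved (illustrative; not the paper's optimization procedure), networkx can extract maximal k-edge-connected components, which keep a network's densely connected cores and discard the weakly attached periphery.

```python
# Sketch: shrink a network to its maximal k-edge-connected cores.
import networkx as nx

G = nx.karate_club_graph()
k = 3
components = list(nx.k_edge_components(G, k=k))

# Keep nodes inside a k-edge-connected component of size > 1, dropping
# weakly attached periphery to lower the network scale.
core_nodes = set().union(*(c for c in components if len(c) > 1))
H = G.subgraph(core_nodes)
print(G.number_of_nodes(), "->", H.number_of_nodes())
```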
1 code implementation • 17 Mar 2022 • Shuo Yu, Huafei Huang, Minh N. Dao, Feng Xia
To better demonstrate the advantages of GAL, we experimentally validate the effectiveness and adaptability of different GAL strategies on different downstream tasks.
no code implementations • 3 May 2021 • Feng Xia, Ke Sun, Shuo Yu, Abdul Aziz, Liangtian Wan, Shirui Pan, Huan Liu
In this survey, we present a comprehensive overview of the state of the art in graph learning.
no code implementations • 7 Mar 2021 • Ke Sun, Jiaying Liu, Shuo Yu, Bo Xu, Feng Xia
Feature representation plays a powerful role in network analysis tasks.
no code implementations • 27 Aug 2020 • Shuo Yu, Feng Xia, Jin Xu, Zhikui Chen, Ivan Lee
In order to assess the efficiency of the proposed framework, four popular network representation algorithms are modified and examined.
no code implementations • 9 Aug 2020 • Jin Xu, Shuo Yu, Ke Sun, Jing Ren, Ivan Lee, Shirui Pan, Feng Xia
Therefore, in graph learning tasks on social networks, identifying and utilizing multivariate relationship information is especially important.
no code implementations • 9 Aug 2020 • Hayat Dino Bedru, Shuo Yu, Xinru Xiao, Da Zhang, Liangtian Wan, He Guo, Feng Xia
This paper proposes a guideline framework that gives an insight into the major topics in the area of network science from the viewpoint of a big network.