1 code implementation • 25 Aug 2023 • Ensheng Shi, Fengji Zhang, Yanlin Wang, Bei Chen, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun
To meet the demands of this dynamic field, there is a growing need for an effective software development assistant.
1 code implementation • 6 Jun 2023 • Jiayan Guo, Lun Du, Xu Chen, Xiaojun Ma, Qiang Fu, Shi Han, Dongmei Zhang, Yan Zhang
Graph CF has attracted increasing attention in recent years due to its effectiveness in leveraging high-order information in the user-item bipartite graph for better recommendations.
no code implementations • 24 May 2023 • Chongjian Yue, Xinrun Xu, Xiaojun Ma, Lun Du, Hengyu Liu, Zhiming Ding, Yanbing Jiang, Shi Han, Dongmei Zhang
We propose an Automated Financial Information Extraction (AFIE) framework that enhances LLMs' ability to comprehend and extract information from financial reports.
no code implementations • 24 May 2023 • Jiayan Guo, Lun Du, Hengyu Liu, Mengyu Zhou, Xinyi He, Shi Han
In this study, we conduct an extensive investigation to assess the proficiency of LLMs in comprehending graph data, employing a diverse range of structural and semantic-related tasks.
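A minimal sketch of the kind of structural prompt such an investigation relies on, serializing a toy graph into text for an LLM; the prompt template and the degree-counting task are illustrative assumptions, not the paper's protocol:

```python
import networkx as nx

def graph_to_prompt(g: nx.Graph, question: str) -> str:
    """Serialize a graph's edge list into a textual prompt for an LLM."""
    edges = ", ".join(f"({u}, {v})" for u, v in g.edges())
    return (
        f"You are given an undirected graph with nodes {sorted(g.nodes())} "
        f"and edges {edges}. {question}"
    )

g = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0)])
# A structural task: the model must reason over connectivity, not just text.
print(graph_to_prompt(g, "What is the degree of node 1?"))
```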
no code implementations • 22 May 2023 • Hongjun Wang, Jiyuan Chen, Lun Du, Qiang Fu, Shi Han, Xuan Song
Recent years have witnessed the great potential of the attention mechanism in graph representation learning.
1 code implementation • 11 Apr 2023 • Ensheng Shi, Yanlin Wang, Hongyu Zhang, Lun Du, Shi Han, Dongmei Zhang, Hongbin Sun
Our experimental study shows that (1) lexical, syntactic and structural properties of source code are encoded in the lower, intermediate, and higher layers, respectively, while the semantic property spans across the entire model.
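A hedged sketch of layer-wise probing in this spirit: extract each layer's hidden states from a pretrained code model and fit a linear probe per layer; the checkpoint and the toy property being probed are illustrative choices, not the paper's setup.

```python
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base", output_hidden_states=True)

snippets = ["def add(a, b): return a + b", "for i in range(10): print(i)"]
labels = [0, 1]  # a toy property to probe, e.g., "contains a loop"

with torch.no_grad():
    enc = tokenizer(snippets, return_tensors="pt", padding=True)
    hidden = model(**enc).hidden_states  # tuple: embeddings + one entry per layer

# Probe each layer: higher accuracy = property is more linearly decodable there.
for layer, states in enumerate(hidden):
    feats = states.mean(dim=1).numpy()  # mean-pool token representations
    probe = LogisticRegression(max_iter=1000).fit(feats, labels)
    print(layer, probe.score(feats, labels))
```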
1 code implementation • 16 Feb 2023 • Jincheng Huang, Lun Du, Xu Chen, Qiang Fu, Shi Han, Dongmei Zhang
Theoretical analyses guarantee the robustness of signals passed through the mid-pass filter, and we also shed light on the behavior of different frequency signals under adversarial attacks.
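As a rough sketch of the idea, a mid-pass filter can be composed from the standard low-pass propagation matrix and its high-pass complement, attenuating both frequency extremes; the paper's exact filter may differ.

```python
import torch

def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

def mid_pass_filter(adj: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    # Low-pass A_hat smooths features; high-pass (I - A_hat) sharpens them.
    # Composing the two keeps mid-frequency signal (an illustrative choice).
    a_hat = normalized_adjacency(adj)
    identity = torch.eye(adj.size(0))
    return a_hat @ (identity - a_hat) @ x

adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
x = torch.randn(3, 4)  # node features
print(mid_pass_filter(adj, x).shape)  # torch.Size([3, 4])
```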
no code implementations • 13 Feb 2023 • Jiayan Guo, Lun Du, Wendong Bi, Qiang Fu, Xiaojun Ma, Xu Chen, Shi Han, Dongmei Zhang, Yan Zhang
To this end, we propose HDHGR, a homophily-oriented deep heterogeneous graph rewiring approach that modifies the HG structure to improve the performance of HGNNs.
no code implementations • 22 Oct 2022 • Feifan Li, Lun Du, Qiang Fu, Shi Han, Yushu Du, Guangming Lu, Zi Li
Furthermore, based on the dynamic user intent representations, we propose a meta predictor to perform differentiated user engagement forecasting.
no code implementations • 4 Oct 2022 • Lunyiu Nie, Jiuding Sun, Yanlin Wang, Lun Du, Lei Hou, Juanzi Li, Shi Han, Dongmei Zhang, Jidong Zhai
The recent prevalence of pretrained language models (PLMs) has dramatically shifted the paradigm of semantic parsing, where the mapping from natural language utterances to structured logical forms is now formulated as a Seq2Seq task.
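A minimal sketch of this Seq2Seq formulation with an off-the-shelf PLM; T5 and the prompt format are illustrative stand-ins, not the paper's configuration:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Semantic parsing as Seq2Seq: natural language utterance in, logical form out.
utterance = "translate to SQL: show all employees hired after 2020"
inputs = tokenizer(utterance, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```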
no code implementations • 17 Sep 2022 • Wendong Bi, Lun Du, Qiang Fu, Yanlin Wang, Shi Han, Dongmei Zhang
Graph Neural Networks (GNNs) are popular machine learning methods for modeling graph data.
Ranked #5 on Node Classification on Squirrel
no code implementations • 25 Aug 2022 • Hengyu Liu, Qiang Fu, Lun Du, Tiancheng Zhang, Ge Yu, Shi Han, Dongmei Zhang
The learning rate is one of the most important hyper-parameters and has a significant influence on neural network training.
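For context, a common baseline for adapting the learning rate during training is a plateau-based scheduler, as in the sketch below; the paper's method goes beyond this simple heuristic.

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate once the validation loss stops improving.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=5
)

for epoch in range(25):
    val_loss = max(0.1, 1.0 - 0.1 * epoch)  # placeholder: improves, then plateaus
    scheduler.step(val_loss)
    print(epoch, optimizer.param_groups[0]["lr"])
```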
1 code implementation • 15 Aug 2022 • Wendong Bi, Lun Du, Qiang Fu, Yanlin Wang, Shi Han, Dongmei Zhang
Graph Neural Networks (GNNs) have shown expressive performance on graph representation learning by aggregating information from neighbors.
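A minimal sketch of that aggregation step, mean-pooling neighbor features followed by a linear transform; it illustrates the mechanism shared by most GNNs, not this paper's architecture:

```python
import torch
import torch.nn as nn

class MeanAggregationLayer(nn.Module):
    """One GNN layer: average neighbor features, then transform."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neighbor_mean = (adj @ x) / deg  # mean over each node's neighbors
        return torch.relu(self.linear(neighbor_mean))

adj = torch.tensor([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
x = torch.randn(3, 8)  # 3 nodes, 8-dimensional features
print(MeanAggregationLayer(8, 16)(adj, x).shape)  # torch.Size([3, 16])
```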
no code implementations • 7 Apr 2022 • Ensheng Shi, Yanlin Wang, Wenchao Gu, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun
However, there is still a lot of room for improvement in using contrastive learning for code search.
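A minimal sketch of the in-batch contrastive (InfoNCE) objective commonly used as a starting point in this line of work; the dimensions and temperature are illustrative:

```python
import torch
import torch.nn.functional as F

def info_nce(query_emb: torch.Tensor, code_emb: torch.Tensor, tau: float = 0.07):
    """In-batch contrastive loss: each query's positive is its paired code."""
    q = F.normalize(query_emb, dim=-1)
    c = F.normalize(code_emb, dim=-1)
    logits = q @ c.T / tau             # cosine similarities, temperature-scaled
    labels = torch.arange(q.size(0))   # diagonal entries are the positives
    return F.cross_entropy(logits, labels)

queries = torch.randn(32, 256)  # batch of query embeddings
codes = torch.randn(32, 256)    # matching code-snippet embeddings
print(info_nce(queries, codes))
```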
no code implementations • ACL 2022 • Wenchao Gu, Yanlin Wang, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Michael R. Lyu
Code search aims to retrieve reusable code snippets from a source code corpus based on natural language queries.
2 code implementations • 5 Mar 2022 • Ensheng Shi, Yanlin Wang, Wei Tao, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun
Furthermore, RACE can boost the performance of existing Seq2Seq models in commit message generation.
no code implementations • 25 Feb 2022 • Chongjian Yue, Lun Du, Qiang Fu, Wendong Bi, Hengyu Liu, Yu Gu, Di Yao
The Temporal Link Prediction task of WSDM Cup 2022 expects a single model that works well on two kinds of temporal graphs with quite different characteristics and data properties, predicting whether a link of a given type will occur between two given nodes within a given time span.
no code implementations • 2 Dec 2021 • Haitao Mao, Lun Du, Yujia Zheng, Qiang Fu, Zelin Li, Xu Chen, Shi Han, Dongmei Zhang
They utilize labels from the source domain as the supervision signal and are jointly trained on both the source graph and the target graph.
no code implementations • 30 Nov 2021 • Qiang Fu, Lun Du, Haitao Mao, Xu Chen, Wei Fang, Shi Han, Dongmei Zhang
Based on the analysis results, we articulate the Neuron Steadiness Hypothesis: neurons with similar responses to instances of the same class lead to better generalization.
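One hedged way to operationalize this hypothesis is to penalize the intra-class variance of each neuron's response, as sketched below; the paper's regularizer may differ in detail.

```python
import torch

def steadiness_penalty(activations: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Mean intra-class variance of neuron responses (lower = steadier)."""
    penalty = 0.0
    for c in labels.unique():
        class_acts = activations[labels == c]  # responses to one class
        penalty = penalty + class_acts.var(dim=0, unbiased=False).mean()
    return penalty / labels.unique().numel()

acts = torch.randn(64, 128)              # a layer's activations for a batch
labels = torch.randint(0, 10, (64,))     # class labels
loss = steadiness_penalty(acts, labels)  # add to the task loss with a weight
print(loss)
```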
1 code implementation • 29 Oct 2021 • Lun Du, Xiaozhou Shi, Qiang Fu, Xiaojun Ma, Hengyu Liu, Shi Han, Dongmei Zhang
For node-level tasks, GNNs have strong power to model the homophily property of graphs (i.e., connected nodes are more similar), while their ability to capture the heterophily property is often questioned.
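Edge homophily is commonly quantified as the fraction of edges joining same-label nodes; a minimal sketch:

```python
import torch

def edge_homophily(edge_index: torch.Tensor, labels: torch.Tensor) -> float:
    """Fraction of edges whose endpoints share a label (1.0 = fully homophilous)."""
    src, dst = edge_index
    return (labels[src] == labels[dst]).float().mean().item()

# A 4-node graph: edges given as a 2 x num_edges index tensor.
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
labels = torch.tensor([0, 0, 1, 1])
print(edge_homophily(edge_index, labels))  # 2 of 3 edges match -> ~0.67
```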
1 code implementation • 14 Aug 2021 • Haitao Mao, Xu Chen, Qiang Fu, Lun Du, Shi Han, Dongmei Zhang
Initialization plays a critical role in the training of deep neural networks (DNNs).
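For reference, the standard variance-scaling initializers in PyTorch look like the sketch below; the paper studies initialization beyond such defaults.

```python
import torch.nn as nn

layer = nn.Linear(256, 128)
# Kaiming (He) initialization: variance scaled for ReLU networks.
nn.init.kaiming_normal_(layer.weight, nonlinearity="relu")
nn.init.zeros_(layer.bias)

conv = nn.Conv2d(3, 16, kernel_size=3)
# Xavier (Glorot) initialization: balances input and output variance.
nn.init.xavier_uniform_(conv.weight)
```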
1 code implementation • 15 Jul 2021 • Ensheng Shi, Yanlin Wang, Lun Du, Junjie Chen, Shi Han, Hongyu Zhang, Dongmei Zhang, Hongbin Sun
To achieve a profound understanding of how far we are from solving this problem and to provide suggestions for future research, in this paper we conduct a systematic and in-depth analysis of 5 state-of-the-art neural code summarization models on 6 widely used BLEU variants, 4 pre-processing operations and their combinations, and 3 widely used datasets.
1 code implementation • 12 Jul 2021 • Wei Tao, Yanlin Wang, Ensheng Shi, Lun Du, Shi Han, Hongyu Zhang, Dongmei Zhang, Wenqiang Zhang
We find that: (1) Different variants of the BLEU metric are used in previous works, which affects the evaluation and understanding of existing methods.
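The discrepancy between BLEU variants is easy to reproduce; the sketch below scores one sentence pair with and without smoothing via NLTK, illustrating the issue rather than the paper's exact variants.

```python
from nltk.translate.bleu_score import SmoothingFunction, sentence_bleu

reference = [["fix", "null", "pointer", "in", "parser"]]
candidate = ["fix", "null", "pointer", "bug"]

# Unsmoothed BLEU collapses toward 0 when any n-gram order has no match.
plain = sentence_bleu(reference, candidate)
# Smoothed BLEU (method1 adds epsilon counts) gives a nonzero score.
smoothed = sentence_bleu(
    reference, candidate, smoothing_function=SmoothingFunction().method1
)
print(plain, smoothed)  # the same pair, two very different numbers
```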
1 code implementation • 10 Jul 2021 • Lun Du, Xiaozhou Shi, Yanlin Wang, Ensheng Shi, Shi Han, Dongmei Zhang
On the other hand, as a specific query may focus on one or several perspectives, it is difficult for a single query representation module to represent different user intents.
no code implementations • 6 Jun 2021 • Lun Du, Fei Gao, Xu Chen, Ran Jia, Junshan Wang, Jiang Zhang, Shi Han, Dongmei Zhang
To simultaneously extract spatial and relational information from tables, we propose a novel neural network architecture, TabularNet.
no code implementations • 17 May 2021 • Lun Du, Xu Chen, Fei Gao, Kunqing Xie, Shi Han, Dongmei Zhang
Network embedding aims to learn a function that maps nodes to a Euclidean space, which contributes to multiple learning and analysis tasks on networks.
no code implementations • 30 Nov 2020 • Xu Chen, Yuanxing Zhang, Lun Du, Zheng Fang, Yi Ren, Kaigui Bian, Kunqing Xie
Further analysis indicates that the locality and globality of traffic networks are critical to traffic flow prediction, and that the proposed TSSRGCN model can adapt to various temporal traffic patterns.
no code implementations • 18 Jan 2020 • Mengyuan Chen, Jiang Zhang, Zhang Zhang, Lun Du, Qiao Hu, Shuo Wang, Jiaqi Zhu
We carried out experiments on discrete and continuous time series data.
2 code implementations • 3 Jun 2019 • Yizhou Zhang, Guojie Song, Lun Du, Shu-wen Yang, Yilun Jin
Recent works reveal that network embedding techniques enable many machine learning models to handle diverse downstream tasks on graph structured data.
no code implementations • 19 Apr 2019 • Junshan Wang, Zhicong Lu, Guojie Song, Yue Fan, Lun Du, Wei Lin
Network embedding is a method to learn low-dimensional representation vectors for nodes in complex networks.
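A minimal DeepWalk-style sketch of this idea, feeding uniform random walks over the graph to a skip-gram model; the graph and all parameters are illustrative.

```python
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(g: nx.Graph, num_walks: int = 10, walk_len: int = 20):
    """Generate node sequences by uniform random walks (DeepWalk-style)."""
    walks = []
    for _ in range(num_walks):
        for node in g.nodes():
            walk = [node]
            while len(walk) < walk_len:
                walk.append(random.choice(list(g.neighbors(walk[-1]))))
            walks.append([str(n) for n in walk])  # skip-gram expects tokens
    return walks

g = nx.karate_club_graph()
model = Word2Vec(random_walks(g), vector_size=64, window=5, min_count=0, sg=1)
print(model.wv[str(0)].shape)  # 64-dimensional embedding of node 0
```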