no code implementations • 7 Apr 2022 • Ensheng Shi, Wenchao Gu, Yanlin Wang, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun
In this paper, we propose a new approach with multimodal contrastive learning and soft data augmentation for code search.
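As background, a minimal sketch of the contrastive-learning side of such an approach: an InfoNCE-style loss that pulls matched query/code embedding pairs together and pushes apart all other in-batch pairs. The encoder names and temperature are illustrative assumptions, not the paper's actual model.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query_emb, code_emb, temperature=0.07):
    """InfoNCE-style contrastive loss: matched (query, code) pairs on the
    diagonal are positives; every other in-batch pair is a negative."""
    query_emb = F.normalize(query_emb, dim=-1)
    code_emb = F.normalize(code_emb, dim=-1)
    logits = query_emb @ code_emb.t() / temperature  # [batch, batch] similarity matrix
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)

# Usage with hypothetical encoders producing same-dimensional embeddings:
# loss = info_nce_loss(query_encoder(queries), code_encoder(snippets))
```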
no code implementations • ACL 2022 • Wenchao Gu, Yanlin Wang, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Michael R. Lyu
Code search aims to retrieve reusable code snippets from a source code corpus based on natural language queries.
no code implementations • 5 Mar 2022 • Ensheng Shi, Yanlin Wang, Lun Du, Hongyu Zhang, Shi Han, Dongmei Zhang, Hongbin Sun
The information retrieval-based methods reuse the commit messages of similar code diffs, while the neural-based methods learn the semantic connection between code diffs and commit messages.
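A minimal sketch of the information-retrieval idea described above, reusing the commit message of the most similar training diff; TF-IDF with cosine similarity is an illustrative choice here, not necessarily the retrieval scheme used in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def retrieve_commit_message(new_diff, train_diffs, train_messages):
    """Return the commit message attached to the training diff most similar to new_diff."""
    vectorizer = TfidfVectorizer(token_pattern=r"\S+")
    diff_matrix = vectorizer.fit_transform(train_diffs)    # index the corpus of code diffs
    query_vec = vectorizer.transform([new_diff])
    scores = cosine_similarity(query_vec, diff_matrix)[0]  # similarity to every training diff
    return train_messages[scores.argmax()]
```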
no code implementations • 25 Feb 2022 • Chongjian Yue, Lun Du, Qiang Fu, Wendong Bi, Hengyu Liu, Yu Gu, Di Yao
The Temporal Link Prediction task of WSDM Cup 2022 calls for a single model that works well simultaneously on two kinds of temporal graphs with quite different characteristics and data properties, predicting whether a link of a given type will occur between two given nodes within a given time span.
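For intuition only, a hypothetical scorer matching this task interface: it takes two node ids, an edge type, and a (normalized) time span, and outputs the probability that the link occurs. It is not the competition solution, just an illustration of the prediction setup.

```python
import torch
import torch.nn as nn

class TemporalLinkScorer(nn.Module):
    """Illustrative model: P(edge of given type appears between src and dst in [t_start, t_end])."""
    def __init__(self, num_nodes, num_edge_types, dim=64):
        super().__init__()
        self.node_emb = nn.Embedding(num_nodes, dim)
        self.type_emb = nn.Embedding(num_edge_types, dim)
        self.mlp = nn.Sequential(nn.Linear(3 * dim + 2, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, src, dst, edge_type, t_start, t_end):
        feats = torch.cat([
            self.node_emb(src), self.node_emb(dst), self.type_emb(edge_type),
            t_start.unsqueeze(-1), t_end.unsqueeze(-1),  # normalized span endpoints
        ], dim=-1)
        return torch.sigmoid(self.mlp(feats)).squeeze(-1)
```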
no code implementations • 2 Dec 2021 • Haitao Mao, Lun Du, Yujia Zheng, Qiang Fu, Zelin Li, Xu Chen, Shi Han, Dongmei Zhang
They utilize labels from the source domain as the supervision signal and are jointly trained on both the source graph and the target graph.
no code implementations • 30 Nov 2021 • Qiang Fu, Lun Du, Haitao Mao, Xu Chen, Wei Fang, Shi Han, Dongmei Zhang
Regularization can mitigate the generalization gap between training and inference by introducing inductive bias.
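As a concrete, textbook instance of regularization acting as an inductive bias (not the specific regularizer studied in the paper), an explicit L2 penalty added to the task loss biases training toward small weights:

```python
def l2_regularized_loss(task_loss, model, weight_decay=1e-4):
    """Add an L2 penalty over all parameters of a PyTorch model to the task loss."""
    penalty = sum(p.pow(2).sum() for p in model.parameters())
    return task_loss + weight_decay * penalty
```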
1 code implementation • 29 Oct 2021 • Lun Du, Xiaozhou Shi, Qiang Fu, Xiaojun Ma, Hengyu Liu, Shi Han, Dongmei Zhang
For node-level tasks, GNNs have strong power to model the homophily property of graphs (i.e., connected nodes are more similar), while their ability to capture the heterophily property is often questioned.
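Homophily can be made concrete as the fraction of edges whose endpoints share a label; below is that edge-homophily measure, one common definition rather than necessarily the one adopted in the paper.

```python
import numpy as np

def edge_homophily(edge_index, labels):
    """Fraction of edges connecting same-label nodes.
    edge_index: int array of shape [2, num_edges]; labels: int array of shape [num_nodes]."""
    src, dst = edge_index
    return float(np.mean(labels[src] == labels[dst]))

# Values near 1 indicate a homophilous graph; values near 0, a heterophilous one.
```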
1 code implementation • 14 Aug 2021 • Haitao Mao, Xu Chen, Qiang Fu, Lun Du, Shi Han, Dongmei Zhang
Initialization plays a critical role in the training of deep neural networks (DNNs).
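For background, the standard Xavier and Kaiming schemes that serve as common baselines in this area, applied module-by-module in PyTorch; the paper's own initialization method is not reproduced here.

```python
import torch.nn as nn

def init_weights(module):
    """Kaiming init for conv layers (ReLU networks), Xavier init for linear layers."""
    if isinstance(module, nn.Conv2d):
        nn.init.kaiming_normal_(module.weight, nonlinearity="relu")
    elif isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)

# model.apply(init_weights)
```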
1 code implementation • 15 Jul 2021 • Ensheng Shi, Yanlin Wang, Lun Du, Junjie Chen, Shi Han, Hongyu Zhang, Dongmei Zhang, Hongbin Sun
To gain a thorough understanding of how far we are from solving this problem and to provide suggestions for future research, in this paper we conduct a systematic and in-depth analysis of 5 state-of-the-art neural code summarization models on 6 widely used BLEU variants, 4 pre-processing operations and their combinations, and 3 widely used datasets.
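The sensitivity to BLEU variants can be seen directly: the default sentence-level BLEU is a geometric mean of 1- to 4-gram precisions, so a single zero precision collapses the score unless smoothing is applied. A small nltk example (separate from the exact variants compared in the paper):

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["returns", "the", "sum", "of", "two", "numbers"]]
hypothesis = ["compute", "sum", "two", "numbers"]  # no matching 3-gram or 4-gram

raw = sentence_bleu(reference, hypothesis)  # ~0: zero higher-order precisions dominate
smoothed = sentence_bleu(reference, hypothesis,
                         smoothing_function=SmoothingFunction().method4)
print(f"BLEU without smoothing: {raw:.4f}, with smoothing: {smoothed:.4f}")
```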
1 code implementation • 12 Jul 2021 • Wei Tao, Yanlin Wang, Ensheng Shi, Lun Du, Shi Han, Hongyu Zhang, Dongmei Zhang, Wenqiang Zhang
We find that: (1) Different variants of the BLEU metric are used in previous works, which affects the evaluation and understanding of existing methods.
1 code implementation • 10 Jul 2021 • Lun Du, Xiaozhou Shi, Yanlin Wang, Ensheng Shi, Shi Han, Dongmei Zhang
On the other hand, as a specific query may focus on one or several perspectives, it is difficult for a single query representation module to represent different user intents.
no code implementations • 6 Jun 2021 • Lun Du, Fei Gao, Xu Chen, Ran Jia, Junshan Wang, Jiang Zhang, Shi Han, Dongmei Zhang
To simultaneously extract spatial and relational information from tables, we propose a novel neural network architecture, TabularNet.
no code implementations • 17 May 2021 • Lun Du, Xu Chen, Fei Gao, Kunqing Xie, Shi Han, Dongmei Zhang
Network embedding aims to learn a function that maps nodes to Euclidean space, which contributes to multiple learning and analysis tasks on networks.
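For intuition, a minimal DeepWalk-style baseline for network embedding (uniform random walks fed to a skip-gram model); this is a generic illustration of the task, not the method proposed in the paper.

```python
import random
import networkx as nx
from gensim.models import Word2Vec

def deepwalk_embeddings(graph, walk_length=10, walks_per_node=5, dim=64):
    """Learn node embeddings by running skip-gram over uniform random walks."""
    walks = []
    for node in graph.nodes():
        for _ in range(walks_per_node):
            walk = [node]
            while len(walk) < walk_length:
                neighbors = list(graph.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(random.choice(neighbors))
            walks.append([str(n) for n in walk])
    model = Word2Vec(walks, vector_size=dim, window=3, min_count=1, sg=1)
    return {node: model.wv[str(node)] for node in graph.nodes()}

# embeddings = deepwalk_embeddings(nx.karate_club_graph())
```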
no code implementations • 30 Nov 2020 • Xu Chen, Yuanxing Zhang, Lun Du, Zheng Fang, Yi Ren, Kaigui Bian, Kunqing Xie
Further analysis indicates that the locality and globality of the traffic networks are critical to traffic flow prediction and the proposed TSSRGCN model can adapt to the various temporal traffic patterns.
no code implementations • 18 Jan 2020 • Mengyuan Chen, Jiang Zhang, Zhang Zhang, Lun Du, Qiao Hu, Shuo Wang, Jiaqi Zhu
We carried out experiments on discrete and continuous time series data.
1 code implementation • 3 Jun 2019 • Yizhou Zhang, Guojie Song, Lun Du, Shu-wen Yang, Yilun Jin
Recent works reveal that network embedding techniques enable many machine learning models to handle diverse downstream tasks on graph structured data.
no code implementations • 19 Apr 2019 • Junshan Wang, Zhicong Lu, Guojie Song, Yue Fan, Lun Du, Wei Lin
Network embedding is a method to learn low-dimensional representation vectors for nodes in complex networks.