Search Results for author: Kechi Zhang

Found 9 papers, 6 papers with code

FAN: Fourier Analysis Networks

2 code implementations • 3 Oct 2024 • Yihong Dong, Ge Li, Yongding Tao, Xue Jiang, Kechi Zhang, Jia Li, Jing Su, Jun Zhang, Jingjing Xu

Despite the remarkable success achieved by neural networks, particularly those represented by MLP and Transformer, we reveal that they exhibit potential flaws in the modeling and reasoning of periodicity, i.e., they tend to memorize periodic data rather than genuinely understand the underlying principles of periodicity.

Language Modelling • Time Series Forecasting
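The abstract above describes building periodicity into the network itself rather than hoping an MLP memorizes it. As a minimal sketch of that idea (not the paper's exact FAN formulation), a layer can expose an explicit Fourier basis with learned frequencies alongside an ordinary linear part; all names and shapes here are illustrative assumptions:

```python
import math

def fan_layer(x, freq_weights, linear_weights):
    """Hypothetical FAN-style layer for a scalar input: part of the
    output is an explicit Fourier basis (sin/cos of learned
    frequencies), the rest is a plain linear projection.  A periodic
    target can then be represented by the Fourier part directly
    instead of being memorized pointwise."""
    fourier = [f(w * x) for w in freq_weights for f in (math.sin, math.cos)]
    linear = [w * x for w in linear_weights]
    return fourier + linear

# Two learned frequencies plus one linear feature -> 5 output features.
out = fan_layer(0.5, freq_weights=[1.0, 2.0], linear_weights=[0.3])
```

In a real network the frequencies and linear weights would be trainable tensors and `x` a vector, but the structural point is the same: the periodic basis is part of the architecture.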

Dual Latent State Learning: Exploiting Regional Network Similarities for QoS Prediction

no code implementations • 7 Oct 2023 • Ziliang Wang, Xiaohong Zhang, Kechi Zhang, Ze Shi Li, Meng Yan

Individual objects, whether users or services, within a specific region often exhibit similar network states due to their shared origin from the same city or autonomous system (AS).

Self-Edit: Fault-Aware Code Editor for Code Generation

1 code implementation • 6 May 2023 • Kechi Zhang, Zhuo Li, Jia Li, Ge Li, Zhi Jin

Inspired by the process of human programming, we propose a generate-and-edit approach named Self-Edit that utilizes execution results of the generated code from LLMs to improve the code quality on the competitive programming task.

Code Generation • HumanEval
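The generate-and-edit idea in the abstract can be sketched as a small outer loop: run the generated program, and on failure feed the execution feedback to an editing step. This is a hypothetical sketch of that loop, not the paper's implementation; `generate` and `edit` stand in for the LLM generator and the fault-aware editor model:

```python
import subprocess
import sys
import tempfile

def execute_candidate(code: str, test_input: str = "") -> tuple:
    """Run a generated Python program and capture its outcome.
    Execution feedback (stderr on failure, stdout on success) is the
    signal the editor uses - the 'fault-aware' part of Self-Edit."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    proc = subprocess.run([sys.executable, path], input=test_input,
                          capture_output=True, text=True, timeout=10)
    return proc.returncode == 0, proc.stderr or proc.stdout

def self_edit_loop(generate, edit, rounds: int = 2) -> str:
    """Hypothetical outer loop: generate once, then repeatedly execute
    and hand the error message to the editor until the code runs."""
    code = generate()
    for _ in range(rounds):
        ok, feedback = execute_candidate(code)
        if ok:
            break
        code = edit(code, feedback)
    return code
```

For competitive programming the success check would compare stdout against the expected output of sample test cases rather than just the exit code.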

Implant Global and Local Hierarchy Information to Sequence based Code Representation Models

1 code implementation • 14 Mar 2023 • Kechi Zhang, Zhuo Li, Zhi Jin, Ge Li

Furthermore, we propose the Hierarchy Transformer (HiT), a simple but effective sequence model to incorporate the complete hierarchical embeddings of source code into a Transformer model.
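One way to read "incorporate the complete hierarchical embeddings of source code into a Transformer" is that each token's content embedding is combined with an embedding of its position in the syntax hierarchy. The sketch below illustrates that general idea under assumed interfaces (lookup tables keyed by token and tree depth); it is not HiT's actual parameterization:

```python
def hierarchical_embedding(tokens, depths, vocab_emb, depth_emb):
    """Illustrative sketch: sum each token's content embedding with an
    embedding of its depth in the syntax tree, so a plain sequence
    model still sees the code's hierarchical structure.  `vocab_emb`
    and `depth_emb` are hypothetical lookup tables."""
    return [
        [a + b for a, b in zip(vocab_emb[t], depth_emb[d])]
        for t, d in zip(tokens, depths)
    ]
```

The resulting vectors feed into a standard Transformer unchanged, which is what makes this a "simple but effective" sequence-model approach compared to tree- or graph-structured encoders.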

CodeEditor: Learning to Edit Source Code with Pre-trained Models

1 code implementation • 31 Oct 2022 • Jia Li, Ge Li, Zhuo Li, Zhi Jin, Xing Hu, Kechi Zhang, Zhiyi Fu

Such models are first trained on generic pre-training tasks and then fine-tuned on the code editing task.

Language Modelling • Masked Language Modeling

Learning Program Representations with a Tree-Structured Transformer

1 code implementation • 18 Aug 2022 • Wenhan Wang, Kechi Zhang, Ge Li, Shangqing Liu, Anran Li, Zhi Jin, Yang Liu

Learning vector representations for programs is a critical step in applying deep learning techniques for program understanding tasks.

Representation Learning

What does Transformer learn about source code?

no code implementations • 18 Jul 2022 • Kechi Zhang, Ge Li, Zhi Jin

In the field of source code processing, transformer-based representation models have shown great power and achieved state-of-the-art (SOTA) performance on many tasks.

Variable misuse

Learning to Represent Programs with Heterogeneous Graphs

no code implementations • 8 Dec 2020 • Kechi Zhang, Wenhan Wang, Huangzhao Zhang, Ge Li, Zhi Jin

To exploit node and edge type information, we bring the idea of heterogeneous graphs to learning on source code and present a new approach to building heterogeneous program graphs from ASTs, with additional type information attached to nodes and edges.

Code Comment Generation • Comment Generation
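The core construction described above (a program graph from an AST where both nodes and edges carry types) can be sketched with Python's own `ast` module. This is a minimal illustration, not the paper's graph schema: it keeps only one edge type (`"child"`), whereas the paper defines richer node and edge types:

```python
import ast

def heterogeneous_graph(source: str):
    """Build a tiny heterogeneous program graph from Python source:
    each node is labeled with its syntactic type (the AST class name)
    and each edge carries a relation type.  A fuller implementation
    would add more relations, e.g. data-flow or next-sibling edges."""
    tree = ast.parse(source)
    nodes, edges = {}, []
    for parent in ast.walk(tree):
        nodes[id(parent)] = type(parent).__name__      # typed node
        for child in ast.iter_child_nodes(parent):
            edges.append((id(parent), "child", id(child)))  # typed edge
    return nodes, edges
```

A heterogeneous graph neural network would then use separate message-passing parameters per node/edge type instead of treating the graph as homogeneous.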
