Search Results for author: Zhi Jin

Found 66 papers, 23 papers with code

Convolutional Neural Networks over Tree Structures for Programming Language Processing

8 code implementations 18 Sep 2014 Lili Mou, Ge Li, Lu Zhang, Tao Wang, Zhi Jin

Programming language processing (similar to natural language processing) is a hot research topic in the field of software engineering; it has also aroused growing interest in the artificial intelligence community.

Sentence

TACO: Topics in Algorithmic COde generation dataset

1 code implementation 22 Dec 2023 Rongao Li, Jie Fu, Bo-Wen Zhang, Tao Huang, Zhihong Sun, Chen Lyu, Guang Liu, Zhi Jin, Ge Li

Moreover, each TACO problem includes several fine-grained labels such as task topics, algorithms, programming skills, and difficulty levels, providing a more precise reference for the training and evaluation of code generation models.

Code Generation
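As a rough illustration of the fine-grained labels mentioned in the TACO entry above, a problem record might look like the sketch below. The field names and values are hypothetical and may differ from the released dataset schema.

```python
# Purely illustrative record: topics, algorithms, skills, and difficulty labels
# attached to one algorithmic code-generation problem.  Field names are assumptions.
taco_style_problem = {
    "question": "Given an array of integers, return the length of the longest increasing subsequence.",
    "tags": ["Dynamic Programming"],              # task topic / algorithm labels
    "skill_types": ["Sorting", "Binary Search"],  # programming skills
    "difficulty": "MEDIUM",
    "solutions": ["def lis(nums):\n    ..."],     # reference solutions (truncated)
}
```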

Building Program Vector Representations for Deep Learning

1 code implementation 11 Sep 2014 Lili Mou, Ge Li, Yuxuan Liu, Hao Peng, Zhi Jin, Yan Xu, Lu Zhang

In this pioneering paper, we propose the "coding criterion" to build program vector representations, which are the premise of deep learning for program analysis.

Representation Learning

FourLLIE: Boosting Low-Light Image Enhancement by Fourier Frequency Information

1 code implementation 6 Aug 2023 Chenxi Wang, Hongjun Wu, Zhi Jin

In the first stage, we improve the lightness of low-light images by estimating the amplitude transform map in the Fourier space.

Low-Light Image Enhancement
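The Fourier-space brightening step named in the FourLLIE entry above can be sketched in a few lines of NumPy. This is a minimal illustration assuming a user-supplied amplitude gain; in the paper the amplitude transform map is estimated by a network.

```python
import numpy as np

def brighten_via_fourier_amplitude(img, gain):
    """Scale the Fourier amplitude of a single-channel image while keeping its phase.

    Minimal sketch of the operation described in the abstract: FourLLIE *learns*
    the amplitude transform map in its first stage, whereas here `gain` is just
    a user-supplied scalar (or an array of the same spectral shape).
    """
    spectrum = np.fft.fft2(img)
    amplitude, phase = np.abs(spectrum), np.angle(spectrum)
    adjusted = (amplitude * gain) * np.exp(1j * phase)   # brightened amplitude, original phase
    return np.real(np.fft.ifft2(adjusted))
```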

MB-TaylorFormer: Multi-branch Efficient Transformer Expanded by Taylor Formula for Image Dehazing

1 code implementation ICCV 2023 Yuwei Qiu, Kaihao Zhang, Chenxi Wang, Wenhan Luo, Hongdong Li, Zhi Jin

To address this issue, we propose a new Transformer variant, which applies the Taylor expansion to approximate the softmax-attention and achieves linear computational complexity.

Image Dehazing
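The core idea in the MB-TaylorFormer entry above, replacing softmax attention with a first-order Taylor expansion so the attention matrix is never materialized, can be sketched as follows. This is a toy illustration of the linearization only, not the paper's full model (which adds multi-branch convolutions and further refinement).

```python
import numpy as np

def taylor_linear_attention(Q, K, V):
    """Softmax attention with exp(q.k) replaced by its first-order Taylor expansion 1 + q.k.

    The (N x N) attention matrix is never formed, so the cost is linear in the
    sequence length N.  Q, K have shape (N, d); V has shape (N, d_v).
    """
    n = K.shape[0]
    kv = K.T @ V                        # (d, d_v): shared by every query
    k_sum = K.sum(axis=0)               # (d,)
    numerator = V.sum(axis=0) + Q @ kv  # (N, d_v): sum_j (1 + q.k_j) v_j
    denominator = n + Q @ k_sum         # (N,):    sum_j (1 + q.k_j)
    return numerator / denominator[:, None]
```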

Detecting Code Clones with Graph Neural Network and Flow-Augmented Abstract Syntax Tree

1 code implementation 20 Feb 2020 Wenhan Wang, Ge Li, Bo Ma, Xin Xia, Zhi Jin

To the best of our knowledge, we are the first to apply graph neural networks to the domain of code clone detection.

Clone Detection

Code Generation as a Dual Task of Code Summarization

2 code implementations NeurIPS 2019 Bolin Wei, Ge Li, Xin Xia, Zhiyi Fu, Zhi Jin

Code summarization (CS) and code generation (CG) are two crucial tasks in the field of automatic software development.

Code Generation Code Summarization +1

Estimating Human Weight from A Single Image

1 code implementation IEEE Transactions on Multimedia 2022 Zhi Jin, Junjia Huang, Wenjin Wang, Aolin Xiong, Xiaojun Tan

In this case, the widely used Body Mass Index (BMI), which is derived from body height and weight, can be employed as a measure of weight to indicate health conditions.
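For reference, BMI is simply weight in kilograms divided by the square of height in metres:

```python
def bmi(weight_kg, height_m):
    """Body Mass Index: weight (kg) divided by the square of height (m)."""
    return weight_kg / height_m ** 2

print(round(bmi(70, 1.75), 1))  # 70 kg at 1.75 m -> 22.9
```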

Retrieve and Refine: Exemplar-based Neural Comment Generation

1 code implementation 9 Oct 2020 Bolin Wei, Yongmin Li, Ge Li, Xin Xia, Zhi Jin

Inspired by the IR-based and template-based approaches, in this paper, we propose a neural comment generation approach where we use the existing comments of similar code snippets as exemplars to guide comment generation.

Code Comment Generation Comment Generation +4
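The retrieval step described in the Retrieve and Refine entry above can be sketched with a naive lexical-similarity search. The corpus contents below are made up, and the paper's neural "refine" stage is omitted entirely.

```python
from difflib import SequenceMatcher

def retrieve_exemplar_comment(query_code, corpus):
    """Return the comment of the corpus snippet most lexically similar to the query.

    A naive stand-in for the retrieval step: the paper retrieves a similar code
    snippet and then lets a neural model refine its comment, which is not shown here.
    """
    best = max(corpus, key=lambda item: SequenceMatcher(None, query_code, item["code"]).ratio())
    return best["comment"]

corpus = [{"code": "def add(a, b): return a + b", "comment": "Return the sum of two numbers."}]
print(retrieve_exemplar_comment("def plus(x, y): return x + y", corpus))
```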

Color Image Demosaicking Using a 3-Stage Convolutional Neural Network Structure

1 code implementation 7 Oct 2018 Kai Cui, Zhi Jin, Eckehard Steinbach

Color demosaicking (CDM) is a critical first step for the acquisition of high-quality RGB images with single chip cameras.

Demosaicking

Compressing Neural Language Models by Sparse Word Representations

1 code implementation ACL 2016 Yunchuan Chen, Lili Mou, Yan Xu, Ge Li, Zhi Jin

Such approaches are time- and memory-intensive because of the large numbers of parameters for word embeddings and the output layer.

Language Modelling Word Embeddings

Fine-Tuning Pre-Trained Language Models Effectively by Optimizing Subnetworks Adaptively

1 code implementation 3 Nov 2022 Haojie Zhang, Ge Li, Jia Li, Zhongjin Zhang, Yuqi Zhu, Zhi Jin

Large-scale pre-trained language models have achieved impressive results on a wide range of downstream tasks recently.

Language Modelling

Implant Global and Local Hierarchy Information to Sequence based Code Representation Models

1 code implementation 14 Mar 2023 Kechi Zhang, Zhuo Li, Zhi Jin, Ge Li

Furthermore, we propose the Hierarchy Transformer (HiT), a simple but effective sequence model to incorporate the complete hierarchical embeddings of source code into a Transformer model.

Dynamic Implicit Image Function for Efficient Arbitrary-Scale Image Representation

1 code implementation 21 Jun 2023 Zongyao He, Zhi Jin

We further propose a Coarse-to-Fine Multilayer Perceptron (C2F-MLP) to perform decoding with dynamic coordinate slicing, where the number of coordinates in each slice varies as the scale factor varies.

Computational Efficiency Super-Resolution

CodeEditor: Learning to Edit Source Code with Pre-trained Models

1 code implementation 31 Oct 2022 Jia Li, Ge Li, Zhuo Li, Zhi Jin, Xing Hu, Kechi Zhang, Zhiyi Fu

Pre-trained models are first pre-trained with pre-training tasks and fine-tuned with the code editing task.

Language Modelling Masked Language Modeling

CodePAD: Sequence-based Code Generation with Pushdown Automaton

1 code implementation 2 Nov 2022 Yihong Dong, Xue Jiang, Yuchen Liu, Ge Li, Zhi Jin

CodePAD can leverage existing sequence-based models, and we show that it can achieve a 100% grammatical correctness rate on these benchmark datasets.

Code Generation Text Generation
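To give a feel for pushdown-automaton-constrained decoding as named in the CodePAD entry above, here is a toy sketch where a bracket stack vetoes ungrammatical tokens. CodePAD derives a real PDA from the programming language's grammar; this only illustrates the filtering idea.

```python
def pda_allows(stack, token):
    """Toy pushdown check for a balanced-bracket 'grammar'."""
    closers = {")": "(", "]": "["}
    if token in closers:
        return bool(stack) and stack[-1] == closers[token]
    return True  # openers and ordinary tokens are always admissible in this toy grammar

def constrained_greedy_decode(candidates_per_step):
    """At each step, take the highest-scoring candidate token the PDA admits."""
    stack, output = [], []
    for candidates in candidates_per_step:       # each: list of (token, score), best first
        for token, _score in candidates:
            if pda_allows(stack, token):
                if token in "([":
                    stack.append(token)
                elif token in ")]":
                    stack.pop()
                output.append(token)
                break
    return output

# e.g. constrained_greedy_decode([[("(", 0.9)], [(")", 0.8), ("x", 0.1)]]) -> ["(", ")"]
```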

Generalization or Memorization: Data Contamination and Trustworthy Evaluation for Large Language Models

1 code implementation 24 Feb 2024 Yihong Dong, Xue Jiang, Huanyu Liu, Zhi Jin, Ge Li

CDD requires only the sampled texts to detect data contamination, by identifying the peakedness of the LLM's output distribution.

Memorization
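A crude sketch of the peakedness intuition in the entry above: if an LLM's samples for a benchmark prompt collapse onto (near-)identical text, the example was likely seen during training. The paper's actual statistic is defined differently; this only approximates the idea, and the threshold is invented.

```python
from collections import Counter

def peakedness(samples):
    """Fraction of sampled generations that coincide with the most frequent one."""
    most_common_count = Counter(samples).most_common(1)[0][1]
    return most_common_count / len(samples)

# flag a prompt as likely contaminated if peakedness(samples) exceeds a chosen threshold, e.g. 0.8
```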

ZC3: Zero-Shot Cross-Language Code Clone Detection

1 code implementation 26 Aug 2023 Chongyang Tao, Zhi Jin, Fang Liu, Jia Li, Ge Li

In this paper, we propose a novel method named ZC3 for Zero-shot Cross-language Code Clone detection.

Clone Detection Language Modelling

Hot or Cold? Adaptive Temperature Sampling for Code Generation with Large Language Models

1 code implementation 6 Sep 2023 Yuqi Zhu, Ge Li, YunFei Zhao, Jia Li, Zhi Jin, Hong Mei

With an analysis of loss distributions of code tokens, we find that code tokens can be divided into two categories: challenging tokens that are difficult to predict and confident tokens that can be easily inferred.

Code Generation
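The two-category intuition in the adaptive temperature sampling entry above (confident vs. challenging tokens) can be sketched as below. The confidence measure, thresholds, and temperatures are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sample_with_adaptive_temperature(logits, t_confident=0.2, t_challenging=1.0, threshold=0.7):
    """Sample one token, choosing the temperature from the model's own confidence.

    If the top probability (at temperature 1) is high, the token is treated as
    'confident' and sampled sharply; otherwise a higher temperature keeps exploring.
    """
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    t = t_confident if probs.max() >= threshold else t_challenging
    scaled = np.exp(logits / t - (logits / t).max())
    scaled /= scaled.sum()
    return int(np.random.choice(len(logits), p=scaled))
```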

Learning Program Representations with a Tree-Structured Transformer

1 code implementation 18 Aug 2022 Wenhan Wang, Kechi Zhang, Ge Li, Shangqing Liu, Anran Li, Zhi Jin, Yang Liu

Learning vector representations for programs is a critical step in applying deep learning techniques for program understanding tasks.

Representation Learning

Compression of phase-only holograms with JPEG standard and deep learning

no code implementations 11 Jun 2018 Shuming Jiao, Zhi Jin, Chenliang Chang, Changyuan Zhou, Wenbin Zou, Xia Li

It is a critical issue to reduce the enormous amount of data in the processing, storage and transmission of a hologram in digital format.

Coupling Distributed and Symbolic Execution for Natural Language Queries

no code implementations ICML 2017 Lili Mou, Zhengdong Lu, Hang Li, Zhi Jin

Building neural networks to query a knowledge base (a table) with natural language is an emerging research topic in deep learning.

Natural Language Queries

How Transferable are Neural Networks in NLP Applications?

no code implementations EMNLP 2016 Lili Mou, Zhao Meng, Rui Yan, Ge Li, Yan Xu, Lu Zhang, Zhi Jin

Transfer learning aims to make use of valuable knowledge in a source domain to help model performance in a target domain.

Transfer Learning

Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation

no code implementations COLING 2016 Yan Xu, Ran Jia, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin

However, existing neural networks for relation classification are usually of shallow architectures (e.g., one-layer convolutional neural networks or recurrent networks).

Classification Data Augmentation +3

Distilling Word Embeddings: An Encoding Approach

no code implementations 15 Jun 2015 Lili Mou, Ran Jia, Yan Xu, Ge Li, Lu Zhang, Zhi Jin

Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly in need in various resource-restricted systems.

Word Embeddings

Backward and Forward Language Modeling for Constrained Sentence Generation

no code implementations 21 Dec 2015 Lili Mou, Rui Yan, Ge Li, Lu Zhang, Zhi Jin

Provided a specific word, we use RNNs to generate previous words and future words, either simultaneously or asynchronously, resulting in two model variants.

Language Modelling Machine Translation +4

On End-to-End Program Generation from User Intention by Deep Neural Networks

no code implementations 25 Oct 2015 Lili Mou, Rui Men, Ge Li, Lu Zhang, Zhi Jin

This paper envisions an end-to-end program generation scenario using recurrent neural networks (RNNs): Users can express their intention in natural language; an RNN then automatically generates corresponding code in a character-by-character fashion.

A Comparative Study on Regularization Strategies for Embedding-based Neural Networks

no code implementations EMNLP 2015 Hao Peng, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin

This paper aims to compare different regularization strategies to address a common phenomenon, severe overfitting, in embedding-based neural networks for NLP.

Solving Pictorial Jigsaw Puzzle by Stigmergy-inspired Internet-based Human Collective Intelligence

no code implementations 28 Nov 2018 Bo Shen, Wei zhang, Haiyan Zhao, Zhi Jin, Yanhong Wu

Through feedback, each player is provided with personalized feedback based on the current COG and the player's exploration result, in order to accelerate his or her puzzle-solving process.

A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning

no code implementations 16 Sep 2019 Fang Liu, Ge Li, Bolin Wei, Xin Xia, Zhiyi Fu, Zhi Jin

To enable the knowledge sharing between related tasks, we creatively propose a Multi-Task Learning (MTL) framework to learn two related tasks in code completion jointly.

Code Completion Language Modelling +1

Towards Full-line Code Completion with Neural Language Models

no code implementations 18 Sep 2020 Wenhan Wang, Sijie Shen, Ge Li, Zhi Jin

In this paper, we take a further step and discuss the possibility of directly completing a whole line of code instead of a single token.

Code Completion

Learning to Represent Programs with Heterogeneous Graphs

no code implementations 8 Dec 2020 Kechi Zhang, Wenhan Wang, Huangzhao Zhang, Ge Li, Zhi Jin

To address the information of node and edge types, we bring the idea of heterogeneous graphs to learning on source code and present a new formula of building heterogeneous program graphs from ASTs with additional type information for nodes and edges.

Code Comment Generation Comment Generation
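The basic construction named in the entry above, building a typed program graph from an AST, can be sketched with Python's standard ast module. Only a single "child" edge type is shown; the paper's heterogeneous program graphs add further node and edge types.

```python
import ast

def ast_to_typed_graph(source):
    """Build (typed nodes, typed edges) from a Python AST.

    Node type = AST class name; every parent-child link becomes a 'child' edge.
    """
    tree = ast.parse(source)
    node_types, edges, index = [], [], {}
    for node in ast.walk(tree):
        index[id(node)] = len(node_types)
        node_types.append(type(node).__name__)
    for node in ast.walk(tree):
        for child in ast.iter_child_nodes(node):
            edges.append((index[id(node)], index[id(child)], "child"))
    return node_types, edges

nodes, edges = ast_to_typed_graph("def add(a, b):\n    return a + b")
```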

Massive Self-Assembly in Grid Environments

no code implementations 5 Feb 2021 Wenjie Chu, Wei zhang, Haiyan Zhao, Zhi Jin, Hong Mei

Self-assembly plays an essential role in many natural processes, involving the formation and evolution of living or non-living structures, and shows potential applications in many emerging domains.

Multiagent Systems Distributed, Parallel, and Cluster Computing Robotics

Precise Learning of Source Code Contextual Semantics via Hierarchical Dependence Structure and Graph Attention Networks

no code implementations 20 Nov 2021 Zhehao Zhao, Bo Yang, Ge Li, Huai Liu, Zhi Jin

Based on that, we also designed a neural network that relies on the graph attention mechanism. Specifically, we introduced the syntactic structure of the basic block, i.e., its corresponding AST, into the source code model to provide sufficient information and fill the gap.

Feature Engineering Graph Attention

STVGFormer: Spatio-Temporal Video Grounding with Static-Dynamic Cross-Modal Understanding

no code implementations 6 Jul 2022 Zihang Lin, Chaolei Tan, Jian-Fang Hu, Zhi Jin, Tiancai Ye, Wei-Shi Zheng

The static branch performs cross-modal understanding in a single frame and learns to localize the target object spatially according to intra-frame visual cues like object appearances.

Spatio-Temporal Video Grounding Video Grounding

What does Transformer learn about source code?

no code implementations 18 Jul 2022 Kechi Zhang, Ge Li, Zhi Jin

In the field of source code processing, transformer-based representation models have proven highly powerful and have achieved state-of-the-art (SOTA) performance in many tasks.

Variable misuse

Antecedent Predictions Are More Important Than You Think: An Effective Method for Tree-Based Code Generation

no code implementations 22 Aug 2022 Yihong Dong, Ge Li, Xue Jiang, Zhi Jin

To evaluate the effectiveness of our proposed loss, we implement and train an Antecedent Prioritized Tree-based code generation model called APT.

Code Generation Position

Poison Attack and Defense on Deep Source Code Processing Models

no code implementations 31 Oct 2022 Jia Li, Zhuo Li, Huangzhao Zhang, Ge Li, Zhi Jin, Xing Hu, Xin Xia

The attackers aim to inject insidious backdoors into models by poisoning the training data with poison samples.

Clone Detection Code Repair +1

Self-Edit: Fault-Aware Code Editor for Code Generation

no code implementations 6 May 2023 Kechi Zhang, Zhuo Li, Jia Li, Ge Li, Zhi Jin

Inspired by the process of human programming, we propose a generate-and-edit approach named Self-Edit that utilizes execution results of the generated code from LLMs to improve the code quality on the competitive programming task.

Code Generation
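A rough sketch of one generate-and-edit round in the spirit of the Self-Edit entry above: run the generated program on an example test, capture the execution feedback, and regenerate. The `generate` callable is a hypothetical placeholder for any LLM call; the actual system trains a dedicated fault-aware editor rather than simply re-prompting.

```python
import subprocess, sys, tempfile

def run_candidate(code, test_input):
    """Run a generated program on one example input and capture stdout/stderr."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    proc = subprocess.run([sys.executable, path], input=test_input,
                          capture_output=True, text=True, timeout=5)
    return proc.stdout, proc.stderr

def generate_then_edit(generate, prompt, test_input, expected):
    """One generate-and-edit round using execution feedback (illustrative only)."""
    code = generate(prompt)
    out, err = run_candidate(code, test_input)
    if err or out.strip() != expected.strip():
        feedback = (f"The program failed on the example test.\n"
                    f"stderr: {err}\nexpected: {expected}\ngot: {out}\nPlease fix the code.")
        code = generate(prompt + "\n" + feedback)
    return code
```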

Structured Chain-of-Thought Prompting for Code Generation

no code implementations 11 May 2023 Jia Li, Ge Li, Yongmin Li, Zhi Jin

In this paper, we propose Structured CoTs (SCoTs) and present a novel prompting technique for code generation, named SCoT prompting.

Code Generation Text Generation
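A hedged illustration of what a structured chain-of-thought prompt in the sense of the SCoT entry above might look like: the intermediate reasoning is written with program structures (sequence, branch, loop) before the final code is requested. The exact wording used in the paper may differ.

```python
# Illustrative SCoT-style prompt; the reasoning steps use loop/branch structures.
SCOT_PROMPT = """Write a Python function that returns the maximum value in a non-empty list.

Structured chain of thought:
Input: a list nums
1. best = nums[0]                 # sequence
2. for each x in nums:            # loop structure
3.     if x > best:               # branch structure
4.         best = x
5. return best

Now implement the function:
"""
```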

Collaborative Static and Dynamic Vision-Language Streams for Spatio-Temporal Video Grounding

no code implementations CVPR 2023 Zihang Lin, Chaolei Tan, Jian-Fang Hu, Zhi Jin, Tiancai Ye, Wei-Shi Zheng

The static stream performs cross-modal understanding in a single frame and learns to attend to the target object spatially according to intra-frame visual cues like object appearances.

Object Spatio-Temporal Video Grounding +1

PlugMed: Improving Specificity in Patient-Centered Medical Dialogue Generation using In-Context Learning

no code implementations 19 May 2023 Chengfeng Dou, Zhi Jin, Wenping Jiao, Haiyan Zhao, Zhenwei Tao, Yongqiang Zhao

PlugMed is equipped with two modules, the prompt generation (PG) module and the response ranking (RR) module, to enhance LLMs' dialogue strategies and improve the specificity of the dialogue.

Common Sense Reasoning Dialogue Generation +2

EvEval: A Comprehensive Evaluation of Event Semantics for Large Language Models

no code implementations 24 May 2023 Zhengwei Tao, Zhi Jin, Xiaoying Bai, Haiyan Zhao, Yanlin Feng, Jia Li, Wenpeng Hu

In this paper, we propose an overarching framework for event semantic processing, encompassing understanding, reasoning, and prediction, along with their fine-grained aspects.

Low-Light Enhancement in the Frequency Domain

no code implementations 29 Jun 2023 Hao Chen, Zhi Jin

Hence, in this work, we propose a novel residual recurrent multi-wavelet convolutional neural network, R2-MWCNN, learned in the frequency domain, which can simultaneously increase image contrast and suppress noise.

Image Enhancement object-detection +1

Brighten-and-Colorize: A Decoupled Network for Customized Low-Light Image Enhancement

no code implementations 6 Aug 2023 Chenxi Wang, Zhi Jin

The colorization sub-task is accomplished by regarding the chrominance of the low-light image as color guidance, as in user-guided image colorization.

Colorization Image Colorization +2

PACE: Improving Prompt with Actor-Critic Editing for Large Language Model

no code implementations 19 Aug 2023 Yihong Dong, Kangcheng Luo, Xue Jiang, Zhi Jin, Ge Li

Large language models (LLMs) have showcased remarkable potential across various tasks by conditioning on prompts.

Language Modelling Large Language Model

EditSum: A Retrieve-and-Edit Framework for Source Code Summarization

no code implementations 26 Aug 2023 Jia Li, Yongmin Li, Ge Li, Xing Hu, Xin Xia, Zhi Jin

Besides the patternized words, a code summary also contains important keywords, which are the key to reflecting the functionality of the code.

Code Summarization Informativeness +1

Large Language Model-Aware In-Context Learning for Code Generation

no code implementations 15 Oct 2023 Ge Li, Chongyang Tao, Jia Li, Huangzhao Zhang, Fang Liu, Zhi Jin

Large language models (LLMs) have shown impressive in-context learning (ICL) ability in code generation.

Code Generation Contrastive Learning +3

Enhancing the Spatial Awareness Capability of Multi-Modal Large Language Model

no code implementations 31 Oct 2023 Yongqiang Zhao, Zhenyu Li, Zhi Jin, Feng Zhang, Haiyan Zhao, Chengfeng Dou, Zhengwei Tao, Xinhai Xu, Donghong Liu

The Multi-Modal Large Language Model (MLLM) refers to an extension of the Large Language Model (LLM) equipped with the capability to receive and infer multi-modal data.

Autonomous Driving Language Modelling +1

ChatCoder: Chat-based Refine Requirement Improves LLMs' Code Generation

no code implementations 1 Nov 2023 Zejun Wang, Jia Li, Ge Li, Zhi Jin

To help human users refine their requirements and improve large language models' code generation performances, we propose ChatCoder: a method to refine the requirements via chatting with large language models.

Code Generation

Integrating Physician Diagnostic Logic into Large Language Models: Preference Learning from Process Feedback

no code implementations 11 Jan 2024 Chengfeng Dou, Zhi Jin, Wenpin Jiao, Haiyan Zhao, Yongqiang Zhao, Zhenwei Tao

The use of large language models in medical dialogue generation has garnered significant attention, with a focus on improving response quality and fluency.

Dialogue Generation

DevEval: Evaluating Code Generation in Practical Software Projects

no code implementations 12 Jan 2024 Jia Li, Ge Li, YunFei Zhao, Yongmin Li, Zhi Jin, Hao Zhu, Huanyu Liu, Kaibo Liu, Lecheng Wang, Zheng Fang, Lanshen Wang, Jiazheng Ding, Xuanming Zhang, Yihong Dong, Yuqi Zhu, Bin Gu, Mengfei Yang

Compared to previous benchmarks, DevEval aligns with practical projects in multiple dimensions, e.g., real program distributions, sufficient dependencies, and sufficiently large project contexts.

Code Generation

SEED: Customize Large Language Models with Sample-Efficient Adaptation for Code Generation

no code implementations 29 Feb 2024 Xue Jiang, Yihong Dong, Zhi Jin, Ge Li

Specifically, SEED involves identifying error code generated by LLMs, employing Self-revise for code revision, optimizing the model with revised code, and iteratively adapting the process for continuous improvement.

Code Generation
