Search Results for author: Zhi Jin

Found 30 papers, 8 papers with code

Massive Self-Assembly in Grid Environments

no code implementations 5 Feb 2021 Wenjie Chu, Wei Zhang, Haiyan Zhao, Zhi Jin, Hong Mei

Self-assembly plays an essential role in many natural processes, involving the formation and evolution of living or non-living structures, and shows potential applications in many emerging domains.

Multiagent Systems · Distributed, Parallel, and Cluster Computing · Robotics

Learning to Represent Programs with Heterogeneous Graphs

no code implementations 8 Dec 2020 Wenhan Wang, Kechi Zhang, Ge Li, Zhi Jin

To make use of node and edge type information, we bring the idea of heterogeneous graphs to learning on source code and present a new formulation for building heterogeneous program graphs from ASTs with additional type information for nodes and edges.

Code Comment Generation
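A minimal sketch of the graph-building idea described above, assuming Python's built-in ast module and a simple Child/NextSibling edge set; the paper's exact graph construction and type vocabulary are not reproduced here.

# Hedged sketch: build a typed ("heterogeneous") program graph from a Python
# AST, keeping a type label on every node and edge. The Child/NextSibling
# edge types below are illustrative assumptions.
import ast

def build_heterogeneous_graph(source: str):
    tree = ast.parse(source)
    nodes = {}   # node id -> node type name
    edges = []   # (source id, target id, edge type)
    for node in ast.walk(tree):
        nodes[id(node)] = type(node).__name__
        children = list(ast.iter_child_nodes(node))
        for child in children:
            edges.append((id(node), id(child), "Child"))
        for left, right in zip(children, children[1:]):
            edges.append((id(left), id(right), "NextSibling"))
    return nodes, edges

nodes, edges = build_heterogeneous_graph("def add(a, b):\n    return a + b\n")
print(len(nodes), "typed nodes,", len(edges), "typed edges")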

Retrieve and Refine: Exemplar-based Neural Comment Generation

1 code implementation 9 Oct 2020 Bolin Wei, Yongmin Li, Ge Li, Xin Xia, Zhi Jin

Inspired by the IR-based and template-based approaches, in this paper, we propose a neural comment generation approach where we use the existing comments of similar code snippets as exemplars to guide comment generation.

Code Comment Generation · Information Retrieval +2
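A rough sketch of the retrieval step only, using token-level Jaccard similarity as a stand-in assumption for the paper's retriever; the neural refinement model is omitted.

# Hedged sketch: return the comment of the most similar existing snippet as
# an exemplar. The tokenizer and Jaccard retriever are assumptions.
def tokens(code):
    return set(code.replace("(", " ").replace(")", " ").replace(":", " ").split())

def retrieve_exemplar(query_code, corpus):
    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 0.0
    q = tokens(query_code)
    _, best_comment = max(corpus, key=lambda pair: jaccard(q, tokens(pair[0])))
    return best_comment

corpus = [
    ("def add(a, b): return a + b", "Add two numbers."),
    ("def read_file(path): return open(path).read()", "Read a file into a string."),
]
print(retrieve_exemplar("def total(x, y): return x + y", corpus))  # "Add two numbers."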

Towards Full-line Code Completion with Neural Language Models

no code implementations 18 Sep 2020 Wenhan Wang, Sijie Shen, Ge Li, Zhi Jin

In this paper, we take a further step and explore the possibility of directly completing a whole line of code instead of a single token.

Code Completion
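A toy illustration of the decoding idea: keep predicting tokens until an end-of-line token appears, rather than stopping after one token. The greedy bigram table is a stand-in assumption for the paper's neural language model.

# Hedged sketch of the full-line decoding loop; NEXT_TOKEN is a toy stand-in
# for a trained language model.
NEXT_TOKEN = {
    "return": "a",
    "a": "+",
    "+": "b",
    "b": "<eol>",
}

def complete_line(prefix_token, max_tokens=10):
    line, token = [], prefix_token
    for _ in range(max_tokens):
        token = NEXT_TOKEN.get(token, "<eol>")
        if token == "<eol>":
            break
        line.append(token)
    return line

print(complete_line("return"))  # ['a', '+', 'b']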

Detecting Code Clones with Graph Neural Network and Flow-Augmented Abstract Syntax Tree

1 code implementation 20 Feb 2020 Wenhan Wang, Ge Li, Bo Ma, Xin Xia, Zhi Jin

To the best of our knowledge, we are the first to apply graph neural networks to the domain of code clone detection.

Clone Detection
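A hedged sketch of what "flow augmentation" can look like on a Python AST: ordinary Child edges plus NextStmt edges between consecutive statements. The paper's full edge set and the graph neural network itself are not shown.

# Hedged sketch: add NextStmt edges linking consecutive statements in each
# block on top of ordinary AST child edges. Edge names are assumptions.
import ast

def flow_augmented_edges(source: str):
    tree = ast.parse(source)
    edges = []
    for node in ast.walk(tree):
        for child in ast.iter_child_nodes(node):
            edges.append((type(node).__name__, type(child).__name__, "Child"))
        body = getattr(node, "body", None)
        if isinstance(body, list):
            for a, b in zip(body, body[1:]):
                edges.append((type(a).__name__, type(b).__name__, "NextStmt"))
    return edges

edges = flow_augmented_edges("def f(x):\n    y = x + 1\n    return y\n")
print([e for e in edges if e[2] == "NextStmt"])  # [('Assign', 'Return', 'NextStmt')]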

Code Generation as a Dual Task of Code Summarization

3 code implementations NeurIPS 2019 Bolin Wei, Ge Li, Xin Xia, Zhiyi Fu, Zhi Jin

Code summarization (CS) and code generation (CG) are two crucial tasks in the field of automatic software development.

Code Generation · Code Summarization +1

A Self-Attentional Neural Architecture for Code Completion with Multi-Task Learning

no code implementations 16 Sep 2019 Fang Liu, Ge Li, Bolin Wei, Xin Xia, Zhiyi Fu, Zhi Jin

To enable knowledge sharing between related tasks, we propose a Multi-Task Learning (MTL) framework to learn two related code completion tasks jointly.

Code Completion · Language Modelling +1
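A minimal PyTorch sketch of the multi-task idea: a shared encoder feeds two task heads trained with a summed loss. The two tasks chosen here (next node type and next node value), the layer sizes, and the LSTM encoder are illustrative assumptions, not the paper's exact architecture.

# Hedged sketch of a multi-task code completion model with a shared encoder.
import torch
import torch.nn as nn

class MultiTaskCompleter(nn.Module):
    def __init__(self, vocab=1000, n_types=50, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)  # shared layers
        self.type_head = nn.Linear(dim, n_types)             # task 1: node type
        self.value_head = nn.Linear(dim, vocab)              # task 2: node value

    def forward(self, tokens):
        hidden, _ = self.encoder(self.embed(tokens))
        last = hidden[:, -1]
        return self.type_head(last), self.value_head(last)

model = MultiTaskCompleter()
tokens = torch.randint(0, 1000, (8, 20))                     # a toy batch
type_logits, value_logits = model(tokens)
loss = nn.functional.cross_entropy(type_logits, torch.randint(0, 50, (8,))) + \
       nn.functional.cross_entropy(value_logits, torch.randint(0, 1000, (8,)))
loss.backward()                                              # joint update of both tasks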

Solving Pictorial Jigsaw Puzzle by Stigmergy-inspired Internet-based Human Collective Intelligence

no code implementations 28 Nov 2018 Bo Shen, Wei Zhang, Haiyan Zhao, Zhi Jin, Yanhong Wu

Through feedback, each player is provided with personalized information based on the current COG and the player's exploration result, in order to accelerate his or her puzzle-solving process.

Color Image Demosaicking Using a 3-Stage Convolutional Neural Network Structure

1 code implementation 7 Oct 2018 Kai Cui, Zhi Jin, Eckehard Steinbach

Color demosaicking (CDM) is a critical first step for the acquisition of high-quality RGB images with single chip cameras.

Demosaicking
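A short sketch of the problem setup rather than the paper's network: a single-chip camera records one color sample per pixel on a Bayer pattern (an RGGB layout is assumed here), and demosaicking must reconstruct the two missing channels at every pixel.

# Hedged sketch: simulate the raw single-channel measurement a single-chip
# camera would produce from a full RGB image (RGGB pattern assumed).
import numpy as np

def bayer_mosaic(rgb):
    """Keep only the RGGB-sampled values of an H x W x 3 image."""
    mosaic = np.zeros(rgb.shape[:2], dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B
    return mosaic

rgb = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
print(bayer_mosaic(rgb))  # 4 x 4 single-channel raw measurements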

Compression of phase-only holograms with JPEG standard and deep learning

no code implementations 11 Jun 2018 Shuming Jiao, Zhi Jin, Chenliang Chang, Changyuan Zhou, Wenbin Zou, Xia Li

Reducing the enormous amount of data involved in the processing, storage, and transmission of a hologram in digital format is a critical issue.

Coupling Distributed and Symbolic Execution for Natural Language Queries

no code implementations ICML 2017 Lili Mou, Zhengdong Lu, Hang Li, Zhi Jin

Building neural networks to query a knowledge base (a table) with natural language is an emerging research topic in deep learning.

Compressing Neural Language Models by Sparse Word Representations

1 code implementation ACL 2016 Yunchuan Chen, Lili Mou, Yan Xu, Ge Li, Zhi Jin

Such approaches are time- and memory-intensive because of the large numbers of parameters for word embeddings and the output layer.

Language Modelling · Word Embeddings
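A back-of-the-envelope illustration of why the word embeddings and the output layer dominate the parameter count of a word-level neural language model; the vocabulary and layer sizes below are assumptions.

# Hedged arithmetic sketch of where the parameters go in a word-level LM.
vocab_size, embed_dim, hidden_dim = 100_000, 300, 512
embedding_params = vocab_size * embed_dim              # input lookup table
output_params = vocab_size * hidden_dim + vocab_size   # softmax weights + bias
print(f"embeddings: {embedding_params:,}")             # 30,000,000
print(f"output layer: {output_params:,}")              # 51,300,000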

How Transferable are Neural Networks in NLP Applications?

no code implementations EMNLP 2016 Lili Mou, Zhao Meng, Rui Yan, Ge Li, Yan Xu, Lu Zhang, Zhi Jin

Transfer learning aims to make use of valuable knowledge in a source domain to improve model performance in a target domain.

Transfer Learning

Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation

no code implementations COLING 2016 Yan Xu, Ran Jia, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin

However, existing neural networks for relation classification are usually of shallow architectures (e.g., one-layer convolutional neural networks or recurrent networks).

Classification · Data Augmentation +2

Backward and Forward Language Modeling for Constrained Sentence Generation

no code implementations 21 Dec 2015 Lili Mou, Rui Yan, Ge Li, Lu Zhang, Zhi Jin

Provided a specific word, we use RNNs to generate previous words and future words, either simultaneously or asynchronously, resulting in two model variants.

Language Modelling · Machine Translation +2
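A toy illustration of generating a sentence around a given word, with one table standing in for the backward model and one for the forward model; both tables are assumptions, not trained RNNs.

# Hedged sketch: extend a sentence to the left and to the right of a seed
# word, then join the two halves.
BACKWARD = {"cat": ["the"], "the": []}          # predicts the *previous* word
FORWARD = {"cat": ["sat"], "sat": ["down"], "down": []}

def generate_around(word):
    left, cur = [], word
    while BACKWARD.get(cur):
        cur = BACKWARD[cur][0]
        left.insert(0, cur)
    right, cur = [], word
    while FORWARD.get(cur):
        cur = FORWARD[cur][0]
        right.append(cur)
    return left + [word] + right

print(generate_around("cat"))  # ['the', 'cat', 'sat', 'down']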

On End-to-End Program Generation from User Intention by Deep Neural Networks

no code implementations 25 Oct 2015 Lili Mou, Rui Men, Ge Li, Lu Zhang, Zhi Jin

This paper envisions an end-to-end program generation scenario using recurrent neural networks (RNNs): users can express their intention in natural language; an RNN then automatically generates the corresponding code in a character-by-character fashion.

A Comparative Study on Regularization Strategies for Embedding-based Neural Networks

no code implementations EMNLP 2015 Hao Peng, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin

This paper aims to compare different regularization strategies to address a common phenomenon, severe overfitting, in embedding-based neural networks for NLP.
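A hedged PyTorch sketch of two strategies commonly compared in this setting: dropout on the embedding output and an L2 penalty via weight decay. The choice of these two strategies and all hyperparameters are assumptions for illustration.

# Hedged sketch: embedding dropout plus L2 regularization (weight decay)
# in a tiny embedding-based classifier.
import torch
import torch.nn as nn

embedding = nn.Embedding(5000, 100)
dropout = nn.Dropout(p=0.5)                      # applied to embedding outputs
classifier = nn.Linear(100, 2)

params = list(embedding.parameters()) + list(classifier.parameters())
optimizer = torch.optim.SGD(params, lr=0.1, weight_decay=1e-4)  # L2 penalty

tokens = torch.randint(0, 5000, (32, 10))
features = dropout(embedding(tokens)).mean(dim=1)   # average-pooled sentence vectors
loss = nn.functional.cross_entropy(classifier(features), torch.randint(0, 2, (32,)))
loss.backward()
optimizer.step()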

Distilling Word Embeddings: An Encoding Approach

no code implementations 15 Jun 2015 Lili Mou, Ran Jia, Yan Xu, Ge Li, Lu Zhang, Zhi Jin

Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly needed in various resource-restricted systems.

Word Embeddings

Convolutional Neural Networks over Tree Structures for Programming Language Processing

7 code implementations 18 Sep 2014 Lili Mou, Ge Li, Lu Zhang, Tao Wang, Zhi Jin

Programming language processing (similar to natural language processing) is a hot research topic in the field of software engineering; it has also attracted growing interest in the artificial intelligence community.

Building Program Vector Representations for Deep Learning

1 code implementation 11 Sep 2014 Lili Mou, Ge Li, Yuxuan Liu, Hao Peng, Zhi Jin, Yan Xu, Lu Zhang

In this pioneering paper, we propose the "coding criterion" to build program vector representations, which are a prerequisite for applying deep learning to program analysis.

Representation Learning
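A rough numpy sketch of a coding-criterion-style objective, under the assumption that a parent AST node's vector should be reconstructable from a learned combination of its children's vectors; the exact formulation is not taken from this page.

# Hedged sketch: the reconstruction distance between a parent vector and a
# combination of its children's vectors serves as a training loss. The tanh
# and averaging details are assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim = 16
W = rng.normal(scale=0.1, size=(dim, dim))      # shared child-combination weights
b = np.zeros(dim)

parent = rng.normal(size=dim)                   # current vector of the parent node
children = [rng.normal(size=dim) for _ in range(3)]

reconstruction = np.tanh(sum(W @ c for c in children) / len(children) + b)
loss = np.sum((parent - reconstruction) ** 2)   # to be minimized over vectors and W
print(f"coding-criterion loss: {loss:.3f}")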
