Code Comment Generation
4 papers with code • 1 benchmark • 2 datasets
Latest papers with no code
APIContext2Com: Code Comment Generation by Incorporating Pre-Defined API Documentation
The API context includes the definitions and descriptions of the pre-defined APIs used within the code snippets.
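The idea of pairing each API call in a snippet with its documentation can be sketched as below. The `API_DOCS` lookup table is a stand-in assumption for illustration; the paper draws this context from actual pre-defined API documentation.

```python
import ast

# Hypothetical mini documentation table (assumption for illustration only);
# APIContext2Com would use real pre-defined API docs instead.
API_DOCS = {
    "open": "Open a file and return a file object.",
    "json.loads": "Deserialize a JSON string to a Python object.",
}

def api_context(snippet: str) -> list[tuple[str, str]]:
    """Collect (API name, description) pairs for the calls used in a snippet."""
    tree = ast.parse(snippet)
    context = []
    for node in ast.walk(tree):
        if isinstance(node, ast.Call):
            # Recover a dotted name such as "json.loads" from the call target.
            parts, target = [], node.func
            while isinstance(target, ast.Attribute):
                parts.append(target.attr)
                target = target.value
            if isinstance(target, ast.Name):
                parts.append(target.id)
            name = ".".join(reversed(parts))
            if name in API_DOCS:
                context.append((name, API_DOCS[name]))
    return context

# Example: gather API context for a two-line snippet.
ctx = api_context("import json\ndata = json.loads(open('f.json').read())")
```

The collected pairs would then be fed to the comment generator alongside the code tokens themselves.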
LAMNER: Code Comment Generation Using Character Language Model and Named Entity Recognition
Although researchers have studied multiple ways to generate code comments automatically, previous work mainly represents a code token only in its full semantic form (e.g., a language model learns the semantics of a code token), with additional code properties, such as the tree structure of the code, included only as auxiliary input to the model.
InferCode: Self-Supervised Learning of Code Representations by Predicting Subtrees
We trained an InferCode model instance, using a Tree-based CNN as the encoder, on a large set of Java code and applied it to downstream unsupervised tasks such as code clustering, code clone detection, and cross-language code search, or reused it under a transfer-learning scheme to continue training the model weights for supervised tasks such as code classification and method name prediction.
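The self-supervision signal in InferCode comes from the subtrees of a program's AST: the encoder is trained to predict which subtrees occur in a given code fragment, so no manual labels are needed. A minimal sketch of extracting such subtree labels from a Python AST (the paper works over Java ASTs; the serialization format here is an assumption for illustration):

```python
import ast
from collections import Counter

def subtree_labels(code: str) -> Counter:
    """Enumerate serialized AST subtrees of a code fragment.

    In InferCode-style training, these serialized subtrees serve as the
    prediction targets for the encoder (here a stand-in for the paper's
    Tree-based CNN), giving a label set derived from the code itself.
    """
    def serialize(node: ast.AST) -> str:
        children = [serialize(c) for c in ast.iter_child_nodes(node)]
        if children:
            return f"({type(node).__name__} {' '.join(children)})"
        return type(node).__name__

    labels = Counter()
    for node in ast.walk(ast.parse(code)):
        # Only non-leaf nodes form interesting subtree targets.
        if any(True for _ in ast.iter_child_nodes(node)):
            labels[serialize(node)] += 1
    return labels

# Example: subtree targets for a one-line assignment.
targets = subtree_labels("x = 1")
```

Because the labels are generated from the code itself, the same procedure scales to any unlabeled corpus.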
Learning to Represent Programs with Heterogeneous Graphs
To exploit node and edge type information, we bring the idea of heterogeneous graphs to learning on source code and present a new formalism for building heterogeneous program graphs from ASTs with additional type information for nodes and edges.
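The construction above can be sketched as follows: each AST node becomes a graph node typed by its AST class, and edges carry a relation type. The two relation types used here (`child` and `next_sib`) are a simplifying assumption for illustration, not the paper's exact edge-type inventory, and the sketch uses Python ASTs rather than the paper's setting.

```python
import ast

def heterogeneous_graph(code: str):
    """Build a toy heterogeneous program graph from a Python AST.

    Nodes are labeled with their AST node type; edges are labeled with a
    relation type: 'child' (parent -> child) or 'next_sib' (sibling order).
    """
    tree = ast.parse(code)
    # Assign an integer id and a type label to every AST node.
    index = {id(n): (i, type(n).__name__) for i, n in enumerate(ast.walk(tree))}
    edges = []
    for node in ast.walk(tree):
        children = list(ast.iter_child_nodes(node))
        for child in children:
            edges.append((index[id(node)][0], "child", index[id(child)][0]))
        # Encode left-to-right sibling order as a second edge type.
        for a, b in zip(children, children[1:]):
            edges.append((index[id(a)][0], "next_sib", index[id(b)][0]))
    node_types = [t for _, t in sorted(index.values())]
    return node_types, edges

# Example: typed graph for a one-line assignment.
node_types, edges = heterogeneous_graph("x = 1")
```

A heterogeneous graph neural network would then learn separate message-passing parameters per node and edge type, which is the point of keeping the type information explicit.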
Towards Context-Aware Code Comment Generation
Code comments are vital for software maintenance and comprehension, but many software projects suffer from the lack of meaningful and up-to-date comments in practice.
TAG : Type Auxiliary Guiding for Code Comment Generation
Existing leading code comment generation approaches built on the structure-to-sequence framework ignore the type information in the interpretation of the code, e.g., operator, string, etc.