Code Documentation Generation is a supervised task in which the model takes a code function as input and generates the documentation for that function.
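A minimal sketch of what one supervised training example for this task looks like: the input is a function's source code and the target is its documentation. The `make_example` helper and field names below are illustrative, not from any specific dataset or paper.

```python
def make_example(code: str, doc: str) -> dict:
    """Bundle a (code, documentation) pair as one supervised example.

    The model is trained to map example["input"] -> example["target"].
    """
    return {"input": code, "target": doc}


# Hypothetical training pair: a Python function and its docstring.
example = make_example(
    code="def add(a, b):\n    return a + b",
    doc="Return the sum of a and b.",
)
print(example["target"])
```

In practice such pairs are mined at scale from open-source code (e.g. functions paired with their existing docstrings) and used to fine-tune a sequence-to-sequence model.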
Description from: CodeTrans: Towards Cracking the Language of Silicone's Code Through Self-Supervised Deep Learning and High Performance Computing
Results show that CodeBERT achieves state-of-the-art performance on both natural language code search and code documentation generation tasks.
Meanwhile, transformer models, especially in combination with transfer learning, have proven to be a powerful technique for natural language processing tasks.
Related tasks: API Sequence Recommendation, Code Comment Generation, Contextual Embedding for Source Code, Git Commit Message Generation, Multi-Task Learning, Program Synthesis, Source Code Summarization