8 papers with code • 1 benchmark • 4 datasets
Text-to-Code Generation is the task of generating source code from a natural language description.
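To make the task interface concrete, here is a deliberately trivial sketch: a natural language description goes in, a code snippet comes out. The keyword-overlap retrieval below is a toy baseline invented for illustration only; it is not any of the published models on this page, which use large pretrained sequence-to-sequence Transformers.

```python
# Toy text-to-code interface: NL description in, code snippet out.
# This keyword-overlap lookup over a hand-made table is illustrative
# only, NOT a published method.

SNIPPETS = {
    ("reverse", "string"): "def reverse_string(s):\n    return s[::-1]",
    ("sum", "list"): "def sum_list(xs):\n    return sum(xs)",
    ("read", "file"): "def read_file(path):\n    with open(path) as f:\n        return f.read()",
}

def generate_code(description: str) -> str:
    """Return the snippet whose keywords best overlap the description."""
    words = set(description.lower().split())
    best_keys, best_score = None, 0
    for keys, code in SNIPPETS.items():
        score = len(words & set(keys))
        if score > best_score:
            best_keys, best_score = keys, score
    return SNIPPETS[best_keys] if best_keys else ""
```

For example, `generate_code("reverse a string in python")` retrieves the string-reversal snippet. Neural models replace this lookup with learned generation, but the input/output contract is the same.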
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
We present CodeT5, a unified pre-trained encoder-decoder Transformer model that better leverages the code semantics conveyed from the developer-assigned identifiers.
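The "identifier-aware" idea can be sketched with a small masking routine: distinct identifiers in a code fragment are replaced by sentinel tokens, which is roughly the kind of signal CodeT5's identifier-oriented pretraining objectives exploit. This is a simplified illustration built on Python's `tokenize` module, not CodeT5's actual implementation or tokenizer.

```python
import io
import keyword
import tokenize

def mask_identifiers(code: str):
    """Replace each distinct identifier in Python code with a sentinel
    <IDk>, leaving keywords and operators intact. Returns the masked
    token list and the identifier-to-sentinel mapping. A toy sketch of
    identifier-aware masking, not CodeT5's real objective."""
    mapping = {}
    masked = []
    for tok in tokenize.generate_tokens(io.StringIO(code).readline):
        if tok.type == tokenize.NAME and not keyword.iskeyword(tok.string):
            # First sight of an identifier mints a fresh sentinel.
            masked.append(mapping.setdefault(tok.string, f"<ID{len(mapping)}>"))
        elif tok.string.strip():
            # Keep keywords, operators, and literals; drop pure layout tokens.
            masked.append(tok.string)
    return masked, mapping
```

Running it on `"def add(a, b):\n    return a + b\n"` maps `add`, `a`, `b` to `<ID0>`, `<ID1>`, `<ID2>` while `def` and `return` survive, so a model trained on such data must recover the developer-assigned names from context.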
This paper addresses the problem of code generation, where the goal is to generate target code given source code in a different language or a natural language description.
We present PanGu-Coder, a pretrained decoder-only language model adopting the PanGu-Alpha architecture for text-to-code generation, i.e., the synthesis of programming language solutions given a natural language problem description.
In the Copy Phase, a binary classifier is employed to determine and mask the pseudocode tokens that can be directly copied into the code.
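The Copy Phase can be approximated with a simple heuristic in place of the trained binary classifier: mark a pseudocode token as copyable if it looks like a code token (identifier, number, or operator) rather than a natural language word, then mask the copyable positions. Every name below (`copy_mask`, the word list, the regex) is an illustrative stand-in, not taken from the paper.

```python
import re

# Crude stand-in for the paper's learned binary classifier: a token is
# "copyable" if it matches a code-token shape and is not a common NL word.
CODE_TOKEN = re.compile(r"^(?:[A-Za-z_]\w*|\d+|[-+*/%=<>!]=?|[()\[\]{},.:])$")
NL_WORDS = {"the", "a", "an", "of", "to", "set"}

def copy_mask(pseudo_tokens):
    """Return (mask, masked): mask[i] is True if token i could be copied
    verbatim into the generated code; such tokens are replaced by <COPY>."""
    mask = [bool(CODE_TOKEN.match(t)) and t.lower() not in NL_WORDS
            for t in pseudo_tokens]
    masked = ["<COPY>" if m else t for t, m in zip(pseudo_tokens, mask)]
    return mask, masked
```

On the pseudocode `set x to y + 1`, the heuristic flags `x`, `y`, `+`, and `1` as directly copyable while leaving the natural language scaffolding (`set`, `to`) for the generator to rewrite, which mirrors the role of the real classifier's mask.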