CodeGen is an autoregressive transformer trained with next-token prediction language modeling as the learning objective, on a natural language corpus and programming language data curated from GitHub.
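Because CodeGen is a standard decoder-only causal language model, its released checkpoints can be loaded through the Hugging Face `transformers` causal-LM interface. Below is a minimal sketch, assuming `transformers` and PyTorch are installed and using the publicly released `Salesforce/codegen-350M-mono` checkpoint; the prompt and generation parameters are illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load a released CodeGen checkpoint; "350M-mono" is the smallest
# Python-only variant (larger sizes follow the same naming pattern).
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen-350M-mono")
model = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-350M-mono")

# Next-token prediction: the model autoregressively completes the prompt.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=48, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```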
Source: *CodeGen: An Open Large Language Model for Code with Multi-Turn Program Synthesis*
| Task | Papers | Share |
|---|---|---|
| Code Generation | 19 | 41.30% |
| Language Modelling | 5 | 10.87% |
| Program Synthesis | 4 | 8.70% |
| Language Modeling | 3 | 6.52% |
| Memorization | 2 | 4.35% |
| In-Context Learning | 2 | 4.35% |
| Large Language Model | 2 | 4.35% |
| Benchmarking | 2 | 4.35% |
| Vulnerability Detection | 1 | 2.17% |