CTRL is a conditional transformer language model, trained to condition on control codes that govern style, content, and task-specific behavior. The control codes were derived from structure that naturally co-occurs with raw text, preserving the advantages of unsupervised learning while providing more explicit control over text generation. These codes also allow CTRL to predict which parts of the training data are most likely given a sequence.
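The last capability is model-based source attribution: by Bayes' rule, p(code | sequence) ∝ p(sequence | code) · p(code). The sketch below is a toy illustration of that idea only, not the real CTRL model — each control code is stood in for by a hand-made unigram distribution, whereas CTRL learns p(token | code, history) with a Transformer after prepending the code to the sequence.

```python
import math

# Toy stand-in for CTRL (assumption: real CTRL uses a Transformer, not
# unigram tables). Each control code gets a distribution over a tiny vocab.
CODE_MODELS = {
    "Reviews":   {"great": 0.4,  "product": 0.3,  "physics": 0.05, "theory": 0.05, "the": 0.2},
    "Wikipedia": {"great": 0.05, "product": 0.05, "physics": 0.4,  "theory": 0.3,  "the": 0.2},
}
CODE_PRIOR = {"Reviews": 0.5, "Wikipedia": 0.5}

def log_likelihood(code, tokens):
    """log p(tokens | code) under the toy per-code model."""
    return sum(math.log(CODE_MODELS[code][t]) for t in tokens)

def attribute_source(tokens):
    """Bayes rule: p(code | tokens) proportional to p(tokens | code) * p(code).
    This mirrors CTRL's model-based source attribution."""
    scores = {c: log_likelihood(c, tokens) + math.log(CODE_PRIOR[c])
              for c in CODE_MODELS}
    z = math.log(sum(math.exp(s) for s in scores.values()))
    return {c: math.exp(s - z) for c, s in scores.items()}

posterior = attribute_source(["great", "product"])
best_code = max(posterior, key=posterior.get)  # "Reviews" wins here
```

Swapping the unigram tables for per-code sequence likelihoods from a trained conditional language model recovers the attribution procedure the paper describes.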
Source: CTRL: A Conditional Transformer Language Model for Controllable Generation
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 5 | 12.20% |
| Text Generation | 4 | 9.76% |
| Self-Supervised Learning | 3 | 7.32% |
| Backdoor Attack | 2 | 4.88% |
| Graph Representation Learning | 1 | 2.44% |
| Click-Through Rate Prediction | 1 | 2.44% |
| Recommendation Systems | 1 | 2.44% |
| Large Language Model | 1 | 2.44% |
| 3D Object Detection | 1 | 2.44% |