CTRL: A Conditional Transformer Language Model for Controllable Generation

Preprint 2019 · Nitish Shirish Keskar, Bryan McCann, Lav R. Varshney, Caiming Xiong, Richard Socher

Large-scale language models show promising text generation capabilities, but users cannot easily control particular aspects of the generated text. We release CTRL, a 1.6 billion-parameter conditional transformer language model, trained to condition on control codes that govern style, content, and task-specific behavior...
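To make the conditioning idea concrete: a control code is a reserved token prepended to the input sequence, so the model learns the distribution p(x_t | code, x_<t). The sketch below illustrates this at the token level only; `build_input` and the `<Reviews>` code format are hypothetical, not the authors' implementation or vocabulary.

```python
# Minimal sketch of control-code conditioning (hypothetical helper, not
# the released CTRL code): prepend a reserved code token to the prompt so
# generation is conditioned on it.

def build_input(control_code, prompt_tokens):
    """Prepend a control-code token to a list of prompt tokens."""
    return [f"<{control_code}>"] + list(prompt_tokens)

tokens = build_input("Reviews", ["This", "movie", "was"])
# tokens → ['<Reviews>', 'This', 'movie', 'was']
```

At inference time, swapping the code (e.g. `Reviews` for a different domain code) steers the same prompt toward a different style or domain.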

