Story Generation
74 papers with code • 5 benchmarks • 7 datasets
Story generation is the task of automatically generating a coherent narrative, often from a set of premises or a brief summary.
Most implemented papers
A Skeleton-Based Model for Promoting Coherence Among Sentences in Narrative Story Generation
Compared to the state-of-the-art models, our skeleton-based model can generate significantly more coherent text according to human evaluation and automatic evaluation.
Controllable Neural Story Plot Generation via Reward Shaping
Language-model-based approaches to story plot generation attempt to construct a plot by sampling from a language model (LM) to predict the next character, word, or sentence to add to the story.
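The sampling loop described above can be sketched with a toy model. This is an illustrative assumption, not the paper's actual system: a hand-built bigram table stands in for a trained language model, and a story is generated by repeatedly sampling the next word until an end token is drawn.

```python
import random

# Toy bigram "language model": hand-written next-word lists stand in for
# a trained LM's predictive distribution (illustrative only).
BIGRAMS = {
    "<s>": ["the", "a"],
    "the": ["knight", "dragon"],
    "a": ["storm"],
    "knight": ["rode", "slept"],
    "dragon": ["woke"],
    "storm": ["came"],
    "rode": ["onward", "</s>"],
    "slept": ["</s>"],
    "woke": ["</s>"],
    "came": ["</s>"],
    "onward": ["</s>"],
}

def sample_story(max_words=10, seed=None):
    """Build a story word by word: sample a successor of the current
    word until the end-of-story token "</s>" is drawn."""
    rng = random.Random(seed)
    word, story = "<s>", []
    for _ in range(max_words):
        word = rng.choice(BIGRAMS.get(word, ["</s>"]))
        if word == "</s>":
            break
        story.append(word)
    return " ".join(story)

print(sample_story(seed=0))
```

Real systems replace the bigram table with a neural LM's softmax distribution, but the generation loop — predict, sample, append, repeat — has the same shape.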
Plan, Write, and Revise: an Interactive System for Open-Domain Story Generation
We compare different varieties of interaction in story-writing, story-planning, and diversity controls under time constraints, and show that richer human collaboration at both the planning and writing stages results in a 10-50% improvement in story quality compared to less interactive baselines.
Informative Visual Storytelling with Cross-modal Rules
To solve this problem, we propose a method to mine the cross-modal rules to help the model infer these informative concepts given certain visual input.
Improving Neural Story Generation by Targeted Common Sense Grounding
Stories generated with neural language models have shown promise in grammatical and stylistic consistency.
Story Realization: Expanding Plot Events into Sentences
Neural network based approaches to automated story plot generation attempt to learn how to generate novel plots from a corpus of natural language plot summaries.
Do Massively Pretrained Language Models Make Better Storytellers?
Large neural language models trained on massive amounts of text have emerged as a formidable strategy for Natural Language Understanding tasks.
Knowledge-Enriched Visual Storytelling
This paper introduces KG-Story, a three-stage framework that allows the story generation model to take advantage of external Knowledge Graphs to produce interesting stories.
A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
To further capture the causal and temporal dependencies between the sentences in a reasonable story, we employ multi-task learning which combines a discriminative objective to distinguish true and fake stories during fine-tuning.
Semantics of the Unwritten: The Effect of End of Paragraph and Sequence Tokens on Text Generation with GPT2
Experimental results show that the Chinese GPT-2 can generate better essay endings with the end-of-paragraph (EOP) token.