Discourse Embellishment Using a Deep Encoder-Decoder Network

WS 2018 · Leonid Berov, Kai Standvoss

We suggest a new NLG task in the context of the discourse generation pipeline of computational storytelling systems. This task, textual embellishment, is defined as taking a text as input and generating a semantically equivalent output with increased lexical and syntactic complexity. Ideally, this would allow the authors of computational storytellers to implement only lightweight NLG systems and rely on a domain-independent embellishment module to translate their output into more literary text. We present promising first results on this task using LSTM Encoder-Decoder networks trained on the WikiLarge dataset. Furthermore, we introduce "Compiled Computer Tales", a corpus of computationally generated stories that can be used to test the capabilities of embellishment algorithms.
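To make the architecture named in the abstract concrete, below is a minimal, untrained sketch of an LSTM encoder-decoder: the encoder reads a sequence of source token ids into a fixed hidden state, and the decoder greedily emits target token ids from that state. All names (`LSTMCell`, `EncoderDecoder`), dimensions, and the random initialization are illustrative assumptions, not the paper's implementation, which would additionally require training on sentence pairs such as WikiLarge.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class LSTMCell:
    """Single LSTM cell with the four gates stacked in one weight matrix (toy)."""

    def __init__(self, input_dim, hidden_dim, rng):
        self.hidden_dim = hidden_dim
        # Rows 0..H: input gate, H..2H: forget gate, 2H..3H: cell candidate, 3H..4H: output gate.
        self.W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i = sigmoid(z[0:H])          # input gate
        f = sigmoid(z[H:2 * H])      # forget gate
        g = np.tanh(z[2 * H:3 * H])  # cell candidate
        o = sigmoid(z[3 * H:4 * H])  # output gate
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new


class EncoderDecoder:
    """Toy seq2seq model: encoder LSTM summarizes the source, decoder LSTM generates."""

    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32, seed=0):
        rng = np.random.default_rng(seed)
        self.embed = rng.standard_normal((vocab_size, embed_dim)) * 0.1
        self.encoder = LSTMCell(embed_dim, hidden_dim, rng)
        self.decoder = LSTMCell(embed_dim, hidden_dim, rng)
        self.out_proj = rng.standard_normal((vocab_size, hidden_dim)) * 0.1

    def encode(self, src_ids):
        h = np.zeros(self.encoder.hidden_dim)
        c = np.zeros(self.encoder.hidden_dim)
        for tok in src_ids:  # fold the whole source sentence into (h, c)
            h, c = self.encoder.step(self.embed[tok], h, c)
        return h, c

    def decode(self, h, c, bos_id, max_len=20):
        out = []
        tok = bos_id
        for _ in range(max_len):  # greedy decoding from the encoder state
            h, c = self.decoder.step(self.embed[tok], h, c)
            tok = int(np.argmax(self.out_proj @ h))
            out.append(tok)
        return out


model = EncoderDecoder(vocab_size=50)
h, c = model.encode([3, 7, 12, 4])   # ids of a hypothetical "plain" input sentence
generated = model.decode(h, c, bos_id=1, max_len=10)
print(generated)
```

Trained end-to-end on (plain, embellished) sentence pairs, the same skeleton learns to map a simple input sentence to a lexically and syntactically richer paraphrase; here the random weights only demonstrate the data flow.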

