Keeping Notes: Conditional Natural Language Generation with a Scratchpad Encoder

ACL 2019 · Ryan Benmalek, Madian Khabsa, Suma Desu, Claire Cardie, Michele Banko

We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture, and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks. By enabling the decoder at each time step to write to all of the encoder output layers, Scratchpad can employ the encoder as a "scratchpad" memory to keep track of what has been generated so far and thereby guide future generation...
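The abstract describes the mechanism only at a high level. Below is a minimal, hypothetical sketch (Python/PyTorch) of what one decoding step with such a scratchpad write could look like, assuming a per-position gated overwrite of the encoder states driven by the decoder state and attention context; the module and parameter names (ScratchpadStep, write_gate, write_update) are illustrative and not the authors' implementation.

# Hypothetical sketch of a scratchpad-style decoding step (not the paper's code).
# Assumption: each step attends over the current encoder states, then writes a
# gated update back into them so later steps see what has been generated so far.
import torch
import torch.nn as nn


class ScratchpadStep(nn.Module):
    """One decoding step: attend over encoder states, then update them."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.attn = nn.Linear(hidden_size, hidden_size)               # attention scoring
        self.write_update = nn.Linear(2 * hidden_size, hidden_size)   # shared update vector
        self.write_gate = nn.Linear(3 * hidden_size, 1)               # per-position keep/overwrite gate

    def forward(self, dec_h: torch.Tensor, enc: torch.Tensor):
        # dec_h: (B, H) decoder hidden state; enc: (B, S, H) encoder outputs.
        # 1) Standard attention over the (current) encoder states.
        scores = torch.bmm(enc, self.attn(dec_h).unsqueeze(2)).squeeze(2)      # (B, S)
        alpha = torch.softmax(scores, dim=1)
        context = torch.bmm(alpha.unsqueeze(1), enc).squeeze(1)                # (B, H)

        # 2) Scratchpad write: an update vector from decoder state + context,
        #    and a per-position gate deciding how much of each state to keep.
        update = torch.tanh(self.write_update(torch.cat([dec_h, context], dim=1)))  # (B, H)
        gate_in = torch.cat(
            [dec_h.unsqueeze(1).expand_as(enc),
             context.unsqueeze(1).expand_as(enc),
             enc], dim=2)                                                       # (B, S, 3H)
        keep = torch.sigmoid(self.write_gate(gate_in))                          # (B, S, 1)
        enc_new = keep * enc + (1.0 - keep) * update.unsqueeze(1)               # gated overwrite

        return context, enc_new


# Usage: the updated encoder states are fed to the next decoding step.
if __name__ == "__main__":
    B, S, H = 2, 7, 16
    step = ScratchpadStep(H)
    enc = torch.randn(B, S, H)
    dec_h = torch.randn(B, H)
    context, enc = step(dec_h, enc)
    print(context.shape, enc.shape)  # torch.Size([2, 16]) torch.Size([2, 7, 16])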

