Generating Sequences With Recurrent Neural Networks

4 Aug 2013 · Alex Graves

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
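To make the "predicting one data point at a time" recipe concrete, here is a minimal character-level sketch. It is not the paper's exact setup (which used deep LSTM stacks with skip connections, and a 7-layer LSTM for the enwik8 result below); PyTorch, the toy corpus, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy corpus; the paper used enwik8 (Wikipedia text) for the discrete case.
text = "the quick brown fox jumps over the lazy dog " * 20
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text])

class CharLSTM(nn.Module):
    """LSTM that outputs a distribution over the next character."""
    def __init__(self, vocab, hidden=128, layers=1):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train on one-step-ahead prediction: target is the input shifted by one.
seq_len = 64
for step in range(200):
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Generation: sample one character at a time, feeding each sample back in.
x = data[:1].unsqueeze(0)
state = None
out = []
with torch.no_grad():
    for _ in range(100):
        logits, state = model(x, state)
        probs = torch.softmax(logits[0, -1], dim=-1)
        nxt = torch.multinomial(probs, 1)
        out.append(itos[nxt.item()])
        x = nxt.view(1, 1)
print("".join(out))
```

The sampling loop is the key design point: a network trained only to predict one step ahead can generate arbitrarily long sequences by treating its own samples as the next inputs.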


Datasets

enwik8 · IAM-OnDB
Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Language Modelling | enwik8 | LSTM (7 layers) | Bit per Character (BPC) | 1.67 | #41 |
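Bits per character (BPC) is the average negative log2-probability the model assigns to each character, so lower is better. A minimal helper for converting a mean cross-entropy loss (measured in nats, as in the training sketch above) to BPC; the 1.16 input is just the table's 1.67 BPC converted back, for illustration:

```python
import math

def bits_per_character(mean_nll_nats: float) -> float:
    # Cross-entropy in nats divided by ln 2 gives bits per character.
    return mean_nll_nats / math.log(2)

print(bits_per_character(1.16))  # ~1.67, matching the enwik8 row above
```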

Methods


No methods listed for this paper.