Text generation is the task of generating text with the goal of being indistinguishable from human-written text.
Additionally, text generation models are typically trained via maximum likelihood and teacher forcing.
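As a rough illustration of that training setup, the sketch below runs one maximum-likelihood step with teacher forcing: the model always receives the gold prefix as input and is scored on predicting the next gold token. All model, size, and variable names here are illustrative, not from any specific paper.

```python
import torch
import torch.nn as nn

class ToyLM(nn.Module):
    """Toy autoregressive language model: embedding -> LSTM -> vocab logits."""
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, state=None):
        hidden, state = self.lstm(self.embed(tokens), state)
        return self.proj(hidden), state

model = ToyLM()
optimizer = torch.optim.Adam(model.parameters())
loss_fn = nn.CrossEntropyLoss()

# Teacher forcing: the input at step t is the gold token t, and the
# target is the gold token t+1, regardless of what the model would
# have generated on its own.
batch = torch.randint(0, 10_000, (32, 21))        # fake token ids
inputs, targets = batch[:, :-1], batch[:, 1:]
logits, _ = model(inputs)
loss = loss_fn(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
loss.backward()
optimizer.step()
```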
This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.
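Generation then amounts to running the trained network one step at a time and feeding each sampled token back in as the next input. Continuing the toy model above (the start-of-sequence id is an assumption):

```python
# Autoregressive sampling: predict one token, feed it back, repeat.
start_id = 1                      # assumed start-of-sequence token id
tokens = torch.tensor([[start_id]])
state, generated = None, []
with torch.no_grad():
    for _ in range(20):
        logits, state = model(tokens, state)
        probs = torch.softmax(logits[:, -1], dim=-1)
        tokens = torch.multinomial(probs, num_samples=1)
        generated.append(tokens.item())
```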
We observe that our method consistently outperforms BS and previously proposed techniques for diverse decoding from neural sequence models.
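The core of this diverse decoding idea can be sketched simply: the beam is split into groups, and each group's log-probabilities are penalized for tokens that earlier groups already picked at the same time step (a Hamming diversity penalty). A simplified version, with hypothetical names, not the paper's full algorithm:

```python
import torch

def penalize_for_diversity(log_probs, earlier_group_tokens, strength=0.5):
    """Hamming-diversity penalty from diverse beam search (simplified).

    log_probs: (beam_width, vocab) log-probabilities for the current group.
    earlier_group_tokens: token ids already chosen by earlier groups
        at this time step.
    """
    penalized = log_probs.clone()
    for tok in earlier_group_tokens:
        penalized[:, tok] -= strength
    return penalized
```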
In this work, we introduce a model and beam-search training scheme, based on the work of Daumé III and Marcu (2005), that extends seq2seq to learn global sequence scores.
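Schematically, learning global sequence scores amounts to a margin constraint: the gold prefix should outscore the best incorrect candidate found by beam search. A hinge-style sketch of such an objective (hypothetical names, not the paper's exact loss):

```python
import torch

def sequence_margin_loss(gold_score, best_wrong_score, margin=1.0):
    """Hinge loss pushing the gold sequence's global score above the
    highest-scoring violating beam candidate by at least `margin`."""
    return torch.clamp(margin + best_wrong_score - gold_score, min=0.0)
```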
As a new way of training generative models, Generative Adversarial Networks (GANs), which use a discriminative model to guide the training of the generative model, have enjoyed considerable success in generating real-valued data.
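Because sampled tokens are discrete, the discriminator's signal cannot be backpropagated through the generator directly; this line of work instead treats the generator as a policy and applies a REINFORCE-style update, using the discriminator's probability that a sequence is real as the reward. A schematic update with hypothetical names:

```python
import torch

def generator_pg_step(token_log_probs, rewards, optimizer):
    """REINFORCE-style generator update.

    token_log_probs: (batch, seq_len) log-probabilities the generator
        assigned to its own sampled tokens (requires grad).
    rewards: (batch,) discriminator scores for the sampled sequences,
        treated as constants.
    """
    loss = -(token_log_probs.sum(dim=1) * rewards.detach()).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```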
We introduce Texar, an open-source toolkit aiming to support the broad set of text generation tasks that transform any input into natural language, such as machine translation, summarization, dialog, content manipulation, and so forth.
End-to-end models for goal-oriented dialogue are challenging to train, because linguistic and strategic aspects are entangled in latent state vectors.
We introduce Texygen, a benchmarking platform to support research on open-domain text generation models.