
Context-aware Natural Language Generation with Recurrent Neural Networks

This paper studies natural language generation in particular contexts or situations. We propose two novel approaches that encode the context into a continuous semantic representation and then decode that representation into text sequences with recurrent neural networks...
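As a rough illustration of the encode-then-decode idea described above (not the paper's exact architecture), the sketch below embeds a discrete context identifier into a continuous vector and uses it to initialize a GRU decoder that generates the token sequence. All module names, layer sizes, and the choice of GRU are illustrative assumptions.

```python
# Minimal sketch of context-conditioned generation with an RNN decoder.
# Assumptions: contexts are discrete ids, decoding is teacher-forced.
import torch
import torch.nn as nn

class ContextToSequence(nn.Module):
    def __init__(self, num_contexts, vocab_size,
                 ctx_dim=128, hidden_dim=256, emb_dim=128):
        super().__init__()
        # Encode the context into a continuous semantic representation.
        self.ctx_encoder = nn.Embedding(num_contexts, ctx_dim)
        # Project the context vector to the decoder's initial hidden state.
        self.ctx_to_hidden = nn.Linear(ctx_dim, hidden_dim)
        self.token_emb = nn.Embedding(vocab_size, emb_dim)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_ids, input_tokens):
        # context_ids: (batch,); input_tokens: (batch, seq_len), shifted-right targets
        ctx = self.ctx_encoder(context_ids)                    # (batch, ctx_dim)
        h0 = torch.tanh(self.ctx_to_hidden(ctx)).unsqueeze(0)  # (1, batch, hidden_dim)
        emb = self.token_emb(input_tokens)                     # (batch, seq_len, emb_dim)
        dec_out, _ = self.decoder(emb, h0)                     # decode conditioned on context
        return self.out(dec_out)                               # (batch, seq_len, vocab_size)

# Example usage with toy sizes:
model = ContextToSequence(num_contexts=10, vocab_size=5000)
logits = model(torch.tensor([3]), torch.tensor([[1, 4, 7]]))
```

Here the context embedding plays the role of the continuous semantic representation; other encoders (e.g. of richer context features) could replace the embedding lookup without changing the decoder.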
