Context-aware Natural Language Generation with Recurrent Neural Networks

29 Nov 2016 · Jian Tang, Yifan Yang, Sam Carton, Ming Zhang, Qiaozhu Mei

This paper studies generating natural language text conditioned on particular contexts or situations. We propose two novel approaches that encode the context into a continuous semantic representation and then decode that representation into a text sequence with recurrent neural networks.
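The encode-then-decode idea above can be sketched minimally: a context vector initializes the hidden state of an RNN decoder, which then greedily emits tokens. This is an illustrative assumption of the general architecture, not the paper's exact models; all sizes, weights, and token conventions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, HID, CTX = 20, 16, 8  # assumed vocabulary, hidden, and context sizes
EOS = 0                      # assumed end-of-sequence token id

# Decoder parameters (randomly initialized for this sketch)
W_ch = rng.normal(0, 0.1, (HID, CTX))    # context -> initial hidden state
W_hh = rng.normal(0, 0.1, (HID, HID))    # hidden -> hidden recurrence
W_xh = rng.normal(0, 0.1, (HID, VOCAB))  # one-hot input token -> hidden
W_hy = rng.normal(0, 0.1, (VOCAB, HID))  # hidden -> vocabulary scores

def decode(context, max_len=10):
    """Greedily decode a token sequence conditioned on a context vector."""
    h = np.tanh(W_ch @ context)           # the context sets the initial state
    x = np.zeros(VOCAB); x[EOS] = 1.0     # start symbol (EOS by convention)
    tokens = []
    for _ in range(max_len):
        h = np.tanh(W_hh @ h + W_xh @ x)  # vanilla RNN step
        y = W_hy @ h                      # unnormalized scores over the vocab
        t = int(np.argmax(y))             # greedy token choice
        if t == EOS:
            break
        tokens.append(t)
        x = np.zeros(VOCAB); x[t] = 1.0   # feed the choice back as input
    return tokens

context = rng.normal(0, 1.0, CTX)         # stand-in for an encoded context
print(decode(context))
```

In a trained model the context encoder and decoder weights would be learned jointly, and sampling or beam search would typically replace the greedy argmax.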
