Machine Comprehension by Text-to-Text Neural Question Generation

WS 2017 · Xingdi Yuan, Tong Wang, Caglar Gulcehre, Alessandro Sordoni, Philip Bachman, Sandeep Subramanian, Saizheng Zhang, Adam Trischler

We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning...
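The combination of supervised and reinforcement learning mentioned above can be sketched as a mixed objective: a teacher-forcing negative log-likelihood term on the reference question, plus a REINFORCE term on a sampled question scored by a reward. The sketch below is a hypothetical toy illustration, not the paper's implementation: the per-step distributions, the reward, and the mixing weight `lam` are all placeholder assumptions (in the paper, the distributions come from a recurrent decoder and the rewards measure question quality).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 5-token vocabulary, 4 decoding steps.
# In the actual model these probabilities come from a recurrent decoder
# conditioned on the document and the answer.
vocab_size, steps = 5, 4
logits = rng.normal(size=(steps, vocab_size))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Supervised term: NLL of a (hypothetical) reference question.
reference = [1, 3, 0, 2]
nll = -sum(np.log(probs[t, tok]) for t, tok in enumerate(reference))

# RL term: sample a question and score it with a reward. The reward here
# is a toy placeholder; the paper uses rewards reflecting question quality.
sampled = [int(rng.choice(vocab_size, p=probs[t])) for t in range(steps)]
reward = 1.0 if sampled[0] == reference[0] else 0.0
log_p_sample = sum(np.log(probs[t, tok]) for t, tok in enumerate(sampled))
rl_loss = -reward * log_p_sample  # REINFORCE surrogate loss

# Mixed objective: interpolate the supervised and RL losses.
lam = 0.7  # hypothetical mixing weight
loss = lam * nll + (1 - lam) * rl_loss
print(round(float(loss), 4))
```

Minimizing `loss` with gradient descent would push probability mass toward the reference question (via the NLL term) while also up-weighting sampled questions that earn reward (via the REINFORCE term).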

