Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks

13 Apr 2020  ·  Yu Chen, Lingfei Wu, Mohammed J. Zaki

Knowledge graph (KG) question generation (QG) aims to generate natural language questions from a KG and target answers. Previous works mostly focus on a simple setting, namely generating questions from a single KG triple. In this work, we address a more realistic setting in which questions are generated from a KG subgraph and target answers. In addition, most previous works rely on RNN-based or Transformer-based models to encode a linearized KG subgraph, which discards the explicit structural information of the subgraph. To address this issue, we propose to apply a bidirectional Graph2Seq model to encode the KG subgraph. Furthermore, we enhance our RNN decoder with a node-level copying mechanism that allows node attributes to be copied directly from the KG subgraph into the output question. Both automatic and human evaluation results demonstrate that our model achieves new state-of-the-art scores, outperforming existing methods by a significant margin on two QG benchmarks. Experimental results also show that our QG model can consistently benefit the Question Answering (QA) task as a means of data augmentation.
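The abstract couples two ideas: a bidirectional graph encoder that reads the KG subgraph along both edge directions, and an RNN decoder with a node-level copy mechanism over the encoded nodes. The sketch below is only an illustration of those two ideas in PyTorch, not the authors' implementation: the class names (BiGNNLayer, CopyDecoderStep), the mean-pooling aggregation, and the single sigmoid copy gate are simplifying assumptions; the paper's actual bidirectional Graph2Seq encoder and copy formulation may differ.

```python
# Illustrative sketch (assumptions noted above), not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiGNNLayer(nn.Module):
    """One bidirectional message-passing layer over a directed KG subgraph."""

    def __init__(self, dim):
        super().__init__()
        self.w_fwd = nn.Linear(dim, dim)    # aggregate along outgoing edges
        self.w_bwd = nn.Linear(dim, dim)    # aggregate along incoming edges
        self.fuse = nn.Linear(3 * dim, dim)

    def forward(self, h, adj):
        # h:   (num_nodes, dim) node states
        # adj: (num_nodes, num_nodes), adj[i, j] = 1 for a directed edge i -> j
        deg_out = adj.sum(dim=1, keepdim=True).clamp(min=1)
        deg_in = adj.t().sum(dim=1, keepdim=True).clamp(min=1)
        m_fwd = self.w_fwd(adj @ h / deg_out)     # mean over successors
        m_bwd = self.w_bwd(adj.t() @ h / deg_in)  # mean over predecessors
        return torch.relu(self.fuse(torch.cat([h, m_fwd, m_bwd], dim=-1)))


class CopyDecoderStep(nn.Module):
    """One GRU decoder step mixing vocabulary generation with node-level copying."""

    def __init__(self, dim, vocab_size):
        super().__init__()
        self.gru = nn.GRUCell(dim, dim)
        self.attn = nn.Linear(dim, dim, bias=False)
        self.gen = nn.Linear(2 * dim, vocab_size)
        self.copy_gate = nn.Linear(2 * dim, 1)

    def forward(self, y_prev, s_prev, node_h):
        # y_prev: (1, dim) embedding of the previously generated token
        # s_prev: (1, dim) previous decoder hidden state
        # node_h: (num_nodes, dim) node embeddings from the graph encoder
        s = self.gru(y_prev, s_prev)                              # (1, dim)
        scores = node_h @ self.attn(s).squeeze(0)                 # (num_nodes,)
        copy_dist = F.softmax(scores, dim=0)                      # attention doubles as copy scores
        ctx = copy_dist.unsqueeze(0) @ node_h                     # (1, dim) graph context
        feat = torch.cat([s, ctx], dim=-1)                        # (1, 2 * dim)
        gen_dist = F.softmax(self.gen(feat), dim=-1).squeeze(0)   # (vocab_size,)
        p_copy = torch.sigmoid(self.copy_gate(feat)).squeeze()    # scalar copy gate
        # Final distribution over [vocabulary tokens ; copyable graph nodes].
        return torch.cat([(1 - p_copy) * gen_dist, p_copy * copy_dist]), s


if __name__ == "__main__":
    dim, vocab_size, num_nodes = 64, 1000, 5
    h = torch.randn(num_nodes, dim)                        # initial node-attribute embeddings
    adj = torch.zeros(num_nodes, num_nodes)
    adj[0, 1] = adj[1, 2] = adj[2, 3] = adj[3, 4] = 1.0    # a toy 5-node path subgraph
    for layer in [BiGNNLayer(dim), BiGNNLayer(dim)]:       # two message-passing hops
        h = layer(h, adj)
    step = CopyDecoderStep(dim, vocab_size)
    dist, state = step(torch.randn(1, dim), torch.zeros(1, dim), h)
    print(dist.shape)  # torch.Size([1005]) = 1000 vocab entries + 5 copyable nodes
```

Concatenating the generation and copy distributions mirrors the common pointer-generator design: at each step the gate p_copy decides whether to emit a vocabulary word or to copy a node attribute from the subgraph, which is the role the abstract assigns to the node-level copying mechanism.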

Task                     Dataset        Model      Metric   Value   Global Rank
KG-to-Text Generation    PathQuestion   SOTA-NPT   BLEU     61.48   # 3
                                                   METEOR   44.57   # 5
                                                   ROUGE    77.72   # 3
KG-to-Text Generation    WebQuestions   SOTA-NPT   BLEU     29.45   # 3
                                                   METEOR   30.96   # 4
                                                   ROUGE    55.45   # 2
