Linguistic Knowledge as Memory for Recurrent Neural Networks

7 Mar 2017 · Bhuwan Dhingra • Zhilin Yang • William W. Cohen • Ruslan Salakhutdinov

External linguistic knowledge is used to augment a sequence with typed edges between arbitrarily distant elements, and the resulting graph is decomposed into directed acyclic subgraphs. We introduce a model that encodes such graphs as explicit memory in recurrent neural networks, and use it to model coreference relations in text. On the bAbI QA tasks, our model solves 15 of the 20 tasks with only 1000 training examples per task.
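To make the mechanism concrete, the sketch below (a minimal PyTorch illustration, not the authors' released implementation) shows one way a GRU-style cell can treat hidden states reached over typed backward edges as explicit memory: each edge type gets its own projection, and the aggregated memory replaces the usual previous hidden state. The class name, sum aggregation, and edge-type encoding are assumptions for illustration; the paper's model uses per-edge gating that this simplification omits.

```python
# Minimal sketch of an RNN whose memory at each step is gathered from
# typed backward edges (e.g., a sequential edge plus coreference links),
# so the unrolled computation graph is a DAG. Hypothetical, simplified.
import torch
import torch.nn as nn

class TypedEdgeGRU(nn.Module):
    def __init__(self, input_size, hidden_size, num_edge_types):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        # One projection per edge type, so e.g. coreference edges can be
        # weighted differently from the ordinary sequential edge.
        self.edge_proj = nn.ModuleList(
            nn.Linear(hidden_size, hidden_size, bias=False)
            for _ in range(num_edge_types)
        )
        self.hidden_size = hidden_size

    def forward(self, inputs, edges):
        # inputs: (seq_len, input_size) tensor.
        # edges[t]: list of (source_position, edge_type) pairs with
        # source_position < t, so dependencies only point backwards.
        hs = []
        for t, x_t in enumerate(inputs):
            # Aggregate memory from all typed predecessors (sum is an
            # assumption; the paper gates each edge's contribution).
            agg = torch.zeros(self.hidden_size)
            for src, etype in edges[t]:
                agg = agg + self.edge_proj[etype](hs[src])
            h_t = self.cell(x_t.unsqueeze(0), agg.unsqueeze(0)).squeeze(0)
            hs.append(h_t)
        return torch.stack(hs)
```

A call might look like `TypedEdgeGRU(50, 64, 2)(torch.randn(10, 50), edges)`, where `edges[t]` lists backward (source, type) pairs; labeling type 0 as the sequential edge and type 1 as a coreference link is a hypothetical convention for this sketch.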

Full paper

Evaluation


| Task | Dataset | Model | Metric name | Metric value | Global rank |
| --- | --- | --- | --- | --- | --- |
| Question Answering | CNN / Daily Mail | GA+MAGE (32) | CNN | 78.6 | #1 |