A simple neural network module for relational reasoning

Relational reasoning is a central component of generally intelligent behavior, but has proven difficult for neural networks to learn. In this paper we describe how to use Relation Networks (RNs) as a simple plug-and-play module to solve problems that fundamentally hinge on relational reasoning. We tested RN-augmented networks on three tasks: visual question answering using a challenging dataset called CLEVR, on which we achieve state-of-the-art, super-human performance; text-based question answering using the bAbI suite of tasks; and complex reasoning about dynamic physical systems. Then, using a curated dataset called Sort-of-CLEVR, we show that powerful convolutional networks do not have a general capacity to solve relational questions, but can gain this capacity when augmented with RNs. Our work shows how a deep learning architecture equipped with an RN module can implicitly discover and learn to reason about entities and their relations.

Published at NeurIPS 2017.
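For readers unfamiliar with the module named in the abstract: in the paper, an RN is a composite function of the form RN(O) = f_phi( sum over pairs (i, j) of g_theta(o_i, o_j) ), where O is a set of "object" feature vectors and g_theta, f_phi are MLPs applied to object pairs and to the pooled pair representations, respectively. The following is a minimal, illustrative NumPy sketch of that pairwise formulation; the object sizes and the tanh/identity placeholders standing in for g_theta and f_phi are assumptions made here for demonstration, not the paper's trained networks.

    import numpy as np

    def relation_network(objects, g_theta, f_phi):
        # Apply g_theta to every ordered pair of object vectors, sum the
        # pair outputs, and pass the pooled result through f_phi.
        pair_outputs = [g_theta(np.concatenate([o_i, o_j]))
                        for o_i in objects for o_j in objects]
        return f_phi(np.sum(pair_outputs, axis=0))

    # Toy usage: 5 objects with 4 features each; tanh and the identity
    # stand in for the learned MLPs.
    rng = np.random.default_rng(0)
    objects = [rng.standard_normal(4) for _ in range(5)]
    g_theta = lambda pair: np.tanh(pair)     # placeholder for the MLP g_theta
    f_phi = lambda pooled: pooled            # placeholder for the MLP f_phi
    print(relation_network(objects, g_theta, f_phi).shape)  # -> (8,)

In the paper itself, the objects are not hand-designed: for CLEVR they are cells of a CNN feature map (optionally conditioned on an LSTM question embedding), which is what the CNN + LSTM + RN entry in the results table below refers to.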
Task                                   | Dataset     | Model           | Metric    | Value | Global Rank
Visual Question Answering (VQA)        | CLEVR       | CNN + LSTM + RN | Accuracy  | 95.50 | #13
Image Retrieval with Multi-Modal Query | Fashion200k | Relationship    | Recall@1  | 13    | #5
Image Retrieval with Multi-Modal Query | Fashion200k | Relationship    | Recall@10 | 40.5  | #5
Image Retrieval with Multi-Modal Query | Fashion200k | Relationship    | Recall@50 | 62.4  | #5
