Neural Message Passing for Quantum Chemistry

Supervised learning on molecules has incredible potential to be useful in chemistry, drug discovery, and materials science. Luckily, several promising and closely related neural network models invariant to molecular symmetries have already been described in the literature. These models learn a message passing algorithm and aggregation procedure to compute a function of their entire input graph. At this point, the next step is to find a particularly effective variant of this general approach and apply it to chemical prediction benchmarks until we either solve them or reach the limits of the approach. In this paper, we reformulate existing models into a single common framework we call Message Passing Neural Networks (MPNNs) and explore additional novel variations within this framework. Using MPNNs we demonstrate state of the art results on an important molecular property prediction benchmark; these results are strong enough that we believe future work should focus on datasets with larger molecules or more accurate ground truth labels.
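The abstract describes the MPNN framework as two phases: a message passing phase, where each node aggregates messages from its neighbors and updates its hidden state, and a readout phase that computes a permutation-invariant function of all node states. A minimal toy sketch of that structure in NumPy (hypothetical weights and update rule; this is not the paper's specific variant, which uses learned edge networks and a more elaborate readout):

```python
import numpy as np

def mpnn_forward(node_feats, edges, num_steps=3, seed=0):
    """Toy MPNN sketch: message phase + update + sum readout.

    node_feats: (num_nodes, d) array of initial node features.
    edges: list of undirected (v, w) index pairs.
    Weights are random placeholders standing in for learned parameters.
    """
    rng = np.random.default_rng(seed)
    d = node_feats.shape[1]
    W_msg = 0.1 * rng.standard_normal((d, d))  # message function weights (assumed)
    W_upd = 0.1 * rng.standard_normal((d, d))  # update function weights (assumed)
    h = node_feats.copy()
    for _ in range(num_steps):
        m = np.zeros_like(h)
        for (v, w) in edges:
            m[v] += h[w] @ W_msg  # message from w to v
            m[w] += h[v] @ W_msg  # message from v to w (undirected)
        h = np.tanh(h @ W_upd + m)  # vertex update
    return h.sum(axis=0)  # sum readout: invariant to node relabeling

# Usage: a 3-node triangle "molecule" with 2-dimensional node features.
x = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
out = mpnn_forward(x, [(0, 1), (1, 2), (0, 2)])
```

Because the message and update weights are shared across nodes and the readout is a sum, relabeling the nodes (and their edges consistently) leaves the output unchanged, which is the molecular-symmetry invariance the abstract refers to.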

Published at ICML 2017.

Results from the Paper


| Task                | Dataset                                          | Model      | Metric      | Value | Global Rank |
|---------------------|--------------------------------------------------|------------|-------------|-------|-------------|
| Node Classification | CiteSeer (Public Split: fixed 20 nodes per class) | MPNN       | Accuracy    | 64.0  | #39         |
| Graph Regression    | Lipophilicity                                    | MPNN       | RMSE        | 0.719 | #5          |
| Formation Energy    | QM9                                              | MPNN       | MAE         | 0.49  | #17         |
| Drug Discovery      | QM9                                              | MPNNs      | Error ratio | 0.68  | #9          |
| Graph Regression    | ZINC 100k                                        | MPNN       | MAE         | 0.288 | #5          |
| Graph Regression    | ZINC-500k                                        | MPNN (sum) | MAE         | 0.145 | #22         |
| Graph Regression    | ZINC-500k                                        | MPNN (max) | MAE         | 0.252 | #25         |

Methods