Simple GNN Regularisation for 3D Molecular Property Prediction and Beyond

Graph Neural Networks (GNNs) have proven effective across a wide range of molecular property prediction and structured learning problems. However, their performance is known to be hindered by practical challenges such as oversmoothing. We introduce "Noisy Nodes", a very simple technique for improved training of GNNs, in which we corrupt the input graph with noise and add a noise-correcting node-level loss. Adding noise helps combat overfitting, and the noise-correction loss helps ameliorate oversmoothing by encouraging diverse node latents. Our regulariser applies well-studied methods in simple, straightforward ways, allowing even generic architectures not designed for quantum chemistry to achieve state-of-the-art results. We also demonstrate the effectiveness of Noisy Nodes with non-spatial architectures on Open Graph Benchmark (OGB) datasets. Our results suggest Noisy Nodes can serve as a complementary building block in the GNN toolkit for 3D molecular property prediction and beyond.
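The recipe described above — corrupt the input, then add an auxiliary node-level loss that asks the model to undo the corruption — can be sketched as a single training-step loss. This is a minimal illustration, not the paper's implementation: the model interface, the Gaussian noise scale `sigma`, and the weighting `lam` are assumptions for the sketch, and a real GNN would predict a graph-level property (e.g. energy) alongside per-node denoised positions.

```python
import numpy as np

def noisy_nodes_loss(model, positions, target, sigma=0.02, lam=1.0, rng=None):
    """One training-step loss for a Noisy Nodes-style regulariser (sketch).

    `model` is assumed to map noisy node positions to a pair
    (graph_level_prediction, per_node_denoised_positions).
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    # Corrupt the input graph: add Gaussian noise to every node position.
    noise = rng.normal(scale=sigma, size=positions.shape)
    pred, denoised = model(positions + noise)
    # Primary objective: error on the graph-level target (e.g. energy MAE).
    property_loss = np.abs(pred - target).mean()
    # Auxiliary node-level objective: recover the clean positions
    # (equivalently, predict the added noise), encouraging diverse node latents.
    denoise_loss = ((denoised - positions) ** 2).mean()
    return property_loss + lam * denoise_loss
```

In practice the denoising head is only used during training; at inference time the model is queried on clean inputs and the auxiliary head is discarded.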

Task: Initial Structure to Relaxed Energy (IS2RE), Direct
Dataset: OC20
Model: Noisy Nodes

Metric Name                   Metric Value   Global Rank
Validation Mean Energy MAE    0.48           #7
Test Mean Energy MAE          0.4728         #8
Test Mean EWT (%)             6.5            #5
