Transferable Multi-level Attention Neural Network for Accurate Prediction of Quantum Chemistry Properties via Multi-task Learning

30 Jun 2020 · Liqiang Lin, Qingqing Jia, Zheng Cheng, Yanyan Jiang, Yanwen Guo, Jing Ma

The development of efficient machine learning models for predicting specific properties is of great importance for innovation in chemistry and materials science. However, transferring predictions of electronic structure properties, such as the frontier molecular orbital (HOMO and LUMO) energy levels and the HOMO-LUMO gap, from small-molecule training data to larger molecules remains a challenge. Here we develop a multi-level attention strategy that fuses chemically interpretable insights into multi-task learning over up to 110,000 records in QM9 under random-split evaluation. Good transferability to larger molecules outside the training set is demonstrated on both the QM9 and Alchemy datasets. Efficient and accurate prediction of 12 properties, including the dipole moment, HOMO energy, and Gibbs free energy, within chemical accuracy is achieved with our specifically designed interpretable multi-level attention neural network, named DeepMoleNet. Remarkably, the present multi-task deep learning model adopts the atom-centered symmetry functions (ACSFs) descriptor as one of the prediction targets, rather than using ACSFs as input in the conventional way. The proposed multi-level attention neural network is applicable to high-throughput screening of numerous chemical species to accelerate the rational design of drugs, materials, and chemical reactions.
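
The key architectural idea can be illustrated with a short, self-contained sketch: a shared encoder feeds two output heads, one regressing the 12 molecular properties and one reproducing the per-atom ACSF descriptors, so that ACSFs serve as an auxiliary prediction target rather than an input. This is a minimal PyTorch sketch under simplifying assumptions, not the authors' DeepMoleNet implementation; the encoder stands in for the multi-level attention blocks, and the layer sizes, names (MultiTaskNet, multitask_loss), and loss weight alpha are illustrative only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskNet(nn.Module):
    """Toy multi-task model: property head plus auxiliary ACSF head (illustrative only)."""

    def __init__(self, atom_feat_dim=32, hidden_dim=128, n_properties=12, acsf_dim=51):
        super().__init__()
        # Shared atom-wise encoder; in the paper this role is played by the
        # multi-level attention / message-passing blocks.
        self.encoder = nn.Sequential(
            nn.Linear(atom_feat_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
        )
        self.property_head = nn.Linear(hidden_dim, n_properties)  # molecule-level targets
        self.acsf_head = nn.Linear(hidden_dim, acsf_dim)          # per-atom ACSF targets

    def forward(self, atom_feats):
        # atom_feats: (n_atoms, atom_feat_dim) for a single molecule
        h = self.encoder(atom_feats)
        props = self.property_head(h).sum(dim=0)  # simple sum pooling over atoms
        acsf_pred = self.acsf_head(h)             # predicted descriptors, one row per atom
        return props, acsf_pred


def multitask_loss(props_pred, props_true, acsf_pred, acsf_true, alpha=0.1):
    """Joint objective: property regression plus a weighted ACSF reconstruction term."""
    return F.l1_loss(props_pred, props_true) + alpha * F.mse_loss(acsf_pred, acsf_true)
```

Here the ACSF term simply acts as an auxiliary supervision signal on the shared representation; how DeepMoleNet actually weights and combines its tasks is detailed in the paper itself.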

Datasets

QM9, Alchemy

Results from the Paper


Task              Dataset   Model         Metric        Value    Global Rank
Formation Energy  QM9       DeepMoleNet   MAE           0.141    #7
Drug Discovery    QM9       DeepMoleNet   Error ratio   0.531    #8
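
For readers reproducing the numbers above, a common convention for a QM9 "Error ratio" metric is the per-property MAE divided by a target chemical-accuracy threshold, averaged over properties; whether this leaderboard uses exactly that definition is an assumption here, and the thresholds and property names in the sketch below are illustrative only.

```python
import numpy as np

# Hypothetical chemical-accuracy targets per property (units follow QM9 conventions);
# the benchmark's actual thresholds may differ.
CHEM_ACC = {"mu": 0.10, "homo": 0.043, "lumo": 0.043, "gap": 0.043}


def mae(pred, true):
    """Mean absolute error over a set of predictions."""
    return float(np.mean(np.abs(np.asarray(pred) - np.asarray(true))))


def error_ratio(mae_by_property, thresholds=CHEM_ACC):
    """Average of per-property MAE divided by its chemical-accuracy target."""
    ratios = [mae_by_property[p] / thresholds[p] for p in mae_by_property]
    return sum(ratios) / len(ratios)
```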

Methods


No methods listed for this paper.