TeaNet: universal neural network interatomic potential inspired by iterative electronic relaxations

2 Dec 2019  ·  So Takamoto, Satoshi Izumi, Ju Li

A universal interatomic potential for an arbitrary set of chemical elements is urgently needed in computational materials science. The graph convolutional neural network (GCN) has rich expressive power, but was previously employed mainly to transport scalars and vectors, not rank $\ge 2$ tensors. As classic interatomic potentials were inspired by the tight-binding electronic relaxation framework, we want to represent this iterative propagation of rank $\ge 2$ tensor information by a GCN. Here we propose an architecture called the tensor embedded atom network (TeaNet), in which angular interactions are translated into graph convolutions through the incorporation of Euclidean tensors, vectors, and scalars. By applying the residual network (ResNet) architecture and training with recurrent GCN weight initialization, we constructed a much deeper (16-layer) GCN whose flow is similar to an iterative electronic relaxation. Our training dataset is generated by density functional theory calculations of mostly chemically and structurally randomized configurations. We demonstrate that arbitrary structures and reactions involving the first 18 elements of the periodic table (H to Ar) can be described satisfactorily by TeaNet, including C-H molecular structures, metals, amorphous SiO${}_2$, and water, showing surprisingly good performance (energy mean absolute error 19 meV/atom) and robustness across arbitrary chemistries in this range.
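To make the idea of propagating scalar, vector, and rank $\ge 2$ tensor features through a residual graph convolution more concrete, below is a minimal PyTorch sketch of one such message-passing step. This is not the authors' implementation: the class name `TensorMixingLayer`, the feature shapes, the specific mixing rule (bond-direction outer products), and the residual update are illustrative assumptions consistent with the abstract, not the paper's exact equations.

```python
# Minimal illustrative sketch (not the authors' code) of one TeaNet-style
# graph-convolution step that carries scalar, vector, and rank-2 tensor
# features per atom and applies a ResNet-style residual update.
import torch
import torch.nn as nn


class TensorMixingLayer(nn.Module):
    """One residual message-passing step.

    Feature shapes (assumed for illustration):
      s: [N, F]        scalar features per atom
      v: [N, F, 3]     vector features per atom
      t: [N, F, 3, 3]  rank-2 tensor features per atom
    """

    def __init__(self, features: int):
        super().__init__()
        self.lin_s = nn.Linear(features, features)
        # No bias on vector/tensor channels, so the maps stay rotation-equivariant.
        self.lin_v = nn.Linear(features, features, bias=False)
        self.lin_t = nn.Linear(features, features, bias=False)

    def forward(self, s, v, t, edge_index, unit_vec):
        # edge_index: [2, E] source/target atom indices
        # unit_vec:   [E, 3] normalized bond directions r_ij / |r_ij|
        src, dst = edge_index

        # Scalar messages from the source atoms.
        m_s = self.lin_s(s)[src]                                     # [E, F]
        # Vector messages: scalar features emitted along the bond direction.
        m_v = m_s.unsqueeze(-1) * unit_vec.unsqueeze(1)              # [E, F, 3]
        # Rank-2 tensor messages: outer products of bond directions encode
        # angular information without explicit three-body loops.
        outer = unit_vec.unsqueeze(-1) * unit_vec.unsqueeze(-2)      # [E, 3, 3]
        m_t = m_s.unsqueeze(-1).unsqueeze(-1) * outer.unsqueeze(1)   # [E, F, 3, 3]

        # Sum messages onto the destination atoms.
        s_agg = torch.zeros_like(s).index_add_(0, dst, m_s)
        v_agg = torch.zeros_like(v).index_add_(0, dst, m_v)
        t_agg = torch.zeros_like(t).index_add_(0, dst, m_t)

        # Channel mixing acts on the feature axis only, preserving equivariance.
        v_agg = self.lin_v(v_agg.transpose(-1, -2)).transpose(-1, -2)
        t_agg = self.lin_t(t_agg.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)

        # Residual (ResNet-style) update; stacking ~16 such layers mimics an
        # iterative relaxation of the per-atom features.
        return s + torch.tanh(s_agg), v + v_agg, t + t_agg
```

In such a scheme the same layer (or layers initialized from a recurrently trained one) can be stacked deeply, so the repeated updates of the scalar/vector/tensor features play the role of the iterative electronic relaxation described in the abstract.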
