Molecular Graph Enhanced Transformer for Retrosynthesis Prediction

25 Sep 2019 · Kelong Mao, Peilin Zhao, Tingyang Xu, Yu Rong, Xi Xiao, Junzhou Huang

With the enormous number of possible synthetic routes in chemistry, retrosynthesis prediction remains a challenge for researchers. Recently, retrosynthesis prediction has been formulated as a Machine Translation (MT) task: since each molecule can be represented as a Simplified Molecular-Input Line-Entry System (SMILES) string, retrosynthesis is analogized to a language translation from a product to its reactants. However, MT models applied to SMILES data usually ignore the natural atomic connectivity and topology of molecules. In this paper, we propose a Graph Enhanced Transformer (GET) framework, which exploits both the sequential and graph-structured information of molecules. Four GET designs are proposed, each fusing SMILES representations with atom embeddings learned by our improved Graph Neural Network (GNN). Empirical results show that our model significantly outperforms the Transformer model in test accuracy.
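
The sketch below illustrates the general idea of fusing SMILES token embeddings with GNN atom embeddings before a Transformer encoder. It is not the authors' code: the module names, the simple mean-aggregation GNN layer, the additive fusion via an `atom_to_token` alignment matrix, and all hyperparameters are illustrative assumptions, and the paper's four GET designs are not reproduced here.

```python
# Minimal sketch (assumed, not the paper's implementation) of a graph-enhanced
# Transformer encoder: GNN atom embeddings are added to SMILES token embeddings.
import torch
import torch.nn as nn


class SimpleGNNLayer(nn.Module):
    """One round of mean-aggregation message passing over the molecular graph."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, atom_feats, adj):
        # atom_feats: (B, A, D); adj: (B, A, A), assumed to include self-loops.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        msg = adj @ atom_feats / deg                      # average neighbour features
        return torch.relu(self.linear(msg))


class GraphEnhancedEncoder(nn.Module):
    """Fuse SMILES token embeddings with GNN atom embeddings, then run a Transformer encoder."""

    def __init__(self, vocab_size, num_atom_types, dim=64, nhead=4, num_layers=2):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)
        self.atom_emb = nn.Embedding(num_atom_types, dim)
        self.gnn = SimpleGNNLayer(dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, smiles_tokens, atom_types, adj, atom_to_token):
        # smiles_tokens: (B, L) SMILES token ids
        # atom_types:    (B, A) per-atom type ids
        # adj:           (B, A, A) molecular adjacency matrix with self-loops
        # atom_to_token: (B, L, A) 0/1 alignment of atoms to SMILES positions
        #                (how this alignment is built is an assumption of the sketch)
        tok = self.token_emb(smiles_tokens)               # (B, L, D)
        atoms = self.gnn(self.atom_emb(atom_types), adj)  # (B, A, D)
        fused = tok + atom_to_token @ atoms               # place atom features on their tokens
        return self.encoder(fused)                        # (B, L, D)


if __name__ == "__main__":
    # Toy shapes only; real use would tokenize SMILES and build the graph with e.g. RDKit.
    B, L, A = 2, 12, 6
    model = GraphEnhancedEncoder(vocab_size=40, num_atom_types=20)
    out = model(
        torch.randint(0, 40, (B, L)),
        torch.randint(0, 20, (B, A)),
        torch.eye(A).expand(B, A, A),
        torch.zeros(B, L, A),
    )
    print(out.shape)  # torch.Size([2, 12, 64])
```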

Datasets

USPTO-50k

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Single-step retrosynthesis | USPTO-50k | GET-LT1 | Top-1 accuracy | 44.9 | #17 |
| Single-step retrosynthesis | USPTO-50k | GET-LT1 | Top-3 accuracy | 58.8 | #15 |
| Single-step retrosynthesis | USPTO-50k | GET-LT1 | Top-5 accuracy | 62.4 | #16 |
| Single-step retrosynthesis | USPTO-50k | GET-LT1 | Top-10 accuracy | 65.9 | #16 |

Methods

Transformer, Graph Neural Network (GNN)