no code implementations • 2 Oct 2020 • Ruud Van Deursen, Igor V. Tetko, Guillaume Godin
Interestingly, we observed no degradation in the performance of Transformer-CNN models when stereochemical information was absent from the SMILES.
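For intuition, removing stereochemical information from SMILES means dropping the chirality and double-bond geometry markers. The sketch below is a rough string-level illustration, not the paper's procedure; robust, chemistry-aware handling would use a cheminformatics toolkit such as RDKit.

```python
import re

def strip_stereo(smiles: str) -> str:
    """Remove stereochemical markers from a SMILES string.

    A simplistic illustration: drops tetrahedral chirality markers
    (@, @@) and cis/trans bond markers (/ and \\). Not chemistry-aware.
    """
    smiles = re.sub(r"@{1,2}", "", smiles)              # tetrahedral chirality
    smiles = smiles.replace("/", "").replace("\\", "")  # double-bond geometry
    return smiles

print(strip_stereo("N[C@@H](C)C(=O)O"))  # L-alanine -> N[CH](C)C(=O)O
print(strip_stereo("C/C=C\\C"))          # cis-2-butene -> CC=CC
```

The resulting strings describe the same connectivity without 3D configuration, which is the information the Transformer-CNN models apparently did not need.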
1 code implementation • 5 Mar 2020 • Igor V. Tetko, Pavel Karpov, Ruud Van Deursen, Guillaume Godin
We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds, using a text-like representation of chemical reactions (SMILES) and the Transformer neural network architecture from Natural Language Processing.
Ranked #7 on Single-step retrosynthesis on USPTO-50k
no code implementations • 29 Oct 2019 • Fabio Capela, Vincent Nouchi, Ruud Van Deursen, Igor V. Tetko, Guillaume Godin
Prediction of molecular properties, including physico-chemical properties, is a challenging task in chemistry.
no code implementations • 24 Sep 2019 • Ruud van Deursen, Guillaume Godin
The generative models are evaluated for overall performance and for reconstruction of the property space.
no code implementations • 10 Sep 2019 • Ruud van Deursen, Peter Ertl, Igor V. Tetko, Guillaume Godin
In this study, we introduce a new robust architecture, Generative Examination Networks (GEN), based on bidirectional RNNs with concatenated sub-models, to learn and generate molecular SMILES within a trained target space.