no code implementations • 2 Oct 2020 • Ruud Van Deursen, Igor V. Tetko, Guillaume Godin
Interestingly, we observed no degradation in the performance of Transformer-CNN models when stereochemical information was absent from the SMILES.
1 code implementation • 5 Mar 2020 • Igor V. Tetko, Pavel Karpov, Ruud Van Deursen, Guillaume Godin
We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds, using a text-like representation of chemical reactions (SMILES) and the Transformer neural network architecture from Natural Language Processing.
Ranked #6 on Single-step retrosynthesis on USPTO-50k
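Treating reactions as text means a SMILES string must first be split into chemically meaningful tokens before it can be fed to a Transformer. The paper does not specify its tokenizer here, so the sketch below uses a regex pattern commonly applied to SMILES in sequence models; the exact pattern and function name are assumptions, not the authors' implementation.

```python
import re

# Commonly used SMILES tokenization regex (assumed, not the paper's exact one):
# multi-character tokens such as Cl, Br, and bracket atoms like [NH4+] are kept whole.
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|"
    r"\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>|\*|\$|%\d{2}|\d)"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into tokens for a text-based sequence model."""
    return SMILES_TOKEN_PATTERN.findall(smiles)

print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin
```

Each token then maps to an embedding index, after which the sequence is handled like any sentence in machine translation.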
no code implementations • 29 Oct 2019 • Fabio Capela, Vincent Nouchi, Ruud Van Deursen, Igor V. Tetko, Guillaume Godin
Prediction of molecular properties, including physico-chemical properties, is a challenging task in chemistry.
1 code implementation • 21 Oct 2019 • Pavel Karpov, Guillaume Godin, Igor V. Tetko
Because both the augmentation and the transfer learning are based on embeddings, the method provides good results for small datasets.
no code implementations • 24 Sep 2019 • Ruud van Deursen, Guillaume Godin
The generative models are evaluated for overall performance and for reconstruction of the property space.
no code implementations • 10 Sep 2019 • Ruud van Deursen, Peter Ertl, Igor V. Tetko, Guillaume Godin
In this study, we introduce a new robust architecture, Generative Examination Networks (GEN), based on bidirectional RNNs with concatenated sub-models, to learn and generate molecular SMILES with a trained target space.
no code implementations • 11 Dec 2018 • Talia B. Kimber, Sebastian Engelke, Igor V. Tetko, Eric Bruno, Guillaume Godin
In our study, we demonstrate the synergistic effect between convolutional neural networks and the multiplicity of SMILES representations.
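The "multiplicity of SMILES" refers to the fact that one molecule admits many valid SMILES spellings, which can be exploited by averaging a model's predictions over several of them. The sketch below shows only that averaging step; the helper name is hypothetical, the `predict` callable stands in for a trained CNN, and generating the SMILES variants themselves (e.g. via a cheminformatics toolkit) is outside this snippet.

```python
from statistics import mean

def predict_with_smiles_augmentation(variants, predict):
    """Average a model's predictions over several SMILES spellings of the
    same molecule (hypothetical helper; `predict` maps a SMILES string to
    a scalar property prediction)."""
    return mean(predict(s) for s in variants)

# Toy stand-in "model": string length, used only to demonstrate the averaging.
print(predict_with_smiles_augmentation(["CCO", "OCC", "C(O)C"], len))
```

In practice the same averaging can be applied at training time (each variant is a distinct training sample) and at inference time, which is where the synergy with the CNN arises.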