Search Results for author: Guillaume Godin

Found 7 papers, 2 papers with code

Beyond Chemical 1D knowledge using Transformers

no code implementations • 2 Oct 2020 • Ruud Van Deursen, Igor V. Tetko, Guillaume Godin

Interestingly, we saw no degradation in the performance of Transformer-CNN models when stereochemical information was absent from the SMILES.
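A minimal sketch of what "removing stereochemical information from SMILES" means at the string level: chirality is marked with `@` and double-bond geometry with `/` and `\`. The function name and the purely character-level approach are illustrative assumptions, not the paper's code; a robust pipeline would use a cheminformatics toolkit such as RDKit.

```python
def strip_stereo(smiles: str) -> str:
    """Naively remove stereochemical markers from a SMILES string.

    '@' marks tetrahedral chirality; '/' and '\\' mark double-bond
    geometry. This character-level sketch is for illustration only.
    """
    return smiles.translate(str.maketrans("", "", "@/\\"))

# e.g. L-alanine with a chirality marker
print(strip_stereo("C[C@@H](N)C(=O)O"))  # -> C[CH](N)C(=O)O
```

Note that this does not re-canonicalize the string; a toolkit-based approach would also regenerate a canonical SMILES after dropping the stereo flags.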

State-of-the-Art Augmented NLP Transformer models for direct and single-step retrosynthesis

1 code implementation • 5 Mar 2020 • Igor V. Tetko, Pavel Karpov, Ruud Van Deursen, Guillaume Godin

We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds using a text-like representation of chemical reactions (SMILES) and a Natural Language Processing (NLP) Transformer neural-network architecture.

Data Augmentation • Memorization • +2
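Treating reactions as text, as described above, requires splitting a SMILES string into tokens before it is fed to a Transformer. A common approach is a regex that keeps multi-character tokens (bracket atoms, `Cl`, `Br`, `@@`) whole; the exact pattern below is an assumption for illustration, and the paper's preprocessing may differ.

```python
import re

# Keep bracket atoms, two-letter halogens, and '@@' as single tokens;
# everything else is tokenized character by character.
SMILES_TOKEN_RE = re.compile(
    r"\[[^\]]+\]|Br|Cl|@@|[=#\-+\\/%()\.@:~\d]|[A-Za-z]"
)

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into Transformer-ready tokens."""
    return SMILES_TOKEN_RE.findall(smiles)

print(tokenize("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin
```

Tokenized sequences like these are then mapped to integer IDs via a vocabulary, exactly as words are in ordinary NLP pipelines.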

Transformer-CNN: Fast and Reliable tool for QSAR

1 code implementation • 21 Oct 2019 • Pavel Karpov, Guillaume Godin, Igor V. Tetko

Because both the augmentation and the transfer learning are based on embeddings, the method also provides good results for small datasets.

Transfer Learning

Deep Generative Model for Sparse Graphs using Text-Based Learning with Augmentation in Generative Examination Networks

no code implementations • 24 Sep 2019 • Ruud van Deursen, Guillaume Godin

The generative models are evaluated for overall performance and for reconstruction of the property space.

GEN: Highly Efficient SMILES Explorer Using Autodidactic Generative Examination Networks

no code implementations • 10 Sep 2019 • Ruud van Deursen, Peter Ertl, Igor V. Tetko, Guillaume Godin

In this study, we introduce a new robust architecture, Generative Examination Networks (GEN), based on bidirectional RNNs with concatenated sub-models, to learn and generate molecular SMILES with a trained target space.
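The "examination" idea implies that generated SMILES strings are screened for validity before acceptance. A minimal sketch of such a syntactic screen follows; the function name and the specific checks are illustrative assumptions, and a real examination stage would parse each candidate with a chemistry toolkit rather than count characters.

```python
def plausible_smiles(smiles: str) -> bool:
    """Rough syntactic screen for a generated SMILES string:
    balanced parentheses and brackets, paired ring-closure digits.
    Illustrative only; not a full SMILES parser.
    """
    if smiles.count("(") != smiles.count(")"):
        return False
    if smiles.count("[") != smiles.count("]"):
        return False
    # each ring-closure digit must open and close, i.e. appear an
    # even number of times (ignores %-prefixed two-digit closures)
    for digit in "123456789":
        if smiles.count(digit) % 2:
            return False
    return True

print(plausible_smiles("c1ccccc1"))  # -> True  (benzene)
print(plausible_smiles("CC(C"))      # -> False (unbalanced)
```

In a generative loop, candidates failing such a screen would simply be discarded before any property evaluation.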

