GEN: Highly Efficient SMILES Explorer Using Autodidactic Generative Examination Networks

12 September 2019, Version 1
This content is a preprint and has not undergone peer review at the time of posting.


Recurrent neural networks (RNNs) have been widely used to generate millions of de novo molecules within a known chemical space. These deep generative models are typically set up with LSTM or GRU units and trained on canonical SMILES. In this study, we introduce a new robust architecture, the Generative Examination Network (GEN), based on bidirectional RNNs with concatenated sub-models, to learn and generate molecular SMILES within a trained target space. GENs autonomously learn the target space in a few epochs while being subjected to an independent online examination that measures the quality of the generated set. Here we use online statistical quality control (SQC) on the percentage of valid molecular SMILES as the examination measure to select the earliest available stable model weights. Very high rates of valid SMILES (95-98%) can be generated by using multiple parallel encoding layers in combination with SMILES augmentation based on unrestricted SMILES randomization. Our architecture combines an excellent novelty rate (85-90%) with strong conservation of the property space (95-99%) in the generated SMILES. The examination mechanism is flexible and open to other quality criteria.
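The online examination described above can be sketched as a simple control-chart check on the percentage of valid SMILES measured per training epoch. The helper names (`percent_valid`, `sqc_stable_epoch`), the stub validity check, and the stability rule below are illustrative assumptions for the sketch, not the paper's exact SQC procedure:

```python
import statistics

def percent_valid(smiles_batch, is_valid):
    """Percentage of generated strings accepted by the validity check.

    `is_valid` is a caller-supplied predicate (in practice, a SMILES
    parser such as RDKit's); here it is kept abstract.
    """
    return 100.0 * sum(is_valid(s) for s in smiles_batch) / len(smiles_batch)

def sqc_stable_epoch(history, window=5, max_sigma=1.0):
    """Return the first epoch at which the last `window` percent-valid
    measurements are 'in control': their spread is small (pstdev <=
    max_sigma) and every point lies within max_sigma of their mean.
    Returns None if the generator never stabilizes.
    """
    for end in range(window, len(history) + 1):
        recent = history[end - window:end]
        mean = statistics.mean(recent)
        sigma = statistics.pstdev(recent)
        if sigma <= max_sigma and all(abs(x - mean) <= max_sigma for x in recent):
            return end - 1  # earliest stable epoch (0-indexed)
    return None

# Example: validity climbs during training, then plateaus around 95-96%.
history = [40.0, 60.0, 80.0, 95.0, 95.5, 96.0, 95.2, 95.8]
print(sqc_stable_epoch(history))  # earliest epoch whose trailing window is stable
```

Selecting the earliest stable epoch, rather than the best single measurement, matches the abstract's goal of picking the first model weights whose generation quality is statistically under control.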


Keywords: SMILES string representation; assessment measures; quality control mechanisms

