Abstract
The practical implementation of Deep Learning methods for chemistry applications relies on encoding chemical structures into machine-readable formats that can be efficiently processed by computational tools. To this end, One Hot Encoding (OHE) is an established method for representing alphanumeric categorical data as expanded numerical matrices. We have developed an embedded alternative to OHE that encodes discrete alphanumeric tokens of an N-sized alphabet into a few real numbers, yielding a simpler matrix representation of chemical structures. Using this embedded One Hot Encoding (eOHE) to train machine learning models achieves accuracy and robustness comparable to OHE while significantly reducing computational resource usage. Our benchmarks across three molecular representations (SMILES, DeepSMILES, and SELFIES) and three molecular databases (ZINC, QM9, and GDB-13) for Variational Autoencoders (VAE) and Recurrent Neural Networks (RNN) show that eOHE reduces RAM usage by up to 50% and increases disk Memory Reduction Efficiency (MRE) to an average of 80%. This encoding method opens new avenues for representing data in embedded formats that promote energy efficiency and scalable computing on resource-constrained devices or in scenarios with limited computing resources. The application of eOHE impacts not only chemistry but also other disciplines that rely on OHE.
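To make the contrast concrete, the sketch below illustrates the general idea in Python: standard OHE expands a tokenized SMILES string into a sequence-length-by-N matrix, whereas a compact numeric encoding stores roughly one real number per token. The alphabet, the toy SMILES string, and the index-scaling scheme shown here are illustrative assumptions, not the paper's exact eOHE formulation, which is detailed in the main text.

```python
import numpy as np

# Hypothetical illustration (not the paper's exact eOHE scheme): contrast a
# standard one-hot encoding of SMILES tokens with a compact numeric encoding
# that stores one scaled real number per token instead of an N-sized vector.

alphabet = ["C", "N", "O", "(", ")", "=", "1"]       # toy N-sized alphabet
index = {token: i for i, token in enumerate(alphabet)}

smiles_tokens = list("C(=O)N")                        # tokenized toy SMILES

# One Hot Encoding: sequence_length x N matrix, mostly zeros.
ohe = np.zeros((len(smiles_tokens), len(alphabet)))
for row, token in enumerate(smiles_tokens):
    ohe[row, index[token]] = 1.0

# Compact encoding: one real number per token (index scaled to [0, 1]).
compact = np.array([index[t] / (len(alphabet) - 1) for t in smiles_tokens])

print(ohe.shape)      # (6, 7) -> 42 stored values
print(compact.shape)  # (6,)   ->  6 stored values
```

The storage saving grows with the alphabet size N, which is the regime the reported RAM and disk reductions target.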
Supplementary materials
Electronic Supplementary Information: Embedded machine-readable molecular representation for resource-efficient deep learning applications