Transferable Multi-level Attention Neural Network for Accurate Prediction of Quantum Chemistry Properties via Multi-task Learning
The development of efficient machine learning models for predicting specific properties is of great importance for innovation in chemistry and materials science. However, transferring predictions of electronic structure properties, such as the frontier molecular orbital HOMO and LUMO energy levels and the HOMO-LUMO gap, from small-molecule training data to larger molecules remains a challenge. Here we develop a multi-level attention strategy that fuses chemically interpretable insights into multi-task learning on up to 110,000 records in the QM9 dataset under random-split evaluation. Good transferability to larger molecules outside the training set is demonstrated on both the QM9 and Alchemy datasets. Efficient and accurate prediction of 12 properties, including the dipole moment, HOMO energy, and Gibbs free energy, within chemical accuracy is achieved with our specifically designed interpretable multi-level attention neural network, named DeepMoleNet. Remarkably, the present multi-task deep learning model adopts the atom-centered symmetry functions (ACSFs) descriptor as one of the prediction targets, rather than using ACSFs as input in the conventional way. The proposed multi-level attention neural network is applicable to high-throughput screening of numerous chemical species, accelerating the rational design of drugs, materials, and chemical reactions.
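The abstract's key design choice is that ACSF descriptors serve as an auxiliary prediction target rather than as network input. A minimal sketch of how such a multi-task objective might be combined is shown below; the function name `multitask_loss`, the weighting factor `alpha`, and the specific loss forms (MAE for properties, MSE for descriptors) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def multitask_loss(pred_props, true_props, pred_acsf, true_acsf, alpha=0.1):
    """Hypothetical combined training objective: a mean-absolute-error term
    on the 12 target properties plus an auxiliary mean-squared-error term
    that encourages the network to reproduce the ACSF descriptors, which
    are treated here as prediction targets rather than inputs.
    `alpha` (assumed) balances the auxiliary descriptor term."""
    prop_loss = np.mean(np.abs(pred_props - true_props))   # property MAE
    acsf_loss = np.mean((pred_acsf - true_acsf) ** 2)      # descriptor MSE
    return prop_loss + alpha * acsf_loss
```

Using the descriptor as a target lets the network learn ACSF-like local environments internally while keeping inference free of hand-computed descriptors.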