These are preliminary reports that have not been peer-reviewed. They should not be regarded as conclusive, guide clinical practice/health-related behavior, or be reported in news media as established information. For more information, please see our FAQs.

Transferable Multi-level Attention Neural Network for Accurate Prediction of Quantum Chemistry Properties via Multi-task Learning

submitted on 30.06.2020, 11:45 and posted on 02.07.2020, 04:47 by Ziteng Liu, Liqiang Lin, Qingqing Jia, Zheng Cheng, Yanyan Jiang, Yanwen Guo, Jing Ma

The development of efficient machine-learning models for predicting specific properties is of great importance for innovation in chemistry and materials science. However, predicting electronic structure properties such as the frontier molecular orbital (HOMO and LUMO) energy levels and the HOMO-LUMO gap when transferring from small-molecule data to larger molecules remains a challenge. Here we develop a multi-level attention strategy that fuses chemically interpretable insights into multi-task learning on up to 110,000 records in QM9 under random-split evaluation. Good transferability to larger molecules outside the training set is demonstrated on both the QM9 and Alchemy datasets. Efficient and accurate prediction of 12 properties, including dipole moment, HOMO, and Gibbs free energy, within chemical accuracy is achieved by our specifically designed interpretable multi-level attention neural network, named DeepMoleNet. Remarkably, the present multi-task deep learning model adopts the atom-centered symmetry function (ACSF) descriptors as one of the prediction targets, rather than using ACSFs as input in the conventional way. The proposed multi-level attention neural network is applicable to high-throughput screening of numerous chemical species to accelerate the rational design of drugs, materials, and chemical reactions.
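The ACSF descriptors that the model learns to predict as an auxiliary target encode each atom's local chemical environment. As a minimal sketch of the underlying idea (not the paper's implementation; the parameter values `eta`, `r_s`, and `r_c` here are arbitrary illustrative choices), a radial symmetry function of the Behler-Parrinello form, G2_i = Σ_j exp(-η(R_ij - R_s)²)·f_c(R_ij), can be computed with NumPy:

```python
import numpy as np

def cutoff(r, r_c):
    # Cosine cutoff f_c(R) = 0.5*(cos(pi*R/r_c) + 1) for R < r_c, else 0:
    # smoothly switches off contributions beyond the cutoff radius.
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def radial_acsf(coords, eta=1.0, r_s=0.0, r_c=6.0):
    # Radial ACSF per atom: G2_i = sum_{j != i} exp(-eta*(R_ij - R_s)^2) * f_c(R_ij)
    diffs = coords[:, None, :] - coords[None, :, :]   # pairwise displacement vectors
    dists = np.linalg.norm(diffs, axis=-1)            # pairwise distances R_ij
    contrib = np.exp(-eta * (dists - r_s) ** 2) * cutoff(dists, r_c)
    np.fill_diagonal(contrib, 0.0)                    # exclude self-interaction (j != i)
    return contrib.sum(axis=1)

# Example: a linear 3-atom chain with 1 Å spacing; the central atom,
# having two close neighbours, gets the largest descriptor value.
coords = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
g = radial_acsf(coords)
```

In a multi-task setting, a vector of such per-atom descriptor values can serve as an additional regression head alongside the molecular property targets, rather than being fed in as a fixed input feature.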


the National Key Research and Development Program of China (2017YFB0702601)

the National Natural Science Foundation of China (grant nos. 21673111, 21873045)




Nanjing University





Declaration of Conflict of Interest

The authors declare no competing financial interest.



