ChemRxiv
These are preliminary reports that have not been peer-reviewed. They should not be regarded as conclusive, guide clinical practice/health-related behavior, or be reported in news media as established information. For more information, please see our FAQs.

Transferable Multi-level Attention Neural Network for Accurate Prediction of Quantum Chemistry Properties via Multi-task Learning

preprint
submitted on 30.06.2020 and posted on 02.07.2020 by Ziteng Liu, Liqiang Lin, Qingqing Jia, Zheng Cheng, Yanyan Jiang, Yanwen Guo, Jing Ma

The development of efficient machine learning models for predicting specific properties is of great importance for innovation in chemistry and materials science. However, predicting electronic structure properties, such as the frontier molecular orbital (HOMO and LUMO) energy levels and the HOMO-LUMO gap, while transferring from small-molecule data to larger molecules remains a challenge. Here we develop a multi-level attention strategy that fuses chemically interpretable insights into multi-task learning over up to 110,000 records of the QM9 dataset under random-split evaluation. Good transferability to larger molecules outside the training set is demonstrated on both the QM9 and Alchemy datasets. Efficient and accurate prediction of 12 properties, including the dipole moment, HOMO energy, and Gibbs free energy, within chemical accuracy is achieved by our specifically designed interpretable multi-level attention neural network, named DeepMoleNet. Remarkably, the present multi-task deep learning model adopts the atom-centered symmetry function (ACSF) descriptors as one of the prediction targets, rather than using ACSFs as inputs in the conventional way. The proposed multi-level attention neural network is applicable to high-throughput screening of numerous chemical species to accelerate the rational design of drugs, materials, and chemical reactions.
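The abstract refers to atom-centered symmetry functions (ACSFs), the Behler-Parrinello descriptors that DeepMoleNet predicts as auxiliary targets instead of consuming as inputs. As a rough illustration of what such a descriptor looks like, the sketch below computes the radial ACSF G² = Σ_{j≠i} exp(-η(r_ij − r_s)²)·f_c(r_ij) with the standard cosine cutoff. The parameter values (η, r_s, r_c) and the triatomic geometry are hypothetical placeholders, not the paper's settings:

```python
import numpy as np

def cosine_cutoff(r, r_c):
    """Cosine cutoff f_c(r): decays smoothly from 1 to 0 at r = r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def acsf_radial(coords, eta=1.0, r_s=0.0, r_c=6.0):
    """Radial symmetry function G^2 for each atom:
    sum over neighbors j of exp(-eta*(r_ij - r_s)^2) * f_c(r_ij)."""
    diff = coords[:, None, :] - coords[None, :, :]
    r = np.linalg.norm(diff, axis=-1)            # pairwise distance matrix
    g = np.exp(-eta * (r - r_s) ** 2) * cosine_cutoff(r, r_c)
    np.fill_diagonal(g, 0.0)                     # exclude the j == i term
    return g.sum(axis=1)

# Hypothetical linear triatomic, 1.0 Angstrom spacing
coords = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [2.0, 0.0, 0.0]])
print(acsf_radial(coords))  # middle atom has the largest value
```

In practice a fixed grid of (η, r_s) pairs plus angular G⁴ terms yields a per-atom feature vector; in the conventional pipeline these vectors are the model inputs, whereas here they serve as an extra supervision signal in the multi-task loss.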

Funding

The National Key Research and Development Program of China (2017YFB0702601)

The National Natural Science Foundation of China (grant nos. 21673111, 21873045)

History

Email Address of Submitting Author

majing@nju.edu.cn

Institution

Nanjing University

Country

China

ORCID For Submitting Author

0000-0001-5848-9775

Declaration of Conflict of Interest

The authors declare no competing financial interest.

Version Notes

2020.6.30_Version1
