
Data Augmentation and Transfer Learning Strategies for Reaction Prediction in Low Chemical Data Regimes

ChemRxiv preprint
submitted on 16.12.2020, 02:40 and posted on 17.12.2020, 13:02 by Yun Zhang, Ling Wang, Xinqiao Wang, Chengyun Zhang, Jiamin Ge, Jing Tang, An Su, Hongliang Duan

Abstract: Effective and rapid deep learning methods for predicting chemical reactions contribute to research and development in organic chemistry and drug discovery. Despite the outstanding capability of deep learning in retrosynthesis and forward synthesis, predictions based on small chemical datasets generally suffer from low accuracy because reaction examples are insufficient. Here, we introduce a new state-of-the-art method that integrates transfer learning with the Transformer model to predict the outcomes of the Baeyer-Villiger reaction, a representative small-dataset reaction. The results demonstrate that introducing the transfer learning strategy markedly improves the top-1 accuracy of the transformer-transfer learning model (81.8%) over that of the transformer-baseline model (58.4%). Moreover, we further apply data augmentation to the input reaction SMILES, which improves the top-1 accuracy of the transformer-transfer learning model to 86.7%. In summary, both transfer learning and data augmentation significantly improve the predictive performance of the Transformer model, making them powerful methods for overcoming the restriction of limited training data in chemistry.
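
As an illustration of the SMILES-level data augmentation mentioned in the abstract, the snippet below enumerates randomized (non-canonical) SMILES for a reaction using RDKit. This is a minimal sketch of one common augmentation scheme under assumed details, not the authors' exact pipeline; the `randomize_smiles` and `augment_reaction` helpers and the example reaction string are hypothetical.

```python
# Minimal sketch: SMILES augmentation by random atom-ordering enumeration (RDKit).
# Helper names and the example reaction are illustrative, not taken from the paper.
from rdkit import Chem

def randomize_smiles(smiles: str) -> str:
    """Return a random, non-canonical SMILES string for the same molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")
    return Chem.MolToSmiles(mol, canonical=False, doRandom=True)

def augment_reaction(rxn_smiles: str, n_copies: int = 4) -> list[str]:
    """Produce augmented copies of a 'reactants>>product' reaction SMILES.

    Each copy rewrites every reactant and the product with a randomly chosen
    atom ordering, so the model sees several equivalent forms of one example.
    """
    reactants, product = rxn_smiles.split(">>")
    augmented = []
    for _ in range(n_copies):
        new_reactants = ".".join(randomize_smiles(s) for s in reactants.split("."))
        new_product = randomize_smiles(product)
        augmented.append(f"{new_reactants}>>{new_product}")
    return augmented

if __name__ == "__main__":
    # Hypothetical Baeyer-Villiger example: cyclohexanone + peracetic acid -> caprolactone.
    example = "O=C1CCCCC1.CC(=O)OO>>O=C1CCCCCO1"
    for rxn in augment_reaction(example):
        print(rxn)
```

Whether the target side is also randomized or kept canonical is a design choice of the augmentation scheme; the sketch randomizes both sides purely for illustration.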

Funding

National Natural Science Foundation of China (NSFC), Grant No. 81903438

History

Email Address of Submitting Author

hduan@zjut.edu.cn

Institution

Zhejiang University of Technology

Country

China

ORCID for Submitting Author

0000-0002-9194-0115

Declaration of Conflict of Interest

No conflict of interest
