These are preliminary reports that have not been peer-reviewed. They should not be regarded as conclusive, guide clinical practice/health-related behavior, or be reported in news media as established information.

Deep Learning for Variational Multi-Scale Molecular Modeling

Revised on 25.06.2020 and posted on 29.06.2020 by Jun Zhang, Yaokun Lei, Yi Isaac Yang, Yi Qin Gao
Molecular simulations are widely applied in the study of chemical and biophysical systems. However, the accessible timescales of atomistic simulations are limited, and extracting equilibrium properties of systems containing rare events remains challenging. Two distinct strategies are usually adopted in this regard: either sticking to the atomistic level and performing enhanced sampling, or trading details for speed by leveraging coarse-grained models. Although both strategies are promising, either of them, if adopted individually, exhibits severe limitations. In this paper, we propose a machine-learning approach that allies both strategies so that simulations on different scales can benefit mutually from their cross-talk: accurate coarse-grained (CG) models can be inferred from fine-grained (FG) simulations through deep generative learning; in turn, FG simulations can be boosted under the guidance of CG models via deep reinforcement learning. Our method defines a variational and adaptive training objective which allows end-to-end training of parametric molecular models using deep neural networks. Through multiple experiments, we show that our method is efficient and flexible, and performs well on challenging chemical and biomolecular systems.
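The first half of the cross-talk described above — inferring a CG model from FG simulation data by generative, likelihood-based training — can be illustrated with a deliberately minimal sketch. This is a hypothetical toy, not the authors' method or code: it stands in a deep generative model's place with a one-dimensional Gaussian CG "model" (parameters `mu` and log-std `s`) and fits it to surrogate FG samples by gradient descent on the negative log-likelihood.

```python
import math
import random

# Toy illustration (assumed, not from the paper): treat draws from a Gaussian as
# projections of FG simulation frames onto a single CG coordinate.
random.seed(0)
fg_samples = [random.gauss(1.5, 0.4) for _ in range(2000)]

# CG "model": a Gaussian with learnable mean mu and log-std s, standing in for a
# parametric deep generative model over the CG coordinate.
mu, s = 0.0, 0.0

lr = 0.1
for _ in range(200):
    sigma = math.exp(s)
    n = len(fg_samples)
    # Analytic gradients of the average negative log-likelihood
    # NLL(x) = 0.5 * ((x - mu) / sigma)**2 + s  (constant terms dropped).
    g_mu = sum(-(x - mu) / sigma**2 for x in fg_samples) / n
    g_s = sum(1.0 - ((x - mu) / sigma) ** 2 for x in fg_samples) / n
    mu -= lr * g_mu
    s -= lr * g_s

# The fitted CG parameters approach the FG statistics (mean ~1.5, std ~0.4).
print(round(mu, 2), round(math.exp(s), 2))
```

In the paper's actual setting, the Gaussian would be replaced by a deep generative network, and the fitted CG model would then feed back into the FG simulation through reinforcement-learning-style guidance; this sketch only shows the FG-to-CG inference direction.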


Funding

Alexander von Humboldt Fellowship

National Natural Science Foundation of China

National Natural Science Foundation of China

National Key Research and Development Program of China

Guangdong Basic and Applied Basic Research Foundation



Submitting Author's Institution

Freie Universität Berlin




Declaration of Conflict of Interest

The authors declare no conflict of interest.

Version Notes

Version 4.0