Abstract
Time-series learning using purely data-driven models struggles to emulate physical dynamic systems effectively, primarily due to the lack of relevant physical constraints. Here, we introduce two time-series learning
architectures based on neural ordinary differential equations (NODEs)—continuous normalizing flow (CNF)
and Hamiltonian neural networks (HNN)—to model nonadiabatic molecular dynamics (NAMD). CNF incorporates the mathematical constraint of log-normalized energy gap distributions into the loss function, enhancing
the model’s ability to handle monotonic changes in state populations in photophysical systems. However, CNF
is less effective in cases involving significant back-hopping during nonadiabatic transitions. To address this,
we employ the HNN, which integrates the physical constraint of Hamiltonian mechanics. This enables the HNN
to learn vector fields from observed NAMD trajectories, allowing it to accurately model nuclear propagation
and coupled nonadiabatic transitions. These two architectures offer potential solutions for ultrafast
dynamics simulations: the CNF model effectively captures photophysical processes without the need for intricate parameter tuning, while the HNN model excels in simulating photochemically induced configurational
reorganization.
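As an illustration of the Hamiltonian constraint an HNN enforces, the sketch below hard-codes a harmonic-oscillator Hamiltonian in place of the neural network (the paper's actual HNN learns H(q, p) from NAMD trajectories; the functions, parameters, and finite-difference gradients here are purely illustrative assumptions). The vector field dq/dt = ∂H/∂p, dp/dt = −∂H/∂q is what the HNN would output, and integrating it conserves energy up to the integrator's error.

```python
import numpy as np

def hamiltonian(q, p, k=1.0, m=1.0):
    # Stand-in for the learned scalar H(q, p); here a 1D harmonic oscillator.
    return p**2 / (2.0 * m) + 0.5 * k * q**2

def vector_field(q, p, eps=1e-5):
    # Finite-difference gradients stand in for the autodiff an HNN would use.
    dH_dq = (hamiltonian(q + eps, p) - hamiltonian(q - eps, p)) / (2 * eps)
    dH_dp = (hamiltonian(q, p + eps) - hamiltonian(q, p - eps)) / (2 * eps)
    return dH_dp, -dH_dq  # (dq/dt, dp/dt): Hamilton's equations

def integrate(q, p, dt=1e-3, steps=5000):
    # Symplectic Euler: update p first, then q with the new p.
    for _ in range(steps):
        _, dp = vector_field(q, p)
        p = p + dt * dp
        dq, _ = vector_field(q, p)
        q = q + dt * dq
    return q, p

q0, p0 = 1.0, 0.0
qT, pT = integrate(q0, p0)
energy_drift = abs(hamiltonian(qT, pT) - hamiltonian(q0, p0))
```

Because the dynamics are derived from a single scalar H, energy is (near-)conserved along trajectories, which is the inductive bias that lets the HNN model nuclear propagation stably.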
Supplementary materials
Supporting Information
The Supporting Information includes the multi-state learning approach of the HNN, the structures of the test molecules, and all validation results of the CNF and HNN models.