Abstract
Markov state models (MSMs) are a popular statistical method for analyzing the conformational dynamics of proteins, including protein folding. As with all statistical and machine learning (ML) models, choices must be made about the modeling pipeline that cannot be learned directly from the data. These choices, or hyperparameters, are often evaluated by expert judgment or, in the case of MSMs, by maximizing variational scores such as the VAMP-2 score. Modern ML and statistical pipelines often use automatic hyperparameter selection techniques, ranging from the simple (choosing the best score from a random selection of hyperparameters) to the complex (optimization via, e.g., Bayesian optimization). In this work, we ask whether MSMs can be selected automatically in this way, by estimating and analyzing over 16,000,000 observations from over 280,000 estimated MSMs. We find that differences in hyperparameters can change the physical interpretation of the optimization objective, making automatic selection difficult. In addition, we find that enforcing equilibrium conditions in the VAMP scores can result in inconsistent model selection. However, other parameters that specify the VAMP-2 score (the lag time and the number of relaxation processes scored) have only a negligible influence on model selection. We suggest that model observables and variational scores should be only a guide to model selection, and that a full investigation of the MSM's properties be undertaken when selecting hyperparameters.
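As a hedged illustration of the scoring workflow the abstract refers to, the sketch below estimates a naive MSM transition matrix from a toy discrete trajectory, computes a VAMP-2 score as the sum of squared leading eigenvalues (valid for reversible MSMs), and runs a simple random search over a lag-time hyperparameter. The helper names, the count-based estimator, and the toy data are illustrative assumptions and are not the pipeline used in the paper, which additionally relies on cross-validated scores and proper MSM estimation.

```python
import numpy as np

def estimate_transition_matrix(dtraj, lag, n_states):
    """Naive maximum-likelihood transition matrix from a discrete trajectory.

    Counts transitions at the given lag time and row-normalizes; real MSM
    packages additionally enforce connectivity and (optionally) reversibility.
    """
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    counts += 1e-8  # avoid division by zero for unvisited states
    return counts / counts.sum(axis=1, keepdims=True)

def vamp2_score(transition_matrix, n_processes=3):
    """VAMP-2 score sketch: for a reversible MSM this reduces to the sum of
    squared leading eigenvalues (including the stationary eigenvalue of 1)."""
    eigvals = np.sort(np.abs(np.linalg.eigvals(transition_matrix)))[::-1]
    return float(np.sum(eigvals[: n_processes + 1] ** 2))

# Random hyperparameter search: score MSMs built at different lag times on a
# toy two-state discrete trajectory and keep the highest VAMP-2 score.
rng = np.random.default_rng(0)
dtraj = rng.choice(2, size=5000, p=[0.7, 0.3])  # toy discretized trajectory
candidates = [{"lag": int(lag), "n_states": 2} for lag in rng.integers(1, 50, size=10)]
scored = [
    (vamp2_score(estimate_transition_matrix(dtraj, c["lag"], c["n_states"])), c)
    for c in candidates
]
best_score, best_params = max(scored, key=lambda t: t[0])
print(f"best VAMP-2 score {best_score:.3f} with hyperparameters {best_params}")
```

The design mirrors the "simple" end of the selection spectrum mentioned above: each hyperparameter candidate is scored independently and the maximum is kept; a Bayesian-optimization variant would instead propose new candidates conditioned on previous scores.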
Supplementary materials
Title: Supplementary information for 'Markov state models: to optimize or not to optimize'
Description: Further information on the Markov state models and statistical analyses discussed in the main paper.