On the Approximation Quality of Markov State Models
Top 1% of 2010 papers
Abstract
We consider a continuous-time Markov process on a large continuous or discrete state space. The process is assumed to have sufficiently strong ergodicity properties and to exhibit a number of metastable sets. Markov state models (MSMs) are designed to represent the effective dynamics of such a process by a Markov chain that jumps between the metastable sets with the transition rates of the original process. MSMs have been used for a number of applications, including molecular dynamics, for more than a decade. Their approximation quality, however, has not yet been fully understood. In particular, it would be desirable to have a sharp error bound for the difference in propagation of probability densities between the MSM and the original process on long timescales. Here, we provide such a bound for a rather general class of Markov processes, ranging from diffusions in energy landscapes to Markov jump processes on large discrete spaces. Furthermore, we discuss how this result provides formal support for, or reveals the limitations of, algorithmic strategies that have been found useful for the construction of MSMs. Our findings are illustrated by numerical experiments.
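The coarse-graining the abstract describes can be illustrated on a toy example. The sketch below (not the paper's construction; the 4-state transition matrix, the partition into two metastable sets, and the hopping probability `eps` are all invented for illustration) builds a 2-state MSM from a fine-grained Markov chain by projecting the transition matrix onto a metastable partition, then compares how the two models propagate a probability density:

```python
import numpy as np

# Illustrative toy model (hypothetical, not from the paper): a fine-grained
# Markov chain on 4 states, where {0,1} and {2,3} are metastable sets
# coupled by a small hopping probability eps.
eps = 0.01
T = np.array([
    [0.50 - eps, 0.50,       eps,        0.0       ],
    [0.50,       0.50 - eps, 0.0,        eps       ],
    [eps,        0.0,        0.50 - eps, 0.50      ],
    [0.0,        eps,        0.50,       0.50 - eps],
])

# Stationary distribution: leading left eigenvector of T, normalized.
w, V = np.linalg.eig(T.T)
pi = np.real(V[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Coarse-grain onto the partition A = {0,1}, B = {2,3}: transition
# probabilities between sets, weighted by the stationary distribution.
sets = [[0, 1], [2, 3]]
T_msm = np.zeros((2, 2))
for i, Si in enumerate(sets):
    for j, Sj in enumerate(sets):
        T_msm[i, j] = sum(pi[a] * T[a, b] for a in Si for b in Sj) / pi[Si].sum()

# Propagate a density with both models and compare the set-A probability.
p0 = np.array([1.0, 0.0, 0.0, 0.0])   # fine density, started in set A
q0 = np.array([1.0, 0.0])             # coarse density, started in A
n = 50
p_fine = p0 @ np.linalg.matrix_power(T, n)
q_msm = q0 @ np.linalg.matrix_power(T_msm, n)
err = abs(p_fine[:2].sum() - q_msm[0])
print(f"fine P(A) = {p_fine[:2].sum():.4f}, MSM P(A) = {q_msm[0]:.4f}, error = {err:.2e}")
```

In this deliberately symmetric example the within-set dynamics relax much faster than the inter-set hopping, so the 2-state MSM tracks the set probabilities almost exactly; the paper's error bound quantifies how this approximation degrades when the timescale separation is weaker.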