Coarse-graining Markov state models with PCCA


  1. Coarse-graining Markov state models with PCCA

  2. Coarse-graining Markov state models • Coarse-graining a Markov state model here means finding a smaller transition matrix that does a similar job to the large original transition matrix. • We have already seen one way of reducing the dimension of a transition matrix. Let’s take this as our starting point…

  3. The truncated eigendecomposition
  • The eigendecomposition of T(τ) reads T(τ) = R Λ(τ) L.
  • We have seen that for sufficiently large lag times τ, the majority of eigenvalues become almost zero.
  • We can therefore truncate the matrix Λ(τ) = diag(1, 0.99, …, ≈0, …, ≈0): delete the near-zero block and call the reduced matrix Λ̃. We can also drop the corresponding eigenvectors in R and L, and call the reduced matrices R̃ and L̃.
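As a minimal sketch of this truncation step: the snippet below builds a small, entirely invented 4-state transition matrix with two metastable sets, diagonalizes it with NumPy, and keeps only the m = 2 dominant eigenpairs. All numbers and variable names are hypothetical illustrations, not taken from the slides.

```python
import numpy as np

# Hypothetical 4-state transition matrix with two metastable sets
# {0, 1} and {2, 3} (all entries invented for illustration).
T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.08, 0.90, 0.01, 0.01],
              [0.01, 0.01, 0.90, 0.08],
              [0.01, 0.01, 0.08, 0.90]])

# Eigendecomposition T = R @ Lam @ L, sorted by descending eigenvalue.
evals, R = np.linalg.eig(T)
order = np.argsort(-evals.real)
evals, R = evals[order].real, R[:, order].real
L = np.linalg.inv(R)            # left eigenvectors as rows, so L @ R = Id

# At lag n the eigenvalues become evals**n: the slow processes
# (1.0, 0.96) survive, the fast ones (0.82) decay towards zero.
n = 20
print(np.round(evals**n, 3))    # ≈ [1. 0.442 0.019 0.019]

# Truncate: keep only the m = 2 dominant eigenpairs.
m = 2
R_t, L_t = R[:, :m], L[:m, :]
Lam_t = np.diag(evals[:m])
```

The clear gap between 0.442 and 0.019 at lag n = 20 is what justifies dropping the fast eigenpairs.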

  4. The truncated eigendecomposition
  • We now have T(τ) ≈ R̃ Λ̃(τ) L̃.
  • And also T(τ)ⁿ ≈ R̃ Λ̃ⁿ(τ) L̃, since L̃ R̃ = Id.
  • So did we find what we wanted?
  • Λ̃(τ) replaces T for large τ ✓
  • Λ̃(τ) is a small matrix ✓
  • But Λ̃(τ) is not a transition matrix, e.g. Λ̃ 𝟙 ≠ 𝟙.
  • Can we correct the last point?
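Both points can be checked numerically. The sketch below (again on an invented 4-state example) shows that powers of the full matrix factor through the small truncated matrix, and that Λ̃ itself fails the row-sum test for a transition matrix.

```python
import numpy as np

# Hypothetical 4-state transition matrix (entries invented for illustration).
T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.08, 0.90, 0.01, 0.01],
              [0.01, 0.01, 0.90, 0.08],
              [0.01, 0.01, 0.08, 0.90]])

evals, R = np.linalg.eig(T)
order = np.argsort(-evals.real)
evals, R = evals[order].real, R[:, order].real
L = np.linalg.inv(R)

m, n = 2, 20
R_t, L_t, Lam_t = R[:, :m], L[:m, :], np.diag(evals[:m])

# Because L_t @ R_t = Id, powers factor through the small matrix:
approx = R_t @ np.linalg.matrix_power(Lam_t, n) @ L_t
exact = np.linalg.matrix_power(T, n)
print(np.abs(approx - exact).max())   # small truncation error (~1e-2)

# But Lam_t is NOT a transition matrix: its rows do not sum to one.
print(Lam_t.sum(axis=1))              # [1.0, 0.96], not [1, 1]
```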

  5. A closer look at the eigenvectors

  6. A closer look at the eigenvectors
  • The dominant eigenvectors can be linearly transformed into indicator vectors for the metastable states. These indicators are called memberships.
  • In matrix form: χ = R̃ A, where the entry χᵢⱼ is the membership of microstate i in metastable state j.

  7. Coarse-graining with PCCA
  • Use the eigendecomposition and insert A A⁻¹:
    T ≈ R̃ Λ̃ L̃ = (R̃ A)(A⁻¹ Λ̃ A)(A⁻¹ L̃) = χ T̃ (A⁻¹ L̃)
  • We have T̃ = A⁻¹ Λ̃ A.
  • Are we done now?
  • T̃ replaces T for large τ ✓ Same eigenvalues as T ✓
  • T̃ is a small matrix ✓
  • T̃ 𝟙 = 𝟙 (proof in the appendix) ✓
  • T̃ can be interpreted as the transition matrix between the metastable states. ✓
  • T̃ is a Koopman matrix. (without proof) ✓
  • But T̃ ≱ 0: it may contain negative entries.

  8. PCCA in PyEMMA
  • χ … metastable memberships
  • χ̃ … metastable distributions
  • argmaxⱼ χᵢⱼ … metastable assignments
  • Sⱼ = {i ∣ argmaxₖ χᵢₖ = j}, for j = 1, …, m … metastable sets
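The assignment and set definitions on this slide reduce to an argmax over the rows of the membership matrix. A small NumPy sketch with an invented χ (in PyEMMA the membership matrix would come from the actual PCCA computation):

```python
import numpy as np

# Hypothetical fuzzy membership matrix chi: 4 microstates x 2 metastable
# states (values invented for illustration).
chi = np.array([[0.95, 0.05],
                [0.80, 0.20],
                [0.10, 0.90],
                [0.02, 0.98]])

# Metastable assignments: the most likely metastable state per microstate.
assignments = chi.argmax(axis=1)

# Metastable sets S_j: all microstates assigned to metastable state j.
sets = [np.flatnonzero(assignments == j) for j in range(chi.shape[1])]

print(assignments)   # [0 0 1 1]
print(sets)          # [array([0, 1]), array([2, 3])]
```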

  9. Further reading
  • Susanna Röblitz, Marcus Weber, “Fuzzy spectral clustering by PCCA+: application to Markov state models and data classification”, Advances in Data Analysis and Classification, 7, 147 (2013)
  • Marcus Weber, Konstantin Fackeldey, “G-PCCA: Spectral Clustering for Non-reversible Markov Chains”, Konrad-Zuse-Zentrum für Informationstechnik Berlin, ZIB-Report 15-35 (2015)

  10. Appendix: Proof that T̃ 𝟙 = 𝟙
  • Memberships must sum to one: χ 𝟙ₘ = 𝟙ₙ.
  • The first right eigenvector is constant: r₁ = 𝟙ₙ.
  • ⇒ χ 𝟙ₘ = r₁.
  • Use the definition of χ: χ 𝟙ₘ = R̃ A 𝟙ₘ.
  • Therefore r₁ = R̃ A 𝟙ₘ, which is satisfied by A 𝟙ₘ = e₁ (the first unit vector, since R̃ e₁ = r₁).
  • ⇒ T̃ 𝟙 = A⁻¹ Λ̃ A 𝟙 = A⁻¹ Λ̃ e₁ = A⁻¹ e₁ (since λ₁ = 1) = 𝟙 (since A 𝟙ₘ = e₁).

  11. Appendix: Computing A
  • D̃ = diag(χᵀ π) … stationary weights of the metastable states, inserted into the diagonal of a matrix.
  • Cov(χ, χ) = χᵀ diag(π) χ … overlap matrix of the metastable states, weighted by the stationary distribution.
  • A is chosen to make the clustering as crisp as possible, by optimizing the trace criterion tr(D̃⁻¹ χᵀ diag(π) χ).

  12.
  • R̃ ∈ ℝ^(n×m) … matrix of dominant eigenvectors
  • χ ∈ ℝ^(n×m) … matrix of memberships
  • χ ≥ 0 … non-negativity
  • Σⱼ χᵢⱼ = 1 … partition of unity
  • χ ≈ R̃ A … spectral clustering
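One simple way to realize χ ≈ R̃ A, in the spirit of PCCA's inner-simplex construction, is to pick m extreme "vertex" states from the dominant eigenvectors and set A = inv(R̃[vertices]). The sketch below is a simplified, hypothetical illustration on an invented 4-state matrix, not the full PCCA+ optimization.

```python
import numpy as np

# Hypothetical 4-state transition matrix (entries invented for illustration).
T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.08, 0.90, 0.01, 0.01],
              [0.01, 0.01, 0.90, 0.08],
              [0.01, 0.01, 0.08, 0.90]])

evals, R = np.linalg.eig(T)
order = np.argsort(-evals.real)
R = R[:, order].real

m = 2
R_t = R[:, :m]

# Choose m "vertex" states at the extremes of the second eigenvector,
# then chi = R_t @ A with A = inv(R_t[vertices]).
vertices = [np.argmin(R_t[:, 1]), np.argmax(R_t[:, 1])]
A = np.linalg.inv(R_t[vertices])
chi = R_t @ A

print(np.round(chi, 3))   # rows sum to 1; vertex rows are unit vectors
print(chi.sum(axis=1))
```

By construction each vertex state gets membership exactly 1 in its own metastable state, and the rows of χ sum to one; non-negativity of the other rows is what the full PCCA+ optimization has to enforce in general.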
