  1. Identifiability of linear compartment models. Anne Shiu, Texas A&M University. ICERM, 15 November 2018.

  2. From Algebraic Systems Biology: A Case Study for the Wnt Pathway (Elizabeth Gross, Heather Harrington, Zvi Rosen, Bernd Sturmfels 2016).

  3. Outline
◮ Introduction: Linear compartment models
◮ Identifiability (via differential algebra)
◮ The singular locus
Joint work with Elizabeth Gross, Heather Harrington, and Nicolette Meshkat (arXiv:1709.10013 and arXiv:1810.05575).

  4. Introduction

  5. Motivation: biological models
[Diagram labels: Drug input, Measured drug concentration, Drug exchange, Loss from blood, Loss from organ.]

  6. Compartment model
Example: Linear 2-compartment model.
[Diagram: input u_1 and measured output y in compartment 1; exchange rates k_{21} (1 → 2) and k_{12} (2 → 1); leak rates k_{01} from compartment 1 and k_{02} from compartment 2.]
Structural identifiability: can the parameters k_{ij} be recovered from perfect input-output data u_1(t) and y(t)? (Bellman & Åström 1970)
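Written out explicitly (added here for concreteness; it matches the matrix A given on slides 12–13), the ODE system for this 2-compartment model with input and output in compartment 1 is:

```latex
\begin{aligned}
x_1'(t) &= -(k_{01}+k_{21})\,x_1(t) + k_{12}\,x_2(t) + u_1(t),\\
x_2'(t) &= k_{21}\,x_1(t) - (k_{02}+k_{12})\,x_2(t),\\
y(t)    &= x_1(t).
\end{aligned}
```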

  7. Identifiability via differential algebra (Ljung and Glad 1994): Which models are identifiable?

  8–11. Input-output equations
◮ Setup: a linear compartment model; m = number of compartments.
◮ Input-output equation: an equation that holds along any solution of the ODEs, involving only the input variables u_i, the output variables y_i (and the parameters k_{ij}), and their derivatives.
◮ Example, continued (the 2-compartment model from slide 6):
y_1^{(2)} + (k_{01} + k_{02} + k_{12} + k_{21}) y_1' + (k_{01}k_{12} + k_{01}k_{02} + k_{02}k_{21}) y_1 = u_1' + (k_{02} + k_{12}) u_1
◮ Input-output equations come from the elimination ideal:
⟨ differential eqns., output eqns. y_i = x_j, and their derivatives ⟩ ∩ C(k_{ij})[u_i^{(k)}, y_i^{(k)}]

  12–13. Input-output equations, continued
(the 2-compartment model from slide 6, with input in compartment 1)
x'(t) = A x(t) + u(t),   where   A = \begin{pmatrix} -k_{01}-k_{21} & k_{12} \\ k_{21} & -k_{02}-k_{12} \end{pmatrix}
◮ Proposition (Meshkat, Sullivant, Eisenberg 2015): For a linear compartment model with input and output in compartment 1 only, the input-output equation is
det(∂I − A) y_1 = det((∂I − A)_{11}) u_1,
where ∂ = d/dt and (∂I − A)_{11} is the submatrix obtained by deleting row 1 and column 1.
◮ The proof uses Cramer's rule and Laplace expansion.
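A minimal computational check of this proposition on the 2-compartment example, sketched in SymPy (an illustration added here, not part of the talk). Because the k_{ij} are constants, d/dt can be treated as a formal commuting symbol s:

```python
# Sketch: verify the 2-compartment input-output equation of slides 8-11 by
# computing det(sI - A) and det((sI - A)_{11}), with s standing in for d/dt.
import sympy as sp

s = sp.symbols('s')
k01, k02, k12, k21 = sp.symbols('k01 k02 k12 k21', positive=True)

A = sp.Matrix([[-k01 - k21,        k12],
               [       k21, -k02 - k12]])

lhs = (s * sp.eye(2) - A).det()          # operator applied to y_1
rhs = (s * sp.eye(2) - A)[1:, 1:].det()  # delete row 1 and column 1; applied to u_1

print(sp.collect(sp.expand(lhs), s))
# s**2 + s*(k01 + k02 + k12 + k21) + k01*k02 + k01*k12 + k02*k21
print(sp.expand(rhs))
# s + k02 + k12
```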

  14–16. Input-output equations, continued
[3-compartment catenary model: input in compartment 1; leak k_{01} from compartment 1; exchange rates k_{21}, k_{12} between compartments 1 and 2, and k_{32}, k_{23} between compartments 2 and 3]
det(∂I − A) y_1 = det((∂I − A)_{11}) u_1 :
det \begin{pmatrix} d/dt + k_{01} + k_{21} & -k_{12} & 0 \\ -k_{21} & d/dt + k_{12} + k_{32} & -k_{23} \\ 0 & -k_{32} & d/dt + k_{23} \end{pmatrix} y_1
  = det \begin{pmatrix} d/dt + k_{12} + k_{32} & -k_{23} \\ -k_{32} & d/dt + k_{23} \end{pmatrix} u_1
... which expands to the input-output equation:
y_1^{(3)} + (k_{01} + k_{12} + k_{21} + k_{23} + k_{32}) y_1^{(2)}
  + (k_{01}k_{12} + k_{01}k_{23} + k_{01}k_{32} + k_{12}k_{23} + k_{21}k_{23} + k_{21}k_{32}) y_1'
  + k_{01}k_{12}k_{23} y_1
  = u_1^{(2)} + (k_{12} + k_{23} + k_{32}) u_1' + k_{12}k_{23} u_1.

  17–18. Coefficients of input-output equations
(the 3-compartment catenary model above)
y_1^{(3)} + (k_{01} + k_{12} + k_{21} + k_{23} + k_{32}) y_1^{(2)}
  + (k_{01}k_{12} + k_{01}k_{23} + k_{01}k_{32} + k_{12}k_{23} + k_{21}k_{23} + k_{21}k_{32}) y_1'
  + k_{01}k_{12}k_{23} y_1
  = u_1^{(2)} + (k_{12} + k_{23} + k_{32}) u_1' + k_{12}k_{23} u_1.
◮ The coefficient of y_1^{(i)} corresponds to forests with (3 − i) edges and at most one outgoing edge per compartment.
◮ The coefficient of u_1^{(i)} corresponds to (n − i − 1)-edge forests. [Small diagram: compartments 2 and 3 with edges k_{12}, k_{32}, k_{23}.]
◮ Thm 1: The coefficients of the input-output equation correspond to forests in the model. (An illustrative enumeration follows below.)
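To illustrate Thm 1 on this example (a sketch added here, not from the talk; the (name, source, target) encoding of edges is my own, with target 0 standing for the leak), the following snippet enumerates the 2-edge forests and recovers the six monomials in the coefficient of y_1':

```python
# Sketch: enumerate the 2-edge forests of the 3-compartment catenary model
# (at most one outgoing edge per compartment, no 2-cycle; with two edges the
# only possible cycle is a 2-cycle) and compare with the coefficient of y_1'.
from itertools import combinations

edges = [("k01", 1, 0), ("k21", 1, 2), ("k12", 2, 1), ("k32", 2, 3), ("k23", 3, 2)]

def is_forest(e1, e2):
    (_, s1, t1), (_, s2, t2) = e1, e2
    if s1 == s2:                 # two outgoing edges from one compartment
        return False
    if s1 == t2 and s2 == t1:    # 2-cycle
        return False
    return True

monomials = sorted(e1[0] + "*" + e2[0]
                   for e1, e2 in combinations(edges, 2) if is_forest(e1, e2))
print(monomials)
# ['k01*k12', 'k01*k23', 'k01*k32', 'k12*k23', 'k21*k23', 'k21*k32']
# -- exactly the six monomials in the coefficient of y_1' above.
```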

  19–21. Identifiability
y_1^{(3)} + (k_{01} + k_{12} + k_{21} + k_{23} + k_{32}) y_1^{(2)}
  + (k_{01}k_{12} + k_{01}k_{23} + k_{01}k_{32} + k_{12}k_{23} + k_{21}k_{23} + k_{21}k_{32}) y_1'
  + k_{01}k_{12}k_{23} y_1
  = u_1^{(2)} + (k_{12} + k_{23} + k_{32}) u_1' + k_{12}k_{23} u_1.
◮ (Generic, local) identifiability: can the parameters k_{ij} be recovered from the coefficients of the input-output equations? Concretely, consider the coefficient map
R^5 → R^5,   (k_{01}, k_{12}, k_{21}, k_{23}, k_{32}) ↦ (k_{01} + k_{12} + k_{21} + k_{23} + k_{32}, ...)
◮ Solve directly, or use ...
◮ Proposition (Meshkat, Sullivant, Eisenberg 2015): Identifiable ⇔ the Jacobian matrix of the coefficient map generically has (full) rank equal to the number of parameters. (A computational sketch follows below.)
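A SymPy sketch of this rank test for the 3-compartment catenary example (added here for illustration, not part of the talk); it rebuilds the five nontrivial coefficients from det(∂I − A) and det((∂I − A)_{11}), with d/dt replaced by a formal symbol s, and computes the Jacobian's generic rank:

```python
# Sketch: identifiability of the 3-compartment catenary model via the
# Jacobian of the coefficient map (Meshkat-Sullivant-Eisenberg criterion).
import sympy as sp

s = sp.symbols('s')                                   # stands in for d/dt
k01, k12, k21, k23, k32 = sp.symbols('k01 k12 k21 k23 k32', positive=True)
params = [k01, k12, k21, k23, k32]

A = sp.Matrix([[-k01 - k21,        k12,    0],
               [       k21, -k12 - k32,  k23],
               [         0,        k32, -k23]])

lhs = sp.Poly((s * sp.eye(3) - A).det(), s)           # operator applied to y_1
rhs = sp.Poly((s * sp.eye(3) - A)[1:, 1:].det(), s)   # operator applied to u_1

# Non-trivial coefficients of the input-output equation (drop the leading 1's).
coeffs = lhs.all_coeffs()[1:] + rhs.all_coeffs()[1:]

J = sp.Matrix([[sp.diff(c, p) for p in params] for c in coeffs])
print(J.rank())  # 5 = number of parameters, so generically locally identifiable
```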

  22. The singular locus

  23. Definition
◮ Focus on the non-identifiable parameters: the singular locus is the locus where the Jacobian matrix of the coefficient map is rank-deficient.
◮ Example, continued (the 3-compartment catenary model): the equation of the singular locus is
det Jac = k_{12}^2 k_{21} k_{23} = 0.
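Continuing the sketch above (again an added illustration, not from the talk), the determinant of the same 5 × 5 Jacobian factors, up to sign, into the singular-locus equation stated here:

```python
# Sketch: singular locus of the 3-compartment catenary model as the vanishing
# of det(Jacobian of the coefficient map).
import sympy as sp

s = sp.symbols('s')
k01, k12, k21, k23, k32 = sp.symbols('k01 k12 k21 k23 k32', positive=True)
params = [k01, k12, k21, k23, k32]

A = sp.Matrix([[-k01 - k21,        k12,    0],
               [       k21, -k12 - k32,  k23],
               [         0,        k32, -k23]])

lhs = sp.Poly((s * sp.eye(3) - A).det(), s)
rhs = sp.Poly((s * sp.eye(3) - A)[1:, 1:].det(), s)
coeffs = lhs.all_coeffs()[1:] + rhs.all_coeffs()[1:]

J = sp.Matrix([[sp.diff(c, p) for p in params] for c in coeffs])
print(sp.factor(J.det()))  # -k12**2*k21*k23: up to sign, the singular-locus equation
```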

  24–26. Identifiable submodels
◮ Motivation: drug targets.
◮ Thm 2: Let M be an identifiable linear compartment model with singular-locus equation f. Let M̃ be obtained from M by deleting a set of edges I. If f ∉ ⟨ k_{ji} | (i, j) ∈ I ⟩, then M̃ is identifiable. (A sketch of the membership test follows below.)
◮ Example: [4-compartment model with input in compartment 1, leak k_{01}, and edges labeled k_{21}, k_{12}, k_{14}, k_{23}, k_{32}, k_{43}]
f = k_{12} k_{14} k_{21}^2 k_{32} (k_{12}k_{14} − k_{14}^2 − ...)(k_{12}k_{23} + k_{12}k_{43} + k_{32}k_{43}).
◮ The converse is false: the submodel obtained by deleting k_{12} and k_{23} is identifiable, even though f ∈ ⟨ k_{12}, k_{23} ⟩.
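A sketch of the hypothesis check in Thm 2 (added here, not from the talk). Because the ideal ⟨k_{ji} | (i, j) ∈ I⟩ is generated by variables, f lies in it exactly when f vanishes after setting those variables to 0. Since the 4-compartment f above contains an elided factor, the check is illustrated instead on the 3-compartment catenary model, whose singular-locus equation is f = k_{12}^2 k_{21} k_{23} (slide 23):

```python
# Sketch: membership test "f in <deleted parameters>" for Thm 2, illustrated
# on the 3-compartment catenary model of slide 23.
import sympy as sp

k12, k21, k23, k32 = sp.symbols('k12 k21 k23 k32', positive=True)
f = k12**2 * k21 * k23                      # singular-locus equation (slide 23)

def in_variable_ideal(f, deleted):
    """f lies in the ideal generated by the variables `deleted`
    iff f vanishes after substituting each of them by 0."""
    return sp.expand(f.subs({p: 0 for p in deleted})) == 0

print(in_variable_ideal(f, [k32]))  # False: f not in <k32>, so that submodel is identifiable
print(in_variable_ideal(f, [k23]))  # True: hypothesis fails, Thm 2 gives no conclusion
```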

  27. Cycle and mammillary models
[Diagrams: the n-compartment Cycle model, with edge labels k_{21}, k_{32}, k_{43}, ..., k_{n,n−1}, k_{1,n}, and the n-compartment Mammillary (star) model, with edge labels k_{21}, k_{12}, k_{31}, k_{13}, ..., k_{n,1}, k_{1,n}; each with input in compartment 1 and leak k_{01}.]
◮ Thm 3:
  ◮ The singular-locus equation for the Cycle model is k_{32} k_{43} ··· k_{n,n−1} k_{1,n} ∏_{2 ≤ i < j ≤ n} (k_{i+1,i} − k_{j+1,j}).
  ◮ The singular-locus equation for the Mammillary model is k_{12} k_{13} ··· k_{1,n} ∏_{2 ≤ i < j ≤ n} (k_{1i} − k_{1j})^2.
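Small instances of Thm 3, worked out here for concreteness (the reading k_{n+1,n} := k_{1,n} for the cycle-closing edge is my interpretation of the indexing):

```latex
% Cycle model, n = 4 (with k_{5,4} read as k_{1,4}):
k_{32}\, k_{43}\, k_{1,4}\, (k_{32}-k_{43})(k_{32}-k_{1,4})(k_{43}-k_{1,4})

% Mammillary (star) model, n = 3:
k_{12}\, k_{13}\, (k_{12}-k_{13})^2
```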

  28–29. Catenary (path) models
[Diagrams: catenary (path) models with 2, 3, and 4 compartments, input in compartment 1, annotated with the exponents appearing in the singular-locus equation.]
Conjecture: For catenary models, the exponents in the singular-locus equation generalize the pattern above.

  30–31. Tree conjecture
[Diagrams: example tree models (compartments labeled up to 7, input in compartment 1) annotated with the exponents appearing in the singular-locus equation, including the sums 2 + 1 = 3 and (2 + 1) + 1 = 4.]
Conjecture (Hoch, Sweeney, Tung): For tree models, the exponents in the singular-locus equation generalize the pattern above.
