Decision Problems for Linear Recurrence Sequences




  1. Decision Problems for Linear Recurrence Sequences. Joël Ouaknine, Department of Computer Science, Oxford University (joint work with James Worrell and Matt Daws). Algorithms Workshop, Oxford, October 2012.

  2. Termination of Simple Linear Programs

    x := a;
    while cond(x) do
        x := M · x + b;

  3. Termination of Simple Linear Programs

    x := a;
    while cond(x) do
        x := M · x + b;

where cond(x) is linear, e.g. 'u · x ≠ 0' or 'u · x ≥ 5'.

  4. Termination of Simple Linear Programs

    x := a;
    while cond(x) do
        x := M · x + b;

where cond(x) is linear, e.g. 'u · x ≠ 0' or 'u · x ≥ 5'.

Termination Problem
Instance: ⟨a; cond; M; b⟩
Question: Does this program terminate?
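The Termination Problem above can at least be attacked by bounded simulation: run the loop for at most N steps and see whether the guard ever fails. This is only a semi-decision procedure (it can confirm termination but never refute it), and the concrete M, a, b, u in this sketch are illustrative values, not taken from the talk.

```python
# Bounded simulation of the loop "x := a; while u·x != 0 do x := M·x + b".
# This only semi-decides termination: if the guard fails within the step
# bound we have proved termination; otherwise nothing is concluded.
# The concrete values of M, a, b, u below are illustrative only.

def terminates_within(M, a, b, u, max_steps=10_000):
    """Return the first step at which u·x = 0 holds, or None."""
    x = list(a)
    for step in range(max_steps):
        if sum(ui * xi for ui, xi in zip(u, x)) == 0:
            return step  # guard u·x != 0 fails: the loop exits here
        # loop body: x := M·x + b
        x = [sum(M[i][j] * x[j] for j in range(len(x))) + b[i]
             for i in range(len(x))]
    return None

# Example: M swaps the two components; the guard x1 - 3*x2 != 0
# fails after one iteration.
print(terminates_within([[0, 1], [1, 0]], [1, 3], [0, 0], [1, -3]))  # → 1
```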

  5. Termination of Simple Linear Programs

Much work on this and related problems in the literature over the last three decades: Manna, Pnueli, Kannan, Lipton, Sagiv, Podelski, Rybalchenko, Cook, Dershowitz, Tiwari, Braverman, Ben-Amram, Genaim, ...

Approaches include:
- linear ranking functions
- size-change termination methods
- spectral techniques
- ...

Tools include: ...

  6–18. Reachability and Invariance in Markov Chains

M: Markov chain over states s_1, ..., s_k.

Is it the case, say, that starting in state s_1, ultimately I am in state s_k with probability at least 1/2? Does there exist T such that, for all n ≥ T,

    Prob('being in s_k after n steps') ≥ 1/2 ?

Step-by-step distribution, starting from (1, 0, 0, 0):

    (1, 0, 0, 0)
    · M = (0, 0.5, 0.2, 0.3)
    · M = (0.16, 0, 0.5, 0.34)
    · M = (0.318, 0.08, 0.032, 0.57)
    · M = (0.13, 0.159, 0.1436, 0.5674)
    · M = (0.18528, 0.065, 0.185, 0.56472)
    · M = (0.205444, 0.09264, 0.102056, 0.59986)
    · M = (0.171, 0.102722, 0.133729, 0.592549)
    · M = (0.185374, 0.0855, 0.136922, 0.592204)
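The iteration on these slides is just repeated row-vector-by-matrix multiplication. The sketch below reproduces one step of it; only the first row of M is pinned down by the slides' first step (1, 0, 0, 0) · M = (0, 0.5, 0.2, 0.3), so the remaining rows are invented for illustration and later iterates will not match the slides.

```python
# One step of the slides' iteration: the distribution v_{n+1} is the
# row-vector product v_n · M. Only the first row of M is determined by
# the slides' first step (1,0,0,0)·M = (0, 0.5, 0.2, 0.3); the other
# rows are invented here, so later iterates differ from the slides.

def step(v, M):
    """Advance a probability row vector by one step: v · M."""
    k = len(v)
    return [sum(v[i] * M[i][j] for i in range(k)) for j in range(k)]

M = [
    [0.0, 0.5, 0.2, 0.3],   # matches the slides' first step
    [0.1, 0.0, 0.6, 0.3],   # illustrative row
    [0.3, 0.0, 0.2, 0.5],   # illustrative row
    [0.2, 0.1, 0.1, 0.6],   # illustrative row
]

v = [1.0, 0.0, 0.0, 0.0]    # start deterministically in state s1
for n in range(5):
    v = step(v, M)
    print(n + 1, [round(p, 4) for p in v])
```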

  19. Reachability and Invariance in Markov Chains

Ultimate Invariance Problem
Instance: ⟨stochastic matrix M; r ∈ (0, 1]⟩
Question: Does ∃ T s.t. ∀ n ≥ T, (1, 0, ..., 0) · M^n · (0, ..., 0, 1)^T ≥ r ?
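One can probe the Ultimate Invariance Problem empirically by scanning v_n = (1, 0, ..., 0) · M^n up to a bound and recording the last n at which the final entry falls below r. A finite scan can only suggest a candidate threshold T; it cannot establish the property for all n ≥ T, which is the actual decision problem. The 2-state M in the sketch is an assumed example, not from the talk.

```python
# Empirical probe of the Ultimate Invariance Problem: track the last
# entry of v_n = (1,0,...,0)·M^n up to a bound and report the largest n
# at which it drops below r. A finite scan can only suggest a candidate
# threshold T; it cannot prove the property for all n ≥ T.

def last_violation(M, r, bound=500):
    """Largest n < bound with the last entry of v_n below r, or None."""
    k = len(M)
    v = [1.0] + [0.0] * (k - 1)
    worst = None
    for n in range(bound):
        if v[-1] < r:
            worst = n
        v = [sum(v[i] * M[i][j] for i in range(k)) for j in range(k)]
    return worst

M = [[0.2, 0.8], [0.4, 0.6]]   # assumed stochastic matrix
print(last_violation(M, 0.5))  # → 0: only the initial step violates r
```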

  20–22. Positivity of Linear Recurrence Sequences

u_0 = 1, u_1 = 1
u_{n+2} = u_{n+1} + u_n

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...
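The Fibonacci-style recurrence above can be computed with the usual two-variable iteration; a minimal sketch:

```python
# The order-2 recurrence u_{n+2} = u_{n+1} + u_n with u_0 = u_1 = 1,
# computed with the usual two-variable iteration.
def fib_lrs(n):
    u0, u1 = 1, 1
    for _ in range(n):
        u0, u1 = u1, u0 + u1
    return u0

print([fib_lrs(n) for n in range(10)])  # → [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```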

  23. Positivity of Linear Recurrence Sequences

u_0 = 1, u_1 = 1
u_{n+5} = u_{n+4} + u_{n+3} − (1/3) u_n

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...

  24. Positivity of Linear Recurrence Sequences

u_0 = 1, u_1 = 1, u_2 = 2, u_3 = 3, u_4 = 5
u_{n+5} = u_{n+4} + u_{n+3} − (1/3) u_n

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...

  25. Positivity of Linear Recurrence Sequences

u_0 = 1, u_1 = 1, u_2 = 2, u_3 = 3, u_4 = 5
u_{n+5} = u_{n+4} + u_{n+3} − (1/3) u_n − 10 w_{n+5}

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...

  26. Positivity of Linear Recurrence Sequences

u_0 = 1, u_1 = 1, u_2 = 2, u_3 = 3, u_4 = 5
u_{n+5} = u_{n+4} + u_{n+3} − (1/3) u_n − 10 w_{n+5}

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, ...

Positivity Problem
Instance: A linear recurrence sequence ⟨u_n⟩
Question: Is it the case that ∀ n, u_n ≥ 0 ?
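Brute force gives only a semi-decision procedure for Positivity: scanning a finite prefix of the sequence can exhibit a negative term, but no finite prefix can certify u_n ≥ 0 for all n. A sketch, using exact rationals to avoid rounding; the order-5 recurrence is the u_{n+5} = u_{n+4} + u_{n+3} − (1/3) u_n example from the slides:

```python
# Semi-decision procedure for Positivity: unfold the recurrence and look
# for a negative term in a finite prefix. Exact rationals (Fraction)
# avoid any floating-point rounding with the 1/3 coefficient.
from fractions import Fraction

def first_negative(coeffs, init, bound=1000):
    """For u_{n+k} = coeffs[0]*u_{n+k-1} + ... + coeffs[k-1]*u_n,
    return the index of the first negative term among u_0..u_{bound-1},
    or None if the whole prefix is nonnegative."""
    u = [Fraction(x) for x in init]
    for n in range(bound):
        if n >= len(u):
            u.append(sum(c * u[n - 1 - i] for i, c in enumerate(coeffs)))
        if u[n] < 0:
            return n
    return None

# The slides' order-5 example: u_{n+5} = u_{n+4} + u_{n+3} - (1/3) u_n
coeffs = [1, 1, 0, 0, Fraction(-1, 3)]
init = [1, 1, 2, 3, 5]
print(first_negative(coeffs, init))
```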

  27. Sample Decision Problems

Termination Problem for Simple Linear Programs
Instance: ⟨a; u; M; b⟩ over Z, for the program
    x := a;
    while u · x ≠ 0 do
        x := M · x + b;
Question: Does this program terminate?

Ultimate Invariance Problem for Markov Chains
Instance: A stochastic matrix M over Q
Question: Does ∃ T s.t. ∀ n ≥ T, (1, 0, ..., 0) · M^n · (0, ..., 0, 1)^T ≥ 1/2 ?

Positivity Problem for Linear Recurrence Sequences
Instance: A linear recurrence sequence ⟨u_n⟩ over Z or Q
Question: Is it the case that ∀ n, u_n ≥ 0 ?

  28. Linear Recurrence Sequences

Definition. A linear recurrence sequence is a sequence ⟨u_0, u_1, u_2, ...⟩ of real numbers such that there exist k and constants a_1, ..., a_k such that for all n ≥ 0,

    u_{n+k} = a_1 u_{n+k−1} + a_2 u_{n+k−2} + ... + a_k u_n.

k is the order of the sequence.
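The definition translates directly into code: an order-k sequence is completely determined by its k coefficients a_1, ..., a_k and its k initial values. A minimal evaluator (sketch):

```python
# Direct evaluation of the definition: an order-k linear recurrence
# sequence is determined by its coefficients a_1..a_k and k initial
# values.
def lrs_term(coeffs, init, n):
    """u_{m+k} = coeffs[0]*u_{m+k-1} + ... + coeffs[k-1]*u_m."""
    u = list(init)
    for m in range(len(init), n + 1):
        u.append(sum(c * u[m - 1 - i] for i, c in enumerate(coeffs)))
    return u[n]

# Fibonacci as the order-2 instance with a_1 = a_2 = 1:
print(lrs_term([1, 1], [1, 1], 9))  # → 55
```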
