

  1. Advanced Algorithms (VI) Shanghai Jiao Tong University Chihao Zhang April 13, 2020

  2. Martingale

  3. Martingale. Let $\{X_t\}_{t \ge 0}$ be a sequence of random variables and let $\{\mathcal{F}_t\}_{t \ge 0}$ be a filtration, i.e., a sequence of $\sigma$-algebras with $\mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$. A martingale is a sequence of pairs $\{(X_t, \mathcal{F}_t)\}_{t \ge 0}$ such that
     • for all $t \ge 0$, $X_t$ is $\mathcal{F}_t$-measurable;
     • for all $t \ge 0$, $\mathbb{E}[X_{t+1} \mid \mathcal{F}_t] = X_t$.
     (A numerical sanity check of the second condition follows below.)
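As a quick sanity check on the defining property $\mathbb{E}[X_{t+1} \mid \mathcal{F}_t] = X_t$, the sketch below (an illustrative addition, not part of the slides; all names are mine) simulates the canonical example, a gambler's fortune under fair ±1 bets, and estimates the conditional mean of $X_{t+1}$ given the value of $X_t$.

```python
import random

# Fortune after t fair +/-1 bets is the canonical martingale:
# E[X_{t+1} | X_0,...,X_t] = X_t + E[Z_{t+1}] = X_t, since E[Z] = 0.
def fortune_paths(num_paths=200_000, t=10, seed=0):
    rng = random.Random(seed)
    paths = []
    for _ in range(num_paths):
        x = 0
        traj = [0]                      # traj[i] = X_i
        for _ in range(t + 1):
            x += rng.choice((-1, 1))
            traj.append(x)
        paths.append(traj)
    return paths

if __name__ == "__main__":
    t = 10
    paths = fortune_paths(t=t)
    # Group X_{t+1} by the value of X_t and compare the conditional mean to X_t.
    # (For this Markov chain, conditioning on X_t is the same as conditioning on F_t.)
    by_xt = {}
    for traj in paths:
        by_xt.setdefault(traj[t], []).append(traj[t + 1])
    for x in sorted(by_xt):
        nxt = by_xt[x]
        print(f"X_t = {x:+d}:  E[X_(t+1) | X_t] ~ {sum(nxt)/len(nxt):+.3f}  ({len(nxt)} samples)")
```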

  4. Stopping Time

  5. Stopping Time. A stopping time is a random variable $\tau \in \mathbb{N} \cup \{\infty\}$ such that the event $[\tau \le t]$ is $\mathcal{F}_t$-measurable for all $t$: "whether to stop can be determined by looking at the outcomes seen so far."
     • The first time a gambler wins five games in a row is a stopping time.
     • The last time a gambler wins five games in a row is not one: whether a given streak is the last cannot be decided from the outcomes seen so far.
     (A small illustration in code follows below.)
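To make the contrast concrete, here is a small illustrative sketch (function names are mine, not from the slides): the first win-streak of length five can be detected online from the prefix of outcomes seen so far, while the last such streak cannot.

```python
import random

STREAK = 5  # "five games in a row"

def first_streak_time(outcomes):
    """tau = first time the gambler has won STREAK games in a row.
    This is a stopping time: [tau <= t] depends only on outcomes[:t]."""
    run = 0
    for t, won in enumerate(outcomes, start=1):
        run = run + 1 if won else 0
        if run == STREAK:
            return t
    return None  # never happened in this finite sample

def last_streak_time(outcomes):
    """The LAST such time. Not a stopping time: we cannot certify
    'this was the last one' without seeing the entire future."""
    run, last = 0, None
    for t, won in enumerate(outcomes, start=1):
        run = run + 1 if won else 0
        if run == STREAK:
            last = t
    return last

if __name__ == "__main__":
    rng = random.Random(1)
    games = [rng.random() < 0.5 for _ in range(10_000)]
    print("first streak ends at game", first_streak_time(games))
    print("last  streak ends at game", last_streak_time(games))
```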

  6. A basic property of a martingale $\{X_t, \mathcal{F}_t\}_{t \ge 0}$ is that $\mathbb{E}[X_t] = \mathbb{E}[X_0]$ for any $t \ge 0$.
     Proof. For all $t \ge 1$, $\mathbb{E}[X_t] = \mathbb{E}\big[\mathbb{E}[X_t \mid \mathcal{F}_{t-1}]\big] = \mathbb{E}[X_{t-1}]$ by the tower property and the martingale condition; now induct on $t$.
     Does $\mathbb{E}[X_\tau] = \mathbb{E}[X_0]$ hold for a (randomized) stopping time $\tau$? Not true in general: assume $\tau$ is the first time a gambler, making fair ±$1 bets, wins $100. Then $X_\tau = 100$ with probability 1 (a gambler with unlimited credit eventually reaches $100), while $\mathbb{E}[X_0] = 0$. (A simulation of this counterexample follows below.)
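The counterexample can be seen numerically. The sketch below is an illustrative addition: the target is scaled down to $10 and each run is capped so the demo always terminates, since $\tau$ itself has infinite expectation (which is precisely why the optional stopping theorem does not apply here). Whenever the walk does reach the target, $X_\tau$ equals the target exactly, so $\mathbb{E}[X_\tau]$ cannot equal $\mathbb{E}[X_0] = 0$.

```python
import random

TARGET = 10        # scaled-down stand-in for the $100 in the slide
STEP_CAP = 10**6   # cap so the demo terminates; tau itself has infinite mean

def play_until_target(rng):
    """Fair +/-$1 bets until the fortune first reaches TARGET (or the cap)."""
    x = 0
    for step in range(1, STEP_CAP + 1):
        x += rng.choice((-1, 1))
        if x == TARGET:
            return x, step          # stopped at tau; fortune is exactly TARGET
    return x, None                  # cap hit before reaching the target

if __name__ == "__main__":
    rng = random.Random(7)
    reached, capped = 0, 0
    for _ in range(200):
        x, tau = play_until_target(rng)
        if tau is None:
            capped += 1
        else:
            reached += 1            # on these runs X_tau == TARGET, not 0
    print("runs that reached the target:", reached, f"(each with X_tau = {TARGET})")
    print("runs cut off by the cap:     ", capped)
    # E[X_0] = 0, yet X_tau = TARGET on every completed run: OST does not apply.
```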

  7. Optional Stopping Theorem

  8. Optional Stopping Theorem. For a stopping time $\tau$, $\mathbb{E}[X_\tau] = \mathbb{E}[X_0]$ holds if
     • $\Pr[\tau < \infty] = 1$,
     • $\mathbb{E}\big[\,|X_\tau|\,\big] < \infty$, and
     • $\lim_{t \to \infty} \mathbb{E}\big[X_t \cdot \mathbf{1}[\tau > t]\big] = 0$.

  9. The following conditions are stronger, but easier to verify:
     1. There is a fixed $n$ such that $\tau \le n$ a.s.
     2. $\Pr[\tau < \infty] = 1$ and there is a fixed $M$ such that $|X_t| \le M$ for all $t \le \tau$.
     3. $\mathbb{E}[\tau] < \infty$ and there is a fixed $c$ such that $|X_{t+1} - X_t| \le c$ for all $t < \tau$.
     The OST applies when at least one of the above holds.

  10. Proof of the Optional Stopping Theorem

  11. Applications of OST

  12. Random Walk in 1-D

  13. Random Walk in 1-D. Let $Z_t \in \{-1, +1\}$ be chosen u.a.r. and let $X_t = \sum_{i=1}^{t} Z_i$. The random walk stops when it hits $-a < 0$ or $b > 0$. Let $\tau$ be the time it stops; $\tau$ is a stopping time. What is $\mathbb{E}[\tau]$?

  14. The random walk stops when one of the two ends is reached. We first determine $p_a$, the probability that the walk ends at $-a$, using the OST (condition 2 applies: $\Pr[\tau < \infty] = 1$ and $|X_t| \le \max(a, b)$ for all $t \le \tau$):
     $\mathbb{E}[X_\tau] = p_a \cdot (-a) + (1 - p_a) \cdot b = \mathbb{E}[X_0] = 0 \;\Longrightarrow\; p_a = \dfrac{b}{a + b}$.
     (A Monte-Carlo check of this value appears below.)
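A quick Monte-Carlo check of $p_a = b/(a+b)$, added for illustration (the parameter values below are arbitrary):

```python
import random

def walk_once(a, b, rng):
    """Run the +/-1 walk from 0 until it hits -a or +b; return True if it ends at -a."""
    x = 0
    while -a < x < b:
        x += rng.choice((-1, 1))
    return x == -a

if __name__ == "__main__":
    a, b, trials = 3, 7, 200_000
    rng = random.Random(0)
    hits_minus_a = sum(walk_once(a, b, rng) for _ in range(trials))
    print(f"empirical  p_a ~ {hits_minus_a / trials:.4f}")
    print(f"predicted  p_a = b/(a+b) = {b / (a + b):.4f}")
```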

  15. Now define a random variable $Y_t = X_t^2 - t$.
     Claim. $\{Y_t\}_{t \ge 0}$ is a martingale.
     $\mathbb{E}[Y_{t+1} \mid \mathcal{F}_t] = \mathbb{E}\big[(X_t + Z_{t+1})^2 - (t+1) \mid \mathcal{F}_t\big] = \mathbb{E}\big[X_t^2 + 2 Z_{t+1} X_t + Z_{t+1}^2 - (t+1) \mid \mathcal{F}_t\big] = X_t^2 + 1 - (t+1) = X_t^2 - t = Y_t$,
     using $\mathbb{E}[Z_{t+1} \mid \mathcal{F}_t] = 0$ and $Z_{t+1}^2 = 1$.

  16. The pair $(\{Y_t\}, \tau)$ satisfies the conditions for the OST (condition 3: $\mathbb{E}[\tau] < \infty$, and $|Y_{t+1} - Y_t| = |2 Z_{t+1} X_t| \le 2\max(a, b)$ for $t < \tau$), so
     $\mathbb{E}[Y_\tau] = \mathbb{E}[X_\tau^2] - \mathbb{E}[\tau] = \mathbb{E}[Y_0] = 0$.
     On the other hand, we have $\mathbb{E}[X_\tau^2] = p_a \cdot a^2 + (1 - p_a) \cdot b^2 = ab$.
     This implies $\mathbb{E}[\tau] = ab$. (An exact numerical check via first-step analysis follows below.)
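As an independent check, added for illustration, one can compute $\mathbb{E}[\tau]$ exactly by first-step analysis: $f(x) = 1 + \tfrac{1}{2} f(x-1) + \tfrac{1}{2} f(x+1)$ for $-a < x < b$, with boundary values $f(-a) = f(b) = 0$. The sketch below solves this tridiagonal linear system and compares $f(0)$ with $ab$.

```python
def expected_absorption_time(a, b):
    """Solve f(x) = 1 + (f(x-1)+f(x+1))/2 for -a < x < b, f(-a)=f(b)=0,
    with the Thomas algorithm for the tridiagonal system; return f(0)."""
    n = a + b - 1                      # unknowns f(-a+1), ..., f(b-1)
    diag = [1.0] * n
    off = -0.5                         # sub- and super-diagonal entries
    rhs = [1.0] * n
    # forward elimination
    for i in range(1, n):
        m = off / diag[i - 1]
        diag[i] -= m * off
        rhs[i] -= m * rhs[i - 1]
    # back substitution
    f = [0.0] * n
    f[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        f[i] = (rhs[i] - off * f[i + 1]) / diag[i]
    return f[a - 1]                    # index of x = 0 is (0 - (-a+1)) = a - 1

if __name__ == "__main__":
    for a, b in [(3, 7), (5, 5), (2, 11)]:
        print(f"a={a}, b={b}:  f(0) = {expected_absorption_time(a, b):.6f},  ab = {a * b}")
```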

  17. Wald's Equation

  18. Wald's Equation. Recall that in Week 2 we considered $\mathbb{E}\big[\sum_{i=1}^{N} X_i\big]$, where the $\{X_i\}$ are independent with mean $\mu$ and $N$ is a random variable. We are now ready to prove the general case!

  19. Assume $\mathbb{E}[N]$ is finite and let $Y_t = \sum_{i=1}^{t} (X_i - \mu)$.
     $\{Y_t\}$ is a martingale, and the stopping time $N$ satisfies the conditions for the OST, so
     $0 = \mathbb{E}[Y_0] = \mathbb{E}[Y_N] = \mathbb{E}\Big[\sum_{i=1}^{N} (X_i - \mu)\Big] = \mathbb{E}\Big[\sum_{i=1}^{N} X_i\Big] - \mathbb{E}[N] \cdot \mu$,
     i.e., $\mathbb{E}\big[\sum_{i=1}^{N} X_i\big] = \mathbb{E}[N] \cdot \mu$ (Wald's equation). (A simulation check follows below.)
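A simulation check of Wald's equation, added for illustration: roll a fair die until the first 6 appears, so that $N$ is a stopping time with $\mathbb{E}[N] = 6$, and compare $\mathbb{E}\big[\sum_{i=1}^{N} X_i\big]$ with $\mathbb{E}[N] \cdot \mu = 6 \cdot 3.5 = 21$.

```python
import random

def roll_until_six(rng):
    """Roll a fair die until the first 6; N is a stopping time w.r.t. the rolls."""
    total, n = 0, 0
    while True:
        x = rng.randint(1, 6)
        total += x
        n += 1
        if x == 6:
            return total, n

if __name__ == "__main__":
    rng = random.Random(42)
    trials = 200_000
    sums, lengths = 0, 0
    for _ in range(trials):
        s, n = roll_until_six(rng)
        sums += s
        lengths += n
    print(f"E[sum X_i] ~ {sums / trials:.3f}    (Wald predicts E[N]*mu = 6 * 3.5 = 21)")
    print(f"E[N]       ~ {lengths / trials:.3f}  (exact value 6)")
```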

  20. Waiting Time for Patterns. Fix a pattern $P = $ "00110". How many fair coin tosses does one need, in expectation, to see $P$ for the first time? The number can be calculated using the OST, via an argument due to Shuo-Yen Robert Li (李碩彥).

  21. Let the pattern be $P = p_1 p_2 \dots p_k$ and draw a random string $B = b_1 b_2 b_3 \dots$. Imagine that for each $j \ge 1$ there is a gambler $G_j$. At time $j$, $G_j$ bets $1 on "$b_j = p_1$"; if he wins, he bets $2 on "$b_{j+1} = p_2$", and so on. He keeps doubling his money until he loses. (A sketch of one gambler's fortune, as code, follows below.)
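An illustrative sketch of one gambler $G_j$'s wealth (the framing and conventions below are mine, not from the slides): he enters with $1, lets his winnings ride on each successive pattern symbol, and is wiped out by his first loss. Since each bet is fair with even odds, his wealth at each time is a martingale with respect to the revealed bits.

```python
import random

def gambler_wealth(pattern, bits, j):
    """Wealth of gambler G_{j+1} after each bit is revealed (0-based entry time j).
    He enters with $1, bets everything that the next bit matches the next pattern
    symbol, and stops after matching the whole pattern or after his first loss.
    Each bet is fair (probability 1/2, even odds), so the wealth is a martingale."""
    wealth, matched, done = 1, 0, False
    history = []
    for t, b in enumerate(bits):
        if t >= j and not done:
            if b == pattern[matched]:
                wealth *= 2                 # fair even-odds win: the stake doubles
                matched += 1
                done = matched == len(pattern)
            else:
                wealth = 0                  # one loss wipes out the whole stake
                done = True
        history.append(wealth)
    return history

if __name__ == "__main__":
    rng = random.Random(3)
    P = "00110"
    B = "".join(rng.choice("01") for _ in range(12))
    print("B =", B)
    for j in range(len(B)):
        print(f"G_{j+1}: wealth trajectory {gambler_wealth(P, B, j)}")
```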

  22. The money of $G_j$ is a martingale (w.r.t. $B$). Let $X_t$ be the total money of all gamblers at time $t$; then $\{X_t\}_{t \ge 1}$ is also a martingale. Let $\tau$ be the first time that we meet $P$ in $B$. (A simulation estimate of $\mathbb{E}[\tau]$ appears below.)
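As a numerical companion, added for illustration and not part of the slides' argument, the sketch below estimates $\mathbb{E}[\tau]$ for $P = $ "00110" by direct simulation and compares it with the known closed form for fair coins, the sum of $2^m$ over all $m$ for which the length-$m$ prefix of $P$ equals its length-$m$ suffix (for "00110" this is $2^5 + 2^1 = 34$).

```python
import random

def waiting_time(pattern, rng):
    """Toss fair coins until `pattern` first appears; return the number of tosses."""
    window, t = "", 0
    while True:
        window = (window + rng.choice("01"))[-len(pattern):]
        t += 1
        if window == pattern:
            return t

def overlap_formula(pattern):
    """Known closed form: sum of 2^m over all m such that the length-m prefix
    of the pattern equals its length-m suffix (for '00110': 2^5 + 2^1 = 34)."""
    return sum(2 ** m for m in range(1, len(pattern) + 1)
               if pattern[:m] == pattern[-m:])

if __name__ == "__main__":
    P = "00110"
    rng = random.Random(2024)
    trials = 100_000
    est = sum(waiting_time(P, rng) for _ in range(trials)) / trials
    print(f"simulated  E[tau] ~ {est:.2f}")
    print(f"closed form       = {overlap_formula(P)}")
```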
