
Continuous-time Markov Chains
Gonzalo Mateos, Dept. of ECE and Goergen Institute for Data Science, University of Rochester
gmateosb@ece.rochester.edu, http://www.ece.rochester.edu/~gmateosb/
October 14, 2020
Introduction to Random Processes



  2. Exponential random variables
◮ Exponential random variables
◮ Counting processes and definition of Poisson processes
◮ Properties of Poisson processes

  3. Exponential distribution
◮ Exponential RVs often model times at which events occur
  ⇒ Or the time elapsed between occurrences of random events
◮ RV T ∼ exp(λ) is exponential with parameter λ if its pdf is
    f_T(t) = λe^{−λt}, for all t ≥ 0
◮ Cdf, the integral of the pdf, is F_T(t) = P(T ≤ t) = 1 − e^{−λt}
  ⇒ Complementary cdf (ccdf) is P(T > t) = 1 − F_T(t) = e^{−λt}
[Figure: pdf and cdf of T ∼ exp(λ) with λ = 1]
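Not part of the slides, but the ccdf formula above is easy to sanity-check numerically. A minimal sketch (assuming NumPy is available; λ and t are illustrative values) compares the empirical ccdf of simulated exponential samples against e^{−λt}:

```python
import numpy as np

# Simulate T ~ exp(lambda); NumPy parameterizes by scale = 1/lambda
rng = np.random.default_rng(0)
lam = 1.0
samples = rng.exponential(scale=1.0 / lam, size=100_000)

t = 2.0
empirical_ccdf = np.mean(samples > t)   # fraction of samples with T > t
theoretical_ccdf = np.exp(-lam * t)     # ccdf: P(T > t) = e^{-lambda t}
print(empirical_ccdf, theoretical_ccdf)
```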

  4. Expected value
◮ Expected value of time T ∼ exp(λ) is
    E[T] = ∫₀^∞ t λe^{−λt} dt = [−t e^{−λt}]₀^∞ + ∫₀^∞ e^{−λt} dt = 0 + 1/λ
  ⇒ Integrated by parts with u = t, dv = λe^{−λt} dt
◮ Mean time is the inverse of the parameter λ
  ⇒ λ is the rate/frequency of events happening at intervals T
  ⇒ Interpret: average of λt events by time t
◮ Bigger λ ⇒ smaller expected times, larger frequency of events
[Figure: timeline with event times S₁, …, S₁₀ and interarrival times T₁, …, T₁₀, from t = 0 to t = 10/λ]
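As a quick numerical illustration (my addition, not from the slides; λ = 2 is arbitrary), the sample mean of exponential draws should approach E[T] = 1/λ:

```python
import numpy as np

# Sample mean of exponential draws should approach E[T] = 1/lambda
rng = np.random.default_rng(1)
lam = 2.0
samples = rng.exponential(scale=1.0 / lam, size=100_000)

sample_mean = samples.mean()
print(sample_mean, 1.0 / lam)
```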

  5. Second moment and variance
◮ For the second moment also integrate by parts (u = t², dv = λe^{−λt} dt)
    E[T²] = ∫₀^∞ t² λe^{−λt} dt = [−t² e^{−λt}]₀^∞ + ∫₀^∞ 2t e^{−λt} dt
◮ First term is 0, second is (2/λ)E[T]
    E[T²] = (2/λ) ∫₀^∞ t λe^{−λt} dt = 2/λ²
◮ The variance is computed from the mean and second moment
    var[T] = E[T²] − E²[T] = 2/λ² − 1/λ² = 1/λ²
  ⇒ Parameter λ controls both mean and variance of an exponential RV
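Both moment formulas can be checked by simulation. The sketch below is illustrative only (λ = 2, so E[T²] = 2/λ² = 0.5 and var[T] = 1/λ² = 0.25):

```python
import numpy as np

# Check E[T^2] = 2/lambda^2 and var[T] = 1/lambda^2 by simulation
rng = np.random.default_rng(2)
lam = 2.0
samples = rng.exponential(scale=1.0 / lam, size=200_000)

second_moment = np.mean(samples**2)   # should be close to 2/lam^2 = 0.5
variance = samples.var()              # should be close to 1/lam^2 = 0.25
print(second_moment, variance)
```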

  6. Memoryless random times
◮ Def: Consider a random time T. We say T is memoryless if
    P(T > s + t | T > t) = P(T > s)
◮ The probability of waiting s extra units of time (e.g., seconds), given that we have already waited t seconds, is just the probability of waiting s seconds
  ⇒ The system does not remember it has already waited t seconds
  ⇒ Same probability irrespective of the time already elapsed
Ex: Chemical reaction A + B → AB occurs when molecules A and B "collide". A, B move around randomly. Time T until the reaction occurs

  7. Exponential RVs are memoryless
◮ Write the memoryless property in terms of the joint distribution
    P(T > s + t | T > t) = P(T > s + t, T > t) / P(T > t) = P(T > s)
◮ Notice the event {T > s + t, T > t} is equivalent to {T > s + t}
  ⇒ Replace P(T > s + t, T > t) = P(T > s + t) and reorder
    P(T > s + t) = P(T > t) P(T > s)
◮ If T ∼ exp(λ), the ccdf is P(T > t) = e^{−λt} so that
    P(T > s + t) = e^{−λ(s+t)} = e^{−λt} e^{−λs} = P(T > t) P(T > s)
◮ If a random time T is exponential ⇒ T is memoryless
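A Monte Carlo check of memorylessness (my illustration, not from the slides; λ, t, s are arbitrary): among samples with T > t, the fraction exceeding t + s should match the unconditional ccdf P(T > s):

```python
import numpy as np

# Memorylessness: among samples with T > t, the fraction with T > t + s
# should match the unconditional ccdf P(T > s) = e^{-lambda s}
rng = np.random.default_rng(3)
lam, t, s = 1.0, 1.0, 0.5
samples = rng.exponential(scale=1.0 / lam, size=200_000)

survived = samples[samples > t]           # condition on T > t
conditional = np.mean(survived > t + s)   # estimates P(T > t + s | T > t)
unconditional = np.exp(-lam * s)          # P(T > s)
print(conditional, unconditional)
```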

  8. Continuous memoryless RVs are exponential
◮ Consider a function g(t) with the property g(t + s) = g(t) g(s)
◮ Q: Functional form of g(t)? Take logarithms
    log g(t + s) = log g(t) + log g(s)
  ⇒ Only holds for all t and s if log g(t) = ct for some constant c
  ⇒ Which in turn can only hold if g(t) = e^{ct} for some constant c
◮ Compare this observation with the statement of the memoryless property
    P(T > s + t) = P(T > t) P(T > s)
  ⇒ It must be that P(T > t) = e^{ct} for some constant c < 0
◮ T continuous: only true for the exponential, T ∼ exp(−c)
◮ T discrete: only the geometric, P(T > t) = (1 − p)^t with (1 − p) = e^c
◮ If a continuous random time T is memoryless ⇒ T is exponential
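The geometric and exponential cases can be tied together with a short deterministic computation (an illustration I am adding, not from the slides): split [0, t] into slots of width Δ with per-slot success probability p = λΔ; the geometric ccdf (1 − λΔ)^{t/Δ} then approaches e^{−λt} as Δ → 0:

```python
import math

# Geometric ccdf over slots of width delta approaches the exponential ccdf
lam, t = 1.0, 2.0
for delta in (0.1, 0.01, 0.001):
    p = lam * delta              # per-slot success probability
    n = int(t / delta)           # number of slots in [0, t]
    geom_ccdf = (1 - p) ** n     # P(no success in n trials) = (1 - p)^n
    print(delta, geom_ccdf)

print(math.exp(-lam * t))        # continuous limit e^{-lambda t}
```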

  9. Main memoryless property result
Theorem
A continuous random variable T is memoryless if and only if it is exponentially distributed. That is,
    P(T > s + t | T > t) = P(T > s) if and only if f_T(t) = λe^{−λt} for some λ > 0
◮ Exponential RVs are memoryless. They do not remember elapsed time
  ⇒ Only type of continuous memoryless RV
◮ A discrete RV T is memoryless if and only if it is geometric
  ⇒ Geometrics are discrete approximations of exponentials
  ⇒ Exponentials are continuous limits of geometrics
◮ Exponential = time until success ⇔ Geometric = number of trials until success

  10. Exponential times example
◮ The first customer's arrival to a store takes T ∼ exp(1/10) minutes
  ⇒ Suppose 5 minutes have passed without an arrival
◮ Q: How likely is it that the customer arrives within the next 3 mins.?
◮ Use the memoryless property of exponential T
    P(T ≤ 8 | T > 5) = 1 − P(T > 8 | T > 5) = 1 − P(T > 3) = 1 − e^{−3λ} = 1 − e^{−0.3}
◮ Q: How likely is it that the customer arrives after time T = 9?
    P(T > 9 | T > 5) = P(T > 4) = e^{−4λ} = e^{−0.4}
◮ Q: What is the expected additional time until the first arrival?
◮ The additional time is T − 5, and P(T − 5 > t | T > 5) = P(T > t)
    E[T − 5 | T > 5] = E[T] = 1/λ = 10
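The three answers above evaluate to concrete numbers; a few lines of stdlib Python (my addition for the reader's convenience) compute them:

```python
import math

# Store example: T ~ exp(1/10), 5 minutes have elapsed with no arrival
lam = 1 / 10
p_within_3 = 1 - math.exp(-3 * lam)   # P(T <= 8 | T > 5) = 1 - e^{-0.3}
p_after_9 = math.exp(-4 * lam)        # P(T > 9 | T > 5) = e^{-0.4}
extra_wait = 1 / lam                  # E[T - 5 | T > 5] = 1/lambda minutes
print(p_within_3, p_after_9, extra_wait)
```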

  11. Time to first event
◮ Independent exponential RVs T₁, T₂ with parameters λ₁, λ₂
◮ Q: Probability distribution of the time to the first event, i.e., T := min(T₁, T₂)?
  ⇒ To have T > t we need both T₁ > t and T₂ > t
◮ Using independence of T₁ and T₂ we can write
    P(T > t) = P(T₁ > t, T₂ > t) = P(T₁ > t) P(T₂ > t)
◮ Substituting the expressions of the exponential ccdfs
    P(T > t) = e^{−λ₁t} e^{−λ₂t} = e^{−(λ₁+λ₂)t}
◮ T := min(T₁, T₂) is exponentially distributed with parameter λ₁ + λ₂
◮ In general, for n independent RVs Tᵢ ∼ exp(λᵢ) define T := minᵢ Tᵢ
  ⇒ T is exponentially distributed with parameter Σᵢ₌₁ⁿ λᵢ
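A simulation sketch of this result (not from the slides; the rates λ₁ = 1, λ₂ = 2 are illustrative): the minimum should behave like an exp(λ₁ + λ₂) variable, both in mean and in ccdf:

```python
import numpy as np

# min(T1, T2) of independent exponentials should be exp(lambda1 + lambda2)
rng = np.random.default_rng(4)
lam1, lam2 = 1.0, 2.0
t1 = rng.exponential(scale=1.0 / lam1, size=100_000)
t2 = rng.exponential(scale=1.0 / lam2, size=100_000)
t_min = np.minimum(t1, t2)

mean_min = t_min.mean()              # should approach 1/(lam1 + lam2) = 1/3
ccdf_at_half = np.mean(t_min > 0.5)  # should approach e^{-(lam1+lam2)*0.5}
print(mean_min, ccdf_at_half)
```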

  12. First event to happen
◮ Q: Probability P(T₁ < T₂) of T₁ ∼ exp(λ₁) happening before T₂ ∼ exp(λ₂)?
◮ Condition on T₂ = t, integrate over the pdf of T₂, use independence
    P(T₁ < T₂) = ∫₀^∞ P(T₁ < t | T₂ = t) f_{T₂}(t) dt = ∫₀^∞ F_{T₁}(t) f_{T₂}(t) dt
◮ Substitute the expressions for the exponential pdf and cdf
    P(T₁ < T₂) = ∫₀^∞ (1 − e^{−λ₁t}) λ₂e^{−λ₂t} dt = λ₁ / (λ₁ + λ₂)
◮ Either T₁ comes before T₂ or vice versa, hence
    P(T₂ < T₁) = 1 − P(T₁ < T₂) = λ₂ / (λ₁ + λ₂)
  ⇒ Probabilities are relative values of the rates (parameters)
◮ Larger rate ⇒ smaller average ⇒ higher probability of happening first
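The race probability can likewise be verified by simulation (an added illustration; with λ₁ = 1 and λ₂ = 3 the theory predicts P(T₁ < T₂) = 1/4):

```python
import numpy as np

# P(T1 < T2) should approach lambda1/(lambda1 + lambda2) = 1/4 here
rng = np.random.default_rng(5)
lam1, lam2 = 1.0, 3.0
t1 = rng.exponential(scale=1.0 / lam1, size=200_000)
t2 = rng.exponential(scale=1.0 / lam2, size=200_000)

p_first = np.mean(t1 < t2)        # empirical P(T1 < T2)
p_theory = lam1 / (lam1 + lam2)   # = 0.25
print(p_first, p_theory)
```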

  13. Additional properties of exponential RVs
◮ Consider n independent RVs Tᵢ ∼ exp(λᵢ), with Tᵢ the time to the i-th event
◮ Probability of the j-th event happening first
    P(Tⱼ = minᵢ Tᵢ) = λⱼ / Σᵢ₌₁ⁿ λᵢ,  j = 1, …, n
◮ Time to the first event and the rank ordering of events are independent
    P(minᵢ Tᵢ ≥ t, T_{i₁} < … < T_{iₙ}) = P(minᵢ Tᵢ ≥ t) P(T_{i₁} < … < T_{iₙ})
◮ Suppose T ∼ exp(λ), independent of a non-negative RV X
◮ The strong memoryless property asserts
    P(T > X + t | T > X) = P(T > t)
  ⇒ T also forgets random but independent elapsed times

  14. Strong memoryless property example
◮ Independent customer arrival times Tᵢ ∼ exp(λᵢ), i = 1, 2, 3
  ⇒ Suppose customer 3 arrives first, i.e., min(T₁, T₂) > T₃
◮ Q: Probability that the time between the first and second arrivals exceeds t?
◮ We want to compute
    P(min(T₁, T₂) − T₃ > t | min(T₁, T₂) > T₃)
◮ Denote by T₁₂ := min(T₁, T₂) the time to the second arrival
  ⇒ Recall T₁₂ ∼ exp(λ₁ + λ₂), and T₁₂ is independent of T₃
◮ Apply the strong memoryless property with T = T₁₂ and X = T₃
    P(T₁₂ − T₃ > t | T₁₂ > T₃) = P(T₁₂ > t) = e^{−(λ₁+λ₂)t}
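This example can also be checked by simulation (my sketch, with arbitrary rates, not from the slides): conditioning on customer 3 arriving first, the residual gap min(T₁, T₂) − T₃ should have ccdf e^{−(λ₁+λ₂)t}:

```python
import numpy as np

# Condition on customer 3 arriving first; the residual min(T1,T2) - T3
# should then be exp(lambda1 + lambda2) by the strong memoryless property
rng = np.random.default_rng(6)
lam1, lam2, lam3 = 1.0, 1.0, 2.0
n = 200_000
t12 = np.minimum(rng.exponential(1.0 / lam1, n), rng.exponential(1.0 / lam2, n))
t3 = rng.exponential(1.0 / lam3, n)

first = t12 > t3                          # runs where customer 3 arrived first
gap = t12[first] - t3[first]              # residual gap on those runs
t = 0.3
empirical = np.mean(gap > t)              # P(gap > t | min(T1,T2) > T3)
theoretical = np.exp(-(lam1 + lam2) * t)  # e^{-(lam1+lam2) t}
print(empirical, theoretical)
```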
