  1. Markov Chains (1)

  2. Outline
     • Stochastic Process
     • Discrete Time Markov Chain (DTMC)

  3. Stochastic Process
     • A stochastic process $\{X(t) \mid t \in T\}$ is a family of random variables, defined on a given probability space, indexed by the parameter $t$, where $t$ varies over an index set $T$.
     • Thus the above family of random variables is a family of functions $\{X(t, s) \mid s \in S,\; t \in T\}$.
     • For a fixed $t = t_1$, $X_{t_1}(s) = X(t_1, s)$ is a random variable (denoted by $X(t_1)$) as $s$ varies over the sample space $S$.
     • For a fixed sample point $s_1 \in S$, the expression $X_{s_1}(t) = X(t, s_1)$ is a single function of time $t$, called a sample function or a realization of the process.
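
To make the two views concrete, here is a minimal Python/NumPy sketch (an added illustration, not part of the original slides; the walk and its parameters are assumptions): it simulates several realizations of a symmetric random walk, so each row of the array is one sample function X(t, s1) for a fixed sample point, while each column collects the values of the random variable X(t1) across sample points.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_paths, n_steps = 5, 10            # number of sample points s and of time indices t
steps = rng.choice([-1, 1], size=(n_paths, n_steps))   # i.i.d. +/-1 increments
paths = np.cumsum(steps, axis=1)    # each row is one realization of the walk

# Fixed sample point s1 (one row): a sample function / realization of the process.
print("sample function for s1:", paths[0])

# Fixed time t1 (one column): the random variable X(t1), one value per sample point.
t1 = 4
print("values of X(t1) across sample points:", paths[:, t1])
```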

  4. Stochastic Process…
     • If the state space of a stochastic process is discrete, then it is called a discrete-state process, often referred to as a chain.
     • Alternatively, if the state space is continuous, then we have a continuous-state process.
     • Similarly, if the index set $T$ is discrete, then we have a discrete-time process; otherwise we have a continuous-time process.
     • A discrete-time process is also called a stochastic sequence and is denoted by $\{X_n \mid n \in T\}$.

  5. Stochastic Process…
     • For a fixed time $t_1$, the term $X(t_1)$ is a simple random variable that describes the state of the process at time $t_1$.
     • For a fixed number $x_1$, the probability of the event $X(t_1) \le x_1$ gives the CDF of the random variable $X(t_1)$, denoted by
       $F(x_1; t_1) = F_{X(t_1)}(x_1) = P[X(t_1) \le x_1]$
     • $F(x_1; t_1)$ is known as the first-order distribution of the process $\{X(t) \mid t \ge 0\}$.
     • Given two time instants $t_1$ and $t_2$, $X(t_1)$ and $X(t_2)$ are two random variables on the same probability space. Their joint distribution is known as the second-order distribution of the process and is given by
       $F(x_1, x_2; t_1, t_2) = P[X(t_1) \le x_1,\; X(t_2) \le x_2]$
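
Continuing the random-walk sketch above (still an added illustration with assumed parameters, not from the slides), the first-order distribution $F(x_1; t_1) = P[X(t_1) \le x_1]$ can be estimated empirically by simulating many realizations and taking the fraction that satisfy $X(t_1) \le x_1$:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

n_paths, n_steps = 100_000, 20
paths = np.cumsum(rng.choice([-1, 1], size=(n_paths, n_steps)), axis=1)

t1, x1 = 9, 2
# Empirical estimate of the first-order CDF F(x1; t1) = P[X(t1) <= x1].
F_hat = np.mean(paths[:, t1] <= x1)
print(f"estimated F({x1}; t={t1}) = {F_hat:.4f}")
```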

  6. Stochastic Process…
     • In general, we define the $n$th-order joint distribution of the stochastic process $\{X(t) \mid t \in T\}$ by
       $F(\mathbf{x}; \mathbf{t}) = P[X(t_1) \le x_1, \ldots, X(t_n) \le x_n]$
     • A stochastic process $\{X(t) \mid t \in T\}$ is said to be stationary in the strict sense if, for all $n \ge 1$, its $n$th-order joint CDF satisfies the condition
       $F(\mathbf{x}; \mathbf{t}) = F(\mathbf{x}; \mathbf{t} + \tau)$
       for all vectors $\mathbf{x} \in \mathbb{R}^n$ and $\mathbf{t} \in T^n$, and all scalars $\tau$ such that $t_i + \tau \in T$.
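
A small numerical illustration (an added sketch; the choice of processes and parameters is assumed): an i.i.d. sequence behaves consistently with strict-sense stationarity, its empirical CDF being the same at different time indices, whereas its cumulative sums do not. Note that this only compares first-order CDFs, which is necessary but not sufficient for strict-sense stationarity.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

n_paths, n_steps = 100_000, 20
iid = rng.standard_normal(size=(n_paths, n_steps))   # i.i.d. sequence (strictly stationary)
walk = np.cumsum(iid, axis=1)                        # cumulative sums (not stationary)

x = 1.0
for t in (4, 14):
    print(f"t={t}: i.i.d. P[X(t) <= {x}] ~ {np.mean(iid[:, t] <= x):.3f}, "
          f"walk P[X(t) <= {x}] ~ {np.mean(walk[:, t] <= x):.3f}")
```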

  7. Stochastic Process…
     • A stochastic process $\{X(t) \mid t \in T\}$ is said to be an independent process provided its $n$th-order joint distribution satisfies the condition
       $F(\mathbf{x}; \mathbf{t}) = \prod_{i=1}^{n} F(x_i; t_i) = \prod_{i=1}^{n} P[X(t_i) \le x_i]$
     • A renewal process is defined as a discrete-time independent process $\{X_n \mid n = 1, 2, \ldots\}$ where $X_1, X_2, \ldots$ are independent, identically distributed (i.i.d.), nonnegative random variables.
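
For instance (an added sketch; the exponential distribution and its rate are chosen only for illustration), a renewal process can be generated by drawing i.i.d. nonnegative inter-event times $X_1, X_2, \ldots$ and accumulating them into renewal epochs:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

n_renewals = 10
rate = 2.0                                   # assumed rate of the exponential inter-event times
inter_times = rng.exponential(scale=1.0 / rate, size=n_renewals)   # i.i.d., nonnegative X_1, X_2, ...
renewal_epochs = np.cumsum(inter_times)      # S_n = X_1 + ... + X_n

print("inter-event times:", np.round(inter_times, 3))
print("renewal epochs:   ", np.round(renewal_epochs, 3))
```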

  8. Stochastic Process…
     • Though the assumption of an independent process considerably simplifies analysis, such an assumption is often unwarranted, and we are forced to consider some sort of dependence among these random variables.
     • The simplest and most important type of dependence is the first-order dependence, or Markov dependence.
     • A stochastic process $\{X(t) \mid t \in T\}$ is called a Markov process if, for any $t_0 < t_1 < t_2 < \cdots < t_n < t$, the conditional distribution of $X(t)$ for given values of $X(t_0), X(t_1), \ldots, X(t_n)$ depends only on $X(t_n)$:
       $P[X(t) \le x \mid X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, \ldots, X(t_0) = x_0] = P[X(t) \le x \mid X(t_n) = x_n]$
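
The Markov dependence can be checked numerically on a simple discrete-time, discrete-state example (an added sketch; the two-state transition matrix below is hypothetical, and the transition matrix concept itself is only formalized on a later slide): conditioning on the earlier state X0 in addition to the current state X1 should not change the estimated conditional probabilities.

```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Hypothetical two-state transition matrix, used only for this illustration.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def step(state):
    """Sample the next state given only the current state."""
    return rng.choice(2, p=P[state])

# Simulate many short trajectories X0 -> X1 -> X2.
n = 100_000
x0 = rng.choice(2, size=n)
x1 = np.array([step(s) for s in x0])
x2 = np.array([step(s) for s in x1])

# Markov dependence: additionally conditioning on X0 should not change the estimate.
j, k = 1, 0
for i in (0, 1):
    sel = (x1 == j) & (x0 == i)
    print(f"P(X2={k} | X1={j}, X0={i}) ~ {np.mean(x2[sel] == k):.3f}")
print(f"P(X2={k} | X1={j})        ~ {np.mean(x2[x1 == j] == k):.3f}")
```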

  9. Stochastic Process…
     • In many problems of interest, the conditional distribution function in the definition of a Markov process is invariant with respect to the time origin $t_n$:
       $P[X(t) \le x \mid X(t_n) = x_n] = P[X(t - t_n) \le x \mid X(0) = x_n]$
     • In this case, the Markov chain is said to be (time-)homogeneous.
     • For a homogeneous Markov chain, the past history of the process is completely summarized in the current state; therefore, the distribution of the time $Y$ the process spends in a given state must be memoryless.
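
In discrete time, this memoryless property means the sojourn time Y in a state is geometrically distributed. A quick empirical check (an added sketch; the self-transition probability is an assumed value):

```python
import numpy as np

rng = np.random.default_rng(seed=5)

p_stay = 0.8   # assumed self-transition probability of the state being observed

def sojourn():
    """Number of time steps the chain spends in the state before leaving it."""
    y = 1
    while rng.random() < p_stay:
        y += 1
    return y

samples = np.array([sojourn() for _ in range(100_000)])

# Memoryless check: P(Y > m + n | Y > m) should match P(Y > n).
m, n = 3, 2
lhs = np.mean(samples[samples > m] > m + n)
rhs = np.mean(samples > n)
print(f"P(Y > {m + n} | Y > {m}) ~ {lhs:.3f}   P(Y > {n}) ~ {rhs:.3f}")
```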

  10. Discrete Time Markov Chain (DTMC)
     • We choose to observe the state of a system at a discrete set of time points.
     • The successive observations define the random variables $X_0, X_1, X_2, \ldots, X_n$ at time steps $0, 1, 2, \ldots, n$, respectively. If $X_n = j$, then the state of the system at time step $n$ is $j$. $X_0$ is the initial state of the system. The Markov property can then be succinctly stated as
       $P(X_n = i_n \mid X_0 = i_0, X_1 = i_1, \ldots, X_{n-1} = i_{n-1}) = P(X_n = i_n \mid X_{n-1} = i_{n-1})$
     • The above equation implies that, given the present state of the system, the future is independent of its past.
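
A minimal simulator for such a chain (an added Python/NumPy sketch; the 3-state transition matrix is hypothetical, and the transition probability matrix itself is only defined on a later slide) samples each next state using only the current one:

```python
import numpy as np

rng = np.random.default_rng(seed=6)

# Hypothetical 3-state transition matrix (formalized on a later slide).
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])

def simulate_dtmc(P, x0, n_steps, rng):
    """Generate X_0, X_1, ..., X_n; each step depends only on the current state."""
    states = [x0]
    for _ in range(n_steps):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

print(simulate_dtmc(P, x0=0, n_steps=15, rng=rng))
```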

  11. Discrete Time Markov Chain (DTMC)…
     • $p_j(n)$ denotes the pmf of the random variable $X_n$:
       $p_j(n) = P(X_n = j)$
     • We will only be concerned with homogeneous Markov chains. For such chains, we use the following notation to denote the $n$-step transition probabilities:
       $p_{jk}(n) = P(X_{m+n} = k \mid X_m = j)$
     • The one-step transition probabilities $p_{jk}(1)$ are simply written as $p_{jk}$; thus
       $p_{jk} = p_{jk}(1) = P(X_n = k \mid X_{n-1} = j)$
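
Assuming the standard Chapman–Kolmogorov result (not stated on this slide), the n-step transition probabilities of a homogeneous chain are the entries of the n-th power of the one-step transition matrix. A short sketch using the hypothetical matrix from above:

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix.
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])

n = 4
P_n = np.linalg.matrix_power(P, n)   # entry (j, k) is the n-step probability p_jk(n)

j, k = 0, 2
print(f"p_{{{j}{k}}}({n}) = {P_n[j, k]:.4f}")
```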

  12. Discrete Time Markov Chain (DTMC)…
     • The pmf of the random variable $X_0$, often called the initial probability vector, is specified as
       $\mathbf{p}(0) = [p_0(0), p_1(0), \ldots]$
     • The one-step transition probabilities are compactly specified in the form of a transition probability matrix
       $P = [p_{ij}] = \begin{bmatrix} p_{00} & p_{01} & p_{02} & \cdots \\ p_{10} & p_{11} & p_{12} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}$
     • The entries of the matrix $P$ satisfy the following two properties:
       $0 \le p_{ij} \le 1$ for all $i, j \in I$, and $\sum_{j \in I} p_{ij} = 1$ for all $i \in I$.
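
The two properties can be verified programmatically, and, assuming the standard result that the state pmf evolves as p(n) = p(0) P^n (not stated on this slide), the pmf after n steps follows from the initial probability vector. A sketch with assumed values:

```python
import numpy as np

# Hypothetical transition probability matrix.
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])

# The two defining properties of a transition probability matrix.
assert np.all((P >= 0) & (P <= 1)), "entries must lie in [0, 1]"
assert np.allclose(P.sum(axis=1), 1.0), "every row must sum to 1"

p0 = np.array([1.0, 0.0, 0.0])        # assumed initial probability vector p(0)

# State pmf after n steps: p(n) = p(0) P^n.
n = 5
p_n = p0 @ np.linalg.matrix_power(P, n)
print("p(5) =", np.round(p_n, 4))
```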
