  1. Random Processes
     Saravanan Vijayakumaran (sarva@ee.iitb.ac.in)
     Department of Electrical Engineering, Indian Institute of Technology Bombay
     April 10, 2015

  2. Random Process
     Definition: An indexed collection of random variables { X_t : t ∈ T }.
     Discrete-time Random Process: A random process where the index set T = Z or { 0, 1, 2, 3, ... }.
     Example (Random walk): T = { 0, 1, 2, 3, ... }, X_0 = 0, X_n independent and equally likely to be ±1 for n ≥ 1, and
         S_n = ∑_{i=0}^{n} X_i
     Continuous-time Random Process: A random process where the index set T = R or [0, ∞). The notation X(t) is used to represent continuous-time random processes.
     Example: Thermal noise.
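The random walk above is easy to simulate: each realization is one sequence of ±1 steps and its partial sums. A minimal Python sketch (the function name `random_walk` and the fixed seed are ours, chosen only to make the realization reproducible):

```python
import random

def random_walk(n, seed=0):
    """One realization of S_n = sum_{i=0}^{n} X_i, with X_0 = 0 and
    X_i equally likely to be +1 or -1 for i >= 1."""
    rng = random.Random(seed)
    steps = [0] + [rng.choice([-1, 1]) for _ in range(n)]
    partial_sums = []
    s = 0
    for x in steps:
        s += x
        partial_sums.append(s)  # partial_sums[i] is S_i
    return partial_sums

walk = random_walk(10)  # S_0, S_1, ..., S_10 for one sample point omega
```

Calling `random_walk` with a different seed picks a different sample point ω and hence a different realization of the same process.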

  3. Realization of a Random Process
     • The outcome of an experiment is specified by a sample point ω in the sample space Ω
     • A realization of a random variable X is its value X(ω)
     • A realization of a random process X_t is the function X_t(ω) of t
     • A realization is also called a sample function of the random process
     Example: Consider Ω = [0, 1]. For each ω ∈ Ω, consider its dyadic expansion
         ω = ∑_{n=1}^{∞} d_n(ω)/2^n = 0.d_1(ω) d_2(ω) d_3(ω) · · ·
     where each d_n(ω) is either 0 or 1. An infinite sequence of coin tosses, with Heads being 0 and Tails being 1, can be associated with each ω as
         X_n(ω) = d_n(ω)
     For each ω ∈ Ω, we get a realization of this random process.
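The dyadic-expansion example can be computed directly: doubling ω shifts its binary expansion left by one digit, so the integer part after each doubling is the next digit. A small Python sketch (the name `dyadic_digits` is ours):

```python
def dyadic_digits(omega, n):
    """First n binary digits d_1(omega), ..., d_n(omega) of omega in [0, 1):
    omega = 0.d_1 d_2 d_3 ... in base 2. Each digit is one coin toss
    (0 = Heads, 1 = Tails) of the process X_n(omega) = d_n(omega)."""
    digits = []
    for _ in range(n):
        omega *= 2
        d = int(omega)   # integer part is the next binary digit
        digits.append(d)
        omega -= d       # keep only the fractional part
    return digits
```

For instance, ω = 0.625 = 0.101 in binary, so its realization starts 1, 0, 1.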

  4. Specification of a Random Process
     • A random process is specified by the joint cumulative distribution of the random variables X(t_1), X(t_2), ..., X(t_n) for any set of sample times { t_1, t_2, ..., t_n } and any n ∈ N
         F_{X(t_1),X(t_2),...,X(t_n)}(x_1, x_2, ..., x_n) = Pr[ X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_n) ≤ x_n ]
     • For continuous-time random processes, the joint probability density is sufficient
     • For discrete-time random processes, the joint probability mass function is sufficient
     • Without additional restrictions, this requires specifying a lot of joint distributions
     • One restriction which simplifies process specification is stationarity
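As a deliberately simple illustration of such a specification, consider a discrete-time process whose samples are i.i.d. standard normal (this process is our assumption, not from the slides). Independence makes every joint CDF factor into a product of identical marginals, so the whole family of joint distributions collapses to a single function:

```python
import math

def std_normal_cdf(x):
    """CDF of a standard normal random variable."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def joint_cdf_iid(xs):
    """Joint CDF Pr[X(t_1) <= x_1, ..., X(t_n) <= x_n] of an i.i.d.
    standard-normal discrete-time process: by independence it is the
    product of the marginal CDFs, whatever the sample times are."""
    p = 1.0
    for x in xs:
        p *= std_normal_cdf(x)
    return p
```

For a general process no such factorization holds, which is exactly why specifying all joint distributions is burdensome.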

  5. Stationary Random Process
     Definition: A random process X(t) is said to be stationary in the strict sense, or strictly stationary, if the joint distribution of X(t_1), X(t_2), ..., X(t_k) is the same as the joint distribution of X(t_1 + τ), X(t_2 + τ), ..., X(t_k + τ) for all time shifts τ, all k, and all observation instants t_1, ..., t_k:
         F_{X(t_1),...,X(t_k)}(x_1, ..., x_k) = F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k)
     Properties
     • A stationary random process is statistically indistinguishable from a delayed version of itself
     • For k = 1, we have F_{X(t)}(x) = F_{X(t+τ)}(x) for all t and τ. The first-order distribution is independent of time.
     • For k = 2 and τ = −t_1, we have F_{X(t_1),X(t_2)}(x_1, x_2) = F_{X(0),X(t_2−t_1)}(x_1, x_2) for all t_1 and t_2. The second-order distribution depends only on t_2 − t_1.

  6. Mean Function
     • The mean of a random process X(t) is the expectation of the random variable obtained by observing the process at time t
         µ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx
     • For a strictly stationary random process X(t), the mean is a constant: µ_X(t) = µ for all t
     Example: X(t) = cos(2πft + Θ), Θ ∼ U[−π, π]. µ_X(t) = ?
     Example: X_n = Z_1 + · · · + Z_n, n = 1, 2, ..., where the Z_i are i.i.d. with zero mean and variance σ². µ_X(n) = ?
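The first example can be checked numerically before working it out analytically: averaging cos(2πft + θ) over θ uniform on [−π, π] gives 0 for every t, since the phase is equally likely to land anywhere on the circle. A Monte Carlo sketch (function name and sample count are our choices):

```python
import math
import random

def mc_mean(t, f=1.0, trials=100_000, seed=1):
    """Monte Carlo estimate of mu_X(t) = E[cos(2*pi*f*t + Theta)] with
    Theta uniform on [-pi, pi]. The exact answer is 0 for every t."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        theta = rng.uniform(-math.pi, math.pi)
        total += math.cos(2 * math.pi * f * t + theta)
    return total / trials
```

Running `mc_mean(t)` at several values of t returns estimates near 0 in each case, consistent with a constant mean function.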

  7. Autocorrelation Function
     • The autocorrelation function of a random process X(t) is defined as
         R_X(t_1, t_2) = E[X(t_1) X(t_2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2
     • For a strictly stationary random process X(t), the autocorrelation function depends only on the time difference t_2 − t_1:
         R_X(t_1, t_2) = R_X(0, t_2 − t_1) for all t_1, t_2
       In this case, R_X(0, t_2 − t_1) is simply written as R_X(t_2 − t_1)
     Example: X(t) = cos(2πft + Θ), Θ ∼ U[−π, π]. R_X(t_1, t_2) = ?
     Example: X_n = Z_1 + · · · + Z_n, n = 1, 2, ..., where the Z_i are i.i.d. with zero mean and variance σ². R_X(n_1, n_2) = ?
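For the random-phase cosine, expanding the product of cosines gives R_X(t_1, t_2) = ½ cos(2πf(t_1 − t_2)), which depends only on the time difference. A Monte Carlo sketch to check this (our own function name and sample count):

```python
import math
import random

def mc_autocorr(t1, t2, f=1.0, trials=100_000, seed=2):
    """Monte Carlo estimate of R_X(t1, t2) = E[X(t1) X(t2)] for
    X(t) = cos(2*pi*f*t + Theta), Theta uniform on [-pi, pi].
    Analytically R_X(t1, t2) = 0.5 * cos(2*pi*f*(t1 - t2))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        theta = rng.uniform(-math.pi, math.pi)
        total += (math.cos(2 * math.pi * f * t1 + theta)
                  * math.cos(2 * math.pi * f * t2 + theta))
    return total / trials
```

For example, with f = 1, t_1 = 0.4, t_2 = 0.1, the exact value is ½ cos(0.6π) ≈ −0.1545, and the estimate lands close to it.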

  8. Wide-Sense Stationary Random Process
     Definition: A random process X(t) is said to be wide-sense stationary (also weakly stationary or second-order stationary) if
         µ_X(t) = µ_X(0) for all t, and
         R_X(t_1, t_2) = R_X(t_1 − t_2, 0) for all t_1, t_2
     Remarks
     • A strictly stationary random process is also wide-sense stationary if its first and second order moments exist
     • A wide-sense stationary random process need not be strictly stationary
     Example: Is the following random process wide-sense stationary?
         X(t) = A cos(2πf_c t + Θ)
     where A and f_c are constants and Θ is uniformly distributed on [−π, π].

  9. Properties of the Autocorrelation Function
     • Consider the autocorrelation function of a wide-sense stationary random process X(t):
         R_X(τ) = E[X(t + τ) X(t)]
     • R_X(τ) is an even function of τ: R_X(τ) = R_X(−τ)
     • R_X(τ) has maximum magnitude at τ = 0: |R_X(τ)| ≤ R_X(0)
     • The autocorrelation function measures the interdependence of the two random variables obtained by observing X(t) at times τ apart
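Both properties are easy to see on a concrete autocorrelation. Taking the random-phase cosine, whose autocorrelation is ½ cos(2πfτ) (a standard result; the code below simply evaluates this closed form rather than estimating it), evenness and the peak at τ = 0 can be checked on a grid:

```python
import math

def R_X(tau, f=1.0):
    """Autocorrelation of X(t) = cos(2*pi*f*t + Theta), Theta ~ U[-pi, pi]:
    R_X(tau) = 0.5 * cos(2*pi*f*tau)."""
    return 0.5 * math.cos(2 * math.pi * f * tau)

taus = [k / 10 for k in range(-20, 21)]
# Evenness: R_X(tau) == R_X(-tau) for every lag on the grid.
even = all(abs(R_X(tau) - R_X(-tau)) < 1e-12 for tau in taus)
# Peak at the origin: |R_X(tau)| never exceeds R_X(0).
bounded = all(abs(R_X(tau)) <= R_X(0.0) + 1e-12 for tau in taus)
```

Here R_X(0) = ½ is E[X(t)²], the average power of the process, which is why no lag can have larger magnitude.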

  10. Ergodic Processes
      • Let X(t) be a wide-sense stationary random process with mean µ_X and autocorrelation function R_X(τ) (also called the ensemble averages)
      • Let x(t) be a realization of X(t)
      • For an observation interval [−T, T], the time average of x(t) is given by
          µ_x(T) = (1/2T) ∫_{−T}^{T} x(t) dt
      • The process X(t) is said to be ergodic in the mean if µ_x(T) converges to µ_X in the mean-square sense as T → ∞
      • For an observation interval [−T, T], the time-averaged autocorrelation function is given by
          R_x(τ, T) = (1/2T) ∫_{−T}^{T} x(t + τ) x(t) dt
      • The process X(t) is said to be ergodic in the autocorrelation function if R_x(τ, T) converges to R_X(τ) in the mean-square sense as T → ∞
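The time average µ_x(T) can be approximated numerically for a single realization. For the random-phase cosine with a fixed phase θ₀ (one sample point ω), the time average over [−T, T] shrinks like 1/T toward the ensemble mean µ_X = 0, consistent with ergodicity in the mean. A sketch using the midpoint rule (function name and step count are our choices):

```python
import math

def time_average(theta0, T, f=1.0, steps=100_000):
    """Approximate the time average (1/2T) * integral_{-T}^{T} x(t) dt of a
    single realization x(t) = cos(2*pi*f*t + theta0), via the midpoint rule.
    As T grows this converges to the ensemble mean mu_X = 0."""
    dt = 2 * T / steps
    total = 0.0
    for k in range(steps):
        t = -T + (k + 0.5) * dt        # midpoint of the k-th subinterval
        total += math.cos(2 * math.pi * f * t + theta0)
    return total * dt / (2 * T)
```

The exact value is cos(θ₀) sin(2πfT) / (2πfT), which is bounded by 1/(2πfT), so for T = 50 and f = 1 the time average is already below about 0.0032 in magnitude.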

  11. Passing a Random Process through an LTI System
      X(t) → LTI system h(t) → Y(t)
      • Consider a linear time-invariant (LTI) system with impulse response h(t) which has random processes X(t) and Y(t) as input and output:
          Y(t) = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ
      • In general, it is difficult to characterize Y(t) in terms of X(t)
      • If X(t) is a wide-sense stationary random process, then Y(t) is also wide-sense stationary, with
          µ_Y(t) = µ_X ∫_{−∞}^{∞} h(τ) dτ
          R_Y(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) dτ_1 dτ_2
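The mean formula µ_Y = µ_X ∫h(τ)dτ has a direct discrete-time analogue: for an FIR filter the output mean is the input mean times the sum of the taps. A simulation sketch (the i.i.d. input, the particular taps, and the function name are our assumptions for illustration):

```python
import random

def filter_mean_check(mu_x=2.0, taps=(0.5, 0.3, 0.2), n=100_000, seed=3):
    """Pass a white (i.i.d.) input with mean mu_x through a discrete FIR
    filter and estimate the output mean. The discrete analogue of
    mu_Y = mu_X * integral of h predicts mu_x * sum(taps) = mu_x here,
    since the taps sum to 1."""
    rng = random.Random(seed)
    x = [mu_x + rng.uniform(-1, 1) for _ in range(n)]
    y = []
    for k in range(len(taps) - 1, n):      # convolution sum y[k] = sum_i h[i] x[k-i]
        y.append(sum(h * x[k - i] for i, h in enumerate(taps)))
    return sum(y) / len(y)
```

With taps summing to 1 the estimated output mean stays close to the input mean 2.0; scaling the taps would scale the output mean by the same factor.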

  12. Reference
      • Simon Haykin, Communication Systems, Fourth Edition, Wiley-India, 2001, Chapter 1.
