

  1. Stationary Processes
Gonzalo Mateos
Dept. of ECE and Goergen Institute for Data Science
University of Rochester
gmateosb@ece.rochester.edu
http://www.ece.rochester.edu/~gmateosb/
November 30, 2018
Introduction to Random Processes

  2. Stationary random processes
◮ Stationary random processes
◮ Autocorrelation function and wide-sense stationary processes
◮ Fourier transforms
◮ Linear time-invariant systems
◮ Power spectral density and linear filtering of random processes
◮ The matched and Wiener filters

  3. Stationary random processes
◮ All joint probabilities are invariant to time shifts, i.e., for any s
      P(X(t1 + s) ≤ x1, X(t2 + s) ≤ x2, . . . , X(tn + s) ≤ xn)
          = P(X(t1) ≤ x1, X(t2) ≤ x2, . . . , X(tn) ≤ xn)
⇒ If the above relation holds, X(t) is called strictly stationary (SS)
◮ First-order stationary ⇒ probabilities of single variables are shift invariant
      P(X(t + s) ≤ x) = P(X(t) ≤ x)
◮ Second-order stationary ⇒ joint probabilities of pairs are shift invariant
      P(X(t1 + s) ≤ x1, X(t2 + s) ≤ x2) = P(X(t1) ≤ x1, X(t2) ≤ x2)
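A minimal numerical sketch of first-order stationarity, not from the slides: the random-phase sinusoid X(t) = cos(t + Θ) with Θ uniform on [0, 2π) is a standard example of a strictly stationary process, so its empirical CDF at time t should match the one at time t + s. All names and parameter values here are illustrative.

```python
# Empirically check P(X(t+s) <= x) ~ P(X(t) <= x) for the random-phase
# sinusoid X(t) = cos(t + Theta), Theta ~ Uniform[0, 2*pi) (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)  # one random phase per realization

def empirical_cdf(samples, x):
    """Estimate P(X <= x) from samples."""
    return np.mean(samples <= x)

t, s = 1.0, 2.5                       # arbitrary time and shift
x_t = np.cos(t + theta)               # X(t) across realizations
x_ts = np.cos(t + s + theta)          # X(t + s) across realizations

# First-order stationarity: the two empirical CDFs agree at every x
grid = np.linspace(-1.0, 1.0, 21)
gap = max(abs(empirical_cdf(x_t, x) - empirical_cdf(x_ts, x)) for x in grid)
print(f"max CDF discrepancy over grid: {gap:.4f}")  # small, shrinks with n
```

The discrepancy is of order 1/√n, consistent with the shift invariance in the definition above.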

  4. Pdfs and moments of stationary processes
◮ For a SS process joint cdfs are shift invariant. Hence, pdfs are too
      f_X(t+s)(x) = f_X(t)(x) = f_X(0)(x) := f_X(x)
◮ As a consequence, the mean of a SS process is constant
      µ(t) := E[X(t)] = ∫_{−∞}^{∞} x f_X(t)(x) dx = ∫_{−∞}^{∞} x f_X(x) dx = µ
◮ The variance of a SS process is also constant
      var[X(t)] := ∫_{−∞}^{∞} (x − µ)^2 f_X(t)(x) dx = ∫_{−∞}^{∞} (x − µ)^2 f_X(x) dx = σ^2
◮ The power (second moment) of a SS process is also constant
      E[X^2(t)] := ∫_{−∞}^{∞} x^2 f_X(t)(x) dx = ∫_{−∞}^{∞} x^2 f_X(x) dx = σ^2 + µ^2
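A quick check, not from the slides, that the mean, variance, and power of a stationary process do not depend on t, again using the illustrative random-phase sinusoid X(t) = cos(t + Θ), for which µ = 0 and σ² = E[cos²] = 1/2 at every t.

```python
# Sample mean and variance of X(t) = cos(t + Theta) at several times t:
# both are (approximately) the same constant for every t (illustrative).
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

for t in (0.0, 0.7, 3.0):
    x_t = np.cos(t + theta)
    # Theory for this process: mu = 0 and sigma^2 = 1/2, independent of t
    print(f"t={t}: mean={x_t.mean():+.3f}, var={x_t.var():.3f}")
```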

  5. Joint pdfs of stationary processes
◮ Joint pdf of two values of a SS random process
      f_X(t1)X(t2)(x1, x2) = f_X(0)X(t2−t1)(x1, x2)
⇒ Used shift invariance for a shift of t1
⇒ Note that t1 = 0 + t1 and t2 = (t2 − t1) + t1
◮ The result above is true for any pair t1, t2
⇒ The joint pdf depends only on the time difference s := t2 − t1
◮ Writing t1 = t and t2 = t + s we equivalently have
      f_X(t)X(t+s)(x1, x2) = f_X(0)X(s)(x1, x2) = f_X(x1, x2; s)

  6. Stationary processes and limit distributions
◮ Stationary processes follow in the footsteps of limit distributions
◮ For Markov processes limit distributions exist under mild conditions
◮ Limit distributions also exist for some non-Markov processes
◮ The process is somewhat easier to analyze in the limit as t → ∞
⇒ Properties can be derived from the limit distribution
◮ Stationary process ≈ study of the limit distribution
⇒ Formally, initialize the process at its limit distribution
⇒ In practice, results hold for sufficiently large time
◮ Deterministic linear systems ⇒ transient + steady-state behavior
⇒ Stationary processes are akin to the study of steady state
◮ But the steady state is in a probabilistic sense (probabilities, not realizations)

  7. Autocorrelation and wide-sense stationarity
◮ Stationary random processes
◮ Autocorrelation function and wide-sense stationary processes
◮ Fourier transforms
◮ Linear time-invariant systems
◮ Power spectral density and linear filtering of random processes
◮ The matched and Wiener filters

  8. Autocorrelation function
◮ From the definition of the autocorrelation function we can write
      R_X(t1, t2) = E[X(t1)X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_X(t1)X(t2)(x1, x2) dx1 dx2
◮ For a SS process f_X(t1)X(t2)(·) depends on the time difference only
      R_X(t1, t2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_X(0)X(t2−t1)(x1, x2) dx1 dx2 = E[X(0)X(t2 − t1)]
⇒ R_X(t1, t2) is a function of s = t2 − t1 only
      R_X(t1, t2) = R_X(0, t2 − t1) := R_X(s)
◮ The autocorrelation function of a SS random process X(t) is R_X(s)
⇒ The variable s denotes a time difference / shift / lag
⇒ R_X(s) specifies the correlation between values of X(t) spaced s apart in time
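As an illustrative numerical sketch (not from the slides), R_X(s) can be estimated by averaging X(t)X(t+s) over realizations. For the random-phase sinusoid X(t) = cos(t + Θ) one can show R_X(s) = cos(s)/2, a function of the lag s only, so estimates at different base times t should agree.

```python
# Monte Carlo estimate of R_X(s) = E[X(t) X(t+s)] for X(t) = cos(t + Theta),
# Theta ~ Uniform[0, 2*pi); the estimate depends on s, not on t (illustrative).
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

def r_hat(t, s):
    """Monte Carlo estimate of E[X(t) X(t+s)]."""
    return np.mean(np.cos(t + theta) * np.cos(t + s + theta))

for s in (0.0, 1.0, 2.0):
    # Same lag s at two different times t: the estimates agree
    print(f"s={s}: r_hat(0,s)={r_hat(0.0, s):.3f}, "
          f"r_hat(5,s)={r_hat(5.0, s):.3f}, theory={0.5 * np.cos(s):.3f}")
```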

  9. Autocovariance function
◮ Similarly to the autocorrelation, define the autocovariance function as
      C_X(t1, t2) = E[(X(t1) − µ(t1))(X(t2) − µ(t2))]
◮ Expand the product to write C_X(t1, t2) as
      C_X(t1, t2) = E[X(t1)X(t2)] + µ(t1)µ(t2) − E[X(t1)]µ(t2) − E[X(t2)]µ(t1)
◮ For a SS process µ(t1) = µ(t2) = µ and E[X(t1)X(t2)] = R_X(t2 − t1), so
      C_X(t1, t2) = R_X(t2 − t1) − µ^2 = C_X(t2 − t1)
⇒ The autocovariance function depends only on the shift s = t2 − t1
◮ We will typically assume µ = 0, in which case R_X(s) = C_X(s)
⇒ If µ ≠ 0 we can study the process X(t) − µ, whose mean is null
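A numeric sanity check of C_X(s) = R_X(s) − µ², not from the slides, using the illustrative nonzero-mean stationary process Y(t) = µ + cos(t + Θ) with an assumed µ = 2.

```python
# Compare R_Y(s) - mu^2 against the directly estimated autocovariance C_Y(s)
# for Y(t) = mu + cos(t + Theta), Theta ~ Uniform[0, 2*pi) (illustrative).
import numpy as np

rng = np.random.default_rng(5)
theta = rng.uniform(0.0, 2.0 * np.pi, size=400_000)
mu, t, s = 2.0, 0.8, 1.2               # assumed mean, time, and lag

y_t = mu + np.cos(t + theta)           # Y(t) across realizations
y_ts = mu + np.cos(t + s + theta)      # Y(t + s)

r_val = np.mean(y_t * y_ts)                # autocorrelation estimate R_Y(s)
c_val = np.mean((y_t - mu) * (y_ts - mu))  # autocovariance estimate C_Y(s)
print(f"R - mu^2 = {r_val - mu**2:.3f}, C = {c_val:.3f}")  # both ~ cos(s)/2
```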

  10. Wide-sense stationary processes
◮ Def: A process is wide-sense stationary (WSS) when its
⇒ Mean is constant ⇒ µ(t) = µ for all t
⇒ Autocorrelation is shift invariant ⇒ R_X(t1, t2) = R_X(t2 − t1)
◮ Consequently, the autocovariance of a WSS process is also shift invariant
      C_X(t1, t2) = E[X(t1)X(t2)] + µ(t1)µ(t2) − E[X(t1)]µ(t2) − E[X(t2)]µ(t1)
                  = R_X(t2 − t1) − µ^2
◮ Most of the analysis of stationary processes is based on R_X(t2 − t1)
⇒ Thus, such analysis does not require SS; WSS suffices

  11. Wide-sense stationarity versus strict stationarity
◮ SS processes have shift-invariant pdfs
⇒ The mean function is constant
⇒ The autocorrelation is shift invariant
◮ Then, a SS process is also WSS
⇒ For that reason WSS is also called weak-sense stationary
◮ The opposite is not true in general
◮ But if Gaussian, the process is determined by its mean and autocorrelation
⇒ WSS implies SS for Gaussian processes
◮ WSS and SS are equivalent for Gaussian processes (more coming)

  12. Gaussian wide-sense stationary process
◮ WSS Gaussian process X(t) with mean 0 and autocorrelation R(s)
◮ The covariance matrix for X(t1 + s), X(t2 + s), . . . , X(tn + s) is

      C(t1 + s, . . . , tn + s) =
      ⎡ R(t1+s, t1+s)  R(t1+s, t2+s)  . . .  R(t1+s, tn+s) ⎤
      ⎢ R(t2+s, t1+s)  R(t2+s, t2+s)  . . .  R(t2+s, tn+s) ⎥
      ⎢       ⋮               ⋮         ⋱          ⋮       ⎥
      ⎣ R(tn+s, t1+s)  R(tn+s, t2+s)  . . .  R(tn+s, tn+s) ⎦

◮ For a WSS process, autocorrelations depend only on time differences

      C(t1 + s, . . . , tn + s) =
      ⎡ R(t1−t1)  R(t2−t1)  . . .  R(tn−t1) ⎤
      ⎢ R(t1−t2)  R(t2−t2)  . . .  R(tn−t2) ⎥
      ⎢     ⋮         ⋮       ⋱        ⋮    ⎥  = C(t1, . . . , tn)
      ⎣ R(t1−tn)  R(t2−tn)  . . .  R(tn−tn) ⎦

⇒ Covariance matrices C(t1, . . . , tn) are shift invariant
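The matrix identity above can be sketched numerically (this is illustrative, with an assumed autocorrelation R(s) = exp(−|s|), a common WSS choice): building the covariance matrix from the pairwise time differences makes it manifestly shift invariant.

```python
# For a zero-mean WSS Gaussian process, entry (i, j) of the covariance matrix
# of X(t1), ..., X(tn) is R(tj - ti), so shifting all times leaves it unchanged.
import numpy as np

def cov_matrix(times, R):
    """Covariance matrix with entry (i, j) = R(tj - ti)."""
    t = np.asarray(times, dtype=float)
    return R(t[None, :] - t[:, None])

R = lambda s: np.exp(-np.abs(s))        # assumed WSS autocorrelation (illustrative)
t = np.array([0.0, 0.4, 1.1, 2.0])
s = 5.3                                 # arbitrary common shift

C0 = cov_matrix(t, R)
Cs = cov_matrix(t + s, R)
print(np.allclose(C0, Cs))  # shift invariance: C(t1+s,...,tn+s) = C(t1,...,tn)
```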

  13. Gaussian wide-sense stationary process (continued)
◮ The joint pdf of X(t1 + s), X(t2 + s), . . . , X(tn + s) is
      f_X(t1+s),...,X(tn+s)(x1, . . . , xn) = N(0, C(t1 + s, . . . , tn + s); [x1, . . . , xn]^T)
⇒ Completely determined by C(t1 + s, . . . , tn + s)
◮ Since the covariance matrix is shift invariant we can write
      f_X(t1+s),...,X(tn+s)(x1, . . . , xn) = N(0, C(t1, . . . , tn); [x1, . . . , xn]^T)
◮ The expression on the right is the pdf of X(t1), X(t2), . . . , X(tn). Then
      f_X(t1+s),...,X(tn+s)(x1, . . . , xn) = f_X(t1),...,X(tn)(x1, . . . , xn)
◮ The joint pdf of X(t1), X(t2), . . . , X(tn) is shift invariant
⇒ This proves that WSS is equivalent to SS for Gaussian processes

  14. Brownian motion and white Gaussian noise
Ex: Brownian motion X(t) with variance parameter σ^2
⇒ Mean function is µ(t) = 0 for all t ≥ 0
⇒ Autocorrelation is R_X(t1, t2) = σ^2 min(t1, t2)
◮ While the mean is constant, the autocorrelation is not shift invariant
⇒ Brownian motion is not WSS (hence not SS)
Ex: White Gaussian noise W(t) with variance parameter σ^2
⇒ Mean function is µ(t) = 0 for all t
⇒ Autocorrelation is R_W(t1, t2) = σ^2 δ(t2 − t1)
◮ The mean is constant and the autocorrelation is shift invariant
⇒ White Gaussian noise is WSS
⇒ Also SS, because white Gaussian noise is a Gaussian process
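The Brownian motion example can be sketched by simulation (names and parameters are illustrative, not from the slides): the estimated R_X(t1, t2) tracks σ² min(t1, t2), so shifting both times changes its value even though the lag t2 − t1 stays the same.

```python
# Monte Carlo check that Brownian motion has R_X(t1, t2) ~ sigma^2 * min(t1, t2),
# which is not shift invariant, hence the process is not WSS (illustrative).
import numpy as np

rng = np.random.default_rng(3)
sigma, dt, n_steps, n_paths = 1.0, 0.1, 20, 100_000

# Brownian motion as cumulative sums of independent Gaussian increments
incs = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(incs, axis=1)            # X[:, k] approximates X(t) at t = (k+1)*dt

def R_hat(t1, t2):
    """Monte Carlo estimate of E[X(t1) X(t2)]."""
    i, j = round(t1 / dt) - 1, round(t2 / dt) - 1
    return np.mean(X[:, i] * X[:, j])

print(R_hat(0.5, 1.0))   # ~ sigma^2 * min(0.5, 1.0) = 0.5
print(R_hat(1.5, 2.0))   # ~ 1.5: same lag t2 - t1, different value => not WSS
```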

  15. Properties of the autocorrelation function
For WSS processes:
(i) The autocorrelation at s = 0 is the power of the process
      R_X(0) = E[X(t)X(t + 0)] = E[X^2(t)]
(ii) The autocorrelation function is symmetric ⇒ R_X(s) = R_X(−s)
Proof. Use the commutative property of the product and shift invariance of R_X(t1, t2):
      R_X(s) = R_X(t, t + s) = E[X(t)X(t + s)]
             = E[X(t + s)X(t)] = R_X(t + s, t) = R_X(−s)
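Both properties can be checked numerically (an illustrative sketch, not from the slides) with the random-phase sinusoid X(t) = cos(t + Θ), whose power is E[X²(t)] = 1/2.

```python
# Numeric check of R_X(0) = E[X^2(t)] and of the symmetry R_X(s) = R_X(-s)
# for X(t) = cos(t + Theta), Theta ~ Uniform[0, 2*pi) (illustrative).
import numpy as np

rng = np.random.default_rng(4)
theta = rng.uniform(0.0, 2.0 * np.pi, size=400_000)
t = 2.0

def r_hat(s):
    """Monte Carlo estimate of R_X(s) = E[X(t) X(t+s)]."""
    return np.mean(np.cos(t + theta) * np.cos(t + s + theta))

print(f"power R(0) = {r_hat(0.0):.3f}")        # ~ E[X^2(t)] = 1/2
for s in (0.5, 1.7):
    print(f"R({s}) = {r_hat(s):.3f}, R({-s}) = {r_hat(-s):.3f}")  # ~ equal
```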
