Stochastic Signals - PowerPoint PPT Presentation (transcript)
1. Stochastic Signals: Overview

   • Introduction
   • Definitions
   • Second order statistics
   • Stationarity and ergodicity
   • Random signal variability
   • Power spectral density
   • Linear systems with stationary inputs
   • Random signal memory
   • Correlation matrices

   Introduction

   • Discrete-time stochastic processes provide a mathematical framework for working with non-deterministic signals
   • Signals that have an exact functional relationship are often called predictable or deterministic, though some stochastic processes are predictable
   • I'm going to use the term deterministic to refer to signals that are not affected by the outcome of a random experiment
   • I will use the terms stochastic process and random process interchangeably

   J. McNames, Portland State University, ECE 538/638, Stochastic Signals, Ver. 1.10

   Probability Space

   • Conceptually we should imagine a sample space with some number (possibly infinite) of outcomes: Ω = {ζ1, ζ2, ...}
   • Each outcome has a probability Pr{ζk}
   • By some rule, each outcome generates a sequence x(n, ζk)
   • We can think of x(n, ζk) as a vector of (possibly) infinite duration
   • Note that the entire sequence is generated from a single outcome of the underlying experiment
   • x(n, ζ) is called a discrete-time stochastic process or a random sequence

   Definitions and Interpretations

   • Interpretations
     – Random variable: x(n, ζ) with n = n0 fixed and ζ treated as a variable
     – Sample sequence: x(n, ζ) with ζ = ζk fixed and n treated as an independent (non-random) variable
     – Number: x(n, ζ) with both ζ = ζk and n = n0 fixed
     – Stochastic process: x(n, ζ) with both ζ and n treated as variables
   • Realization: a sample sequence
   • Ensemble: the set of all possible sequences, {x(n, ζ)}
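The four interpretations of x(n, ζ) can be sketched numerically. This is a minimal illustration, assuming a hypothetical generating rule (a random-phase cosine, my choice, not a model from the slides): each outcome ζk is a random phase that generates an entire sequence, so rows of the array index outcomes and columns index time.

```python
import numpy as np

# Toy sample space: each outcome zeta_k is a random phase, and each outcome
# generates an entire sequence x(n, zeta_k) = cos(0.1*pi*n + zeta_k).
# (Illustrative assumption; the slides do not specify a generating rule.)
rng = np.random.default_rng(0)
n = np.arange(50)                       # time indices
phases = rng.uniform(0, 2 * np.pi, 8)  # 8 outcomes zeta_1..zeta_8
ensemble = np.cos(0.1 * np.pi * n + phases[:, None])  # rows: realizations

sample_sequence = ensemble[2]      # zeta fixed, n varies (one realization)
random_variable = ensemble[:, 10]  # n fixed, zeta varies (a random variable)
number = ensemble[2, 10]           # both fixed: a single number
```

Note that one draw of ζ fixes a whole row: the entire sequence comes from a single outcome of the underlying experiment.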

2. Probability Functions

   • In order to fully characterize a stochastic process, we must consider the cdf or pdf

       F_x(x1, ..., xk; n1, ..., nk) = Pr{x(n1) ≤ x1, ..., x(nk) ≤ xk}
       f_x(x1, ..., xk; n1, ..., nk) = ∂^k F_x(x1, ..., xk; n1, ..., nk) / (∂x1 ... ∂xk)

     for every k ≥ 1 and any set of sample times {n1, n2, ..., nk}
   • Without additional sweeping assumptions, estimation of f_x(·) from a realization is impossible
   • Many stochastic processes can be characterized accurately or, at least, usefully by much less information
   • To simplify notation, from here on I will mostly use x(n) to denote both random processes and single realizations
   • In most cases I will assume x(n) is complex valued

   Second Order Statistics

   At any time n, we can specify the mean and variance of x(n):

       µ_x(n) ≜ E[x(n)]        σ²_x(n) ≜ E[|x(n) − µ_x(n)|²]

   • µ_x(n) and σ²_x(n) are both deterministic sequences
   • The expectation is taken over the ensemble
   • In general, the second-order statistics at two different times are given by the autocorrelation or autocovariance sequences
   • Autocorrelation sequence:

       r_xx(n1, n2) = E[x(n1) x*(n2)]

   • Autocovariance sequence:

       γ_xx(n1, n2) = E[(x(n1) − µ_x(n1))(x(n2) − µ_x(n2))*]
                    = r_xx(n1, n2) − µ_x(n1) µ*_x(n2)

   Cross-Correlation and Cross-Covariance

   • Cross-correlation:

       r_xy(n1, n2) = E[x(n1) y*(n2)]

   • Cross-covariance:

       γ_xy(n1, n2) = E[(x(n1) − µ_x(n1))(y(n2) − µ_y(n2))*]
                    = r_xy(n1, n2) − µ_x(n1) µ*_y(n2)

   • Normalized cross-correlation:

       ρ_xy(n1, n2) = γ_xy(n1, n2) / (σ_x(n1) σ_y(n2))

   More Definitions

   • Independent: iff

       f_x(x1, ..., xk; n1, ..., nk) = ∏_{ℓ=1}^{k} f_ℓ(x_ℓ; n_ℓ)   ∀ k

   • Uncorrelated: if

       γ_x(n1, n2) = σ²_x(n1) for n1 = n2, and 0 for n1 ≠ n2

   • Orthogonal: if

       r_x(n1, n2) = σ²_x(n1) + |µ_x(n1)|² for n1 = n2, and 0 for n1 ≠ n2
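Since the expectation is taken over the ensemble, the mean, autocorrelation, and autocovariance can be estimated by averaging across many realizations. A sketch assuming a hypothetical complex-valued process (a deterministic time-varying mean plus white complex Gaussian noise, my choice for illustration); with the same sample mean used throughout, the identity γ_xx(n1, n2) = r_xx(n1, n2) − µ_x(n1) µ*_x(n2) holds exactly for the sample moments.

```python
import numpy as np

# Ensemble estimates of second-order statistics: approximate E[.] by
# averaging over R realizations of a hypothetical complex-valued process.
rng = np.random.default_rng(1)
R, N = 20000, 8                          # realizations, sequence length
mu_true = np.arange(N) * (0.5 + 0.5j)   # deterministic mean sequence mu_x(n)
x = mu_true + rng.standard_normal((R, N)) + 1j * rng.standard_normal((R, N))

mu = x.mean(axis=0)                     # mu_x(n) = E[x(n)]
# r_xx(n1, n2) = E[x(n1) x*(n2)], estimated as an N-by-N array
r = (x[:, :, None] * x[:, None, :].conj()).mean(axis=0)
# gamma_xx(n1, n2) = E[(x(n1) - mu_x(n1))(x(n2) - mu_x(n2))*]
xc = x - mu
gamma = (xc[:, :, None] * xc[:, None, :].conj()).mean(axis=0)
```

The diagonal of `gamma` is the variance sequence σ²_x(n), which here is the noise power E[|w(n)|²] = 2 at every n.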

3. Still More Definitions

   • Wide-sense periodic: if, for any values of n1 and n2,

       µ_x(n) = µ_x(n + N)   ∀ n
       r_x(n1, n2) = r_x(n1 + N, n2) = r_x(n1, n2 + N) = r_x(n1 + N, n2 + N)

   • Statistically independent: iff for every n1 and n2

       f_xy(x, y; n1, n2) = f_x(x; n1) f_y(y; n2)

   • Uncorrelated: if for every n1 and n2, γ_xy(n1, n2) = 0
   • Orthogonal: if for every n1 and n2, r_xy(n1, n2) = 0

   Stationarity

   • Stationarity of order N: a stochastic process x(n) such that

       f_x(x1, ..., xN; n1, ..., nN) = f_x(x1, ..., xN; n1 + k, ..., nN + k)

     for any k
   • Any stochastic process that is stationary of order N is also stationary of order M for all M ≤ N
   • Strict-sense stationary (SSS): a stochastic process that is stationary of all orders N

   Wide Sense Stationary

   • Stationarity of order 2 requires

       f_x(x1, x2; n1, n2) = f_x(x1, x2; n1 + k, n2 + k)

   • Wide-sense stationary (WSS): a stochastic process with a constant mean and an autocorrelation that depends only on the delay between the two sample times
   • WSS properties:

       E[x(n)] = µ_x
       r_x(n1, n2) = r_x(ℓ) = r_x(n1 − n2) = E[x(n + ℓ) x*(n)]
       γ_x(ℓ) = r_x(ℓ) − |µ_x|²

   • This implies the variance is also constant, var[x(n)] = σ²_x
   • All processes that are stationary of order 2 are WSS
   • Not all WSS processes are stationary of order 2
   • Note this is slightly different from the text

   Example 1: Stationarity

   Describe a random process that is stationary. Describe a second random process that is not stationary.
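For Example 1, one concrete pair of processes (my choice, not the slides' answer): iid Gaussian noise is stationary of all orders, hence SSS and WSS, while the random walk x(n) = x(n − 1) + w(n) is not stationary, since var[x(n)] = (n + 1) σ²_w grows with n.

```python
import numpy as np

rng = np.random.default_rng(2)
R, N = 50000, 40
w = rng.standard_normal((R, N))  # iid N(0,1): stationary of all orders
walk = np.cumsum(w, axis=1)      # random walk x(n) = x(n-1) + w(n)

var_w = w.var(axis=0)            # ensemble variance: flat, about 1 for every n
var_walk = walk.var(axis=0)      # grows like n+1, so not even order-1 stationary
```

The first process has the same marginal pdf at every n; the second has a variance that depends on n, which already rules out stationarity of order 1.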

4. Stationarity Notes

   • SSS implies WSS
   • If the marginal pdf of a signal is Gaussian for all n, then WSS implies SSS
   • The book states that most WSS processes are SSS. True?
   • Jointly wide-sense stationary: two random signals x(n) and y(n) are jointly WSS if they are both WSS and

       r_xy(ℓ) = r_xy(n1 − n2) = E[x(n) y*(n − ℓ)]
       γ_xy(ℓ) = γ_xy(n1 − n2) = r_xy(ℓ) − µ_x µ*_y

   • WSS is a very useful property because it enables us to consider a spectral description
   • In practice, we only need the signal to be WSS long enough to estimate the autocorrelation or cross-correlation

   Autocorrelation Sequence Properties

       r_x(0) = σ²_x + |µ_x|²
       r_x(0) ≥ |r_x(ℓ)|
       r_x(ℓ) = r*_x(−ℓ)
       ∑_{k=1}^{M} ∑_{m=1}^{M} α_k r_x(k − m) α*_m ≥ 0   ∀ α sequences

   • Average DC power: |µ_x|²
   • Average AC power: σ²_x
   • Nonnegative definite: a sequence is said to be nonnegative definite if it satisfies this last property
   • Positive definite: any sequence that satisfies the last inequality strictly for any α

   Comments on Stationarity

   • Many real processes are nonstationary
     – Best case: can determine this from domain knowledge of the process
     – Else: must rely on statistical methods
   • Many nonstationary processes are approximately locally stationary (stationary over short periods of time)
   • Much of time-frequency analysis is dedicated to this type of signal
   • There is no general mathematical framework for analyzing nonstationary signals
   • However, many nonstationary stochastic processes can be understood through linear estimation (e.g., Kalman filters)
   • Note that nonstationary is a negative definition: not stationary

   Introduction to Ergodicity

   • In most practical situations we can only observe one or a few realizations
   • If the process is ergodic, we can know all statistical information from a single realization
   • Ensemble averages: repeat the experiment many times
   • Time averages:

       ⟨(·)⟩ ≜ lim_{N→∞} 1/(2N + 1) ∑_{n=−N}^{N} (·)
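The time average above is computed from a single realization; for a mean-ergodic process it converges to the ensemble mean as N grows. A sketch under an assumed mean-ergodic process (iid Gaussian noise plus a constant mean µ_x = 2, σ²_x = 1), which also lets us check the autocorrelation properties r_x(0) = σ²_x + |µ_x|² and r_x(0) ≥ |r_x(ℓ)| on a time-average estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
mu_x, sigma2_x = 2.0, 1.0
x = mu_x + rng.standard_normal(200001)  # one long realization

# Time average <x(n)> = (1/(2N+1)) sum_{n=-N}^{N} x(n); for a mean-ergodic
# process this converges to the ensemble mean mu_x = E[x(n)].
time_average = x.mean()

# Time-average estimate of r_x(l) = E[x(n + l) x*(n)] for lags 0..4.
# Expect r[0] close to sigma2_x + mu_x**2 = 5 and r[l] close to mu_x**2 = 4
# for l >= 1, consistent with the DC/AC power decomposition above.
r = np.array([np.mean(x[l:] * x[:len(x) - l]) for l in range(5)])
```

One realization suffices here only because the assumed process is ergodic; for a non-ergodic process (e.g., a random DC level) the time average would converge to a random value, not to µ_x.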
