Persistence of Gaussian Stationary Processes: a spectral perspective
Naomi Feldheim (Stanford)
Joint work with Ohad Feldheim (Stanford) and Shahaf Nitzan (Georgia Tech)
February 2017
Gaussian stationary processes (GSP)

For $T \in \{\mathbb{R}, \mathbb{Z}\}$, a random function $f : T \to \mathbb{R}$ is a GSP if it is
- Gaussian: $(f(x_1), \dots, f(x_N)) \sim \mathcal{N}_{\mathbb{R}^N}(0, \Sigma_{x_1,\dots,x_N})$,
- Stationary (shift-invariant): $(f(x_1+s), \dots, f(x_N+s)) \overset{d}{\sim} (f(x_1), \dots, f(x_N))$, for all $N \in \mathbb{N}$, $x_1, \dots, x_N, s \in T$.

Motivation: nearly any stationary noise.
- Field fluctuations
- Electromagnetic noise
- Ocean waves
- Vibrations of strings / membranes
- Data traffic
- ...

Covariance function: $r(s,t) = \mathbb{E}(f(s)f(t)) = r(s-t)$, for $s, t \in T$.
Assumption: $r(\cdot)$ and $f(\cdot)$ are continuous.

Spectral measure: since $r$ is continuous and positive-definite, there exists a finite, non-negative, symmetric measure $\rho$ over $T^*$ ($\mathbb{Z}^* \simeq [-\pi,\pi]$ and $\mathbb{R}^* \simeq \mathbb{R}$) such that
$$r(t) = \hat{\rho}(t) = \int_{T^*} e^{-i\lambda t}\, d\rho(\lambda).$$
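The definition above can be made concrete with a short simulation sketch (not part of the talk): on a finite grid, stationarity means the covariance matrix depends only on the lags $x_i - x_j$, so one can sample paths by a Cholesky factorization. The function name `sample_gsp` and the jitter constant are illustrative assumptions.

```python
import numpy as np

def sample_gsp(r, xs, n_paths=3, seed=0):
    """Draw sample paths of a centered GSP on the grid xs,
    given its covariance function r(t) (vectorized over t)."""
    rng = np.random.default_rng(seed)
    # Stationarity: the covariance matrix depends only on the lags x_i - x_j.
    Sigma = r(xs[:, None] - xs[None, :])
    # A small jitter keeps the Cholesky factor numerically stable when
    # Sigma is only positive semi-definite (e.g. r(t) = cos t has rank 2).
    L = np.linalg.cholesky(Sigma + 1e-8 * np.eye(len(xs)))
    return L @ rng.standard_normal((len(xs), n_paths))

xs = np.linspace(0, 10, 200)
paths = sample_gsp(np.cos, xs)  # the Gaussian wave example: r(t) = cos(t)
```

Any of the covariance kernels in the examples that follow can be passed in as `r`.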
Toy-Example Ia - Gaussian wave

$f(x) = \xi_0 \sin(x) + \xi_1 \cos(x)$, with $\xi_j$ i.i.d. $\mathcal{N}(0,1)$
$r(x) = \cos(x)$
$\rho = \tfrac{1}{2}(\delta_1 + \delta_{-1})$

[Figures: covariance kernel, three sample paths, spectral measure]
Toy-Example Ib - Almost periodic wave

$f(x) = \xi_0 \sin(x) + \xi_1 \cos(x) + \xi_2 \sin(\sqrt{2}\,x) + \xi_3 \cos(\sqrt{2}\,x)$
$r(x) = \cos(x) + \cos(\sqrt{2}\,x)$
$\rho = \tfrac{1}{2}\big(\delta_1 + \delta_{-1} + \delta_{\sqrt{2}} + \delta_{-\sqrt{2}}\big)$

[Figures: covariance kernel, three sample paths, spectral measure]
Example II - i.i.d. sequence

$f(n) = \xi_n$
$r(n) = \delta_{n,0}$
$d\rho(\lambda) = \frac{1}{2\pi}\, \mathbb{1}_{[-\pi,\pi]}(\lambda)\, d\lambda$

[Figures: covariance kernel, three sample paths, spectral measure]
Example IIb - Sinc kernel

$f(x) = \sum_{n \in \mathbb{Z}} \xi_n \operatorname{sinc}(x - n)$
$r(x) = \frac{\sin(\pi x)}{\pi x} = \operatorname{sinc}(x)$
$d\rho(\lambda) = \frac{1}{2\pi}\, \mathbb{1}_{[-\pi,\pi]}(\lambda)\, d\lambda$

[Figures: covariance kernel, three sample paths, spectral measure]
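A minimal sketch (an illustration, not from the talk) of how the sinc-kernel process is built from an i.i.d. Gaussian sequence; the truncation at $|n| \le M$ and the name `sinc_process` are assumptions made for the example.

```python
import numpy as np

def sinc_process(xs, M=200, seed=1):
    """Approximate f(x) = sum_{n in Z} xi_n * sinc(x - n)
    by truncating the sum to |n| <= M."""
    rng = np.random.default_rng(seed)
    n = np.arange(-M, M + 1)
    xi = rng.standard_normal(n.size)          # i.i.d. N(0,1) coefficients
    # np.sinc uses the normalized convention sinc(t) = sin(pi t) / (pi t),
    # matching the covariance kernel r(x) = sinc(x) of this example.
    return np.sinc(xs[:, None] - n[None, :]) @ xi

xs = np.linspace(-5, 5, 101)
path = sinc_process(xs)   # one approximate sample path on [-5, 5]
```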
Example III - Gaussian Covariance (Fock-Bargmann)

$f(x) = \sum_{n \in \mathbb{N}} \xi_n \frac{x^n}{\sqrt{n!}}\, e^{-x^2/2}$
$r(x) = e^{-x^2/2}$
$d\rho(\lambda) = \frac{1}{\sqrt{2\pi}}\, e^{-\lambda^2/2}\, d\lambda$

[Figures: covariance kernel, three sample paths, spectral measure]
Example IV - Exponential Covariance (Ornstein-Uhlenbeck)

$r(x) = e^{-|x|}$
$d\rho(\lambda) = \frac{1}{\pi}\, \frac{1}{\lambda^2 + 1}\, d\lambda$

[Figures: covariance kernel, three sample paths, spectral measure]
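As a numerical sanity check (added for illustration, not from the talk), one can recover the Ornstein-Uhlenbeck covariance $r(t) = e^{-|t|}$ from its Cauchy spectral density $\frac{1}{\pi}\frac{1}{1+\lambda^2}$ via the inversion formula $r(t) = \int e^{-i\lambda t}\, d\rho(\lambda)$, truncating the integral to a finite frequency window.

```python
import numpy as np

# Truncated frequency grid; the tail beyond |lambda| = 500 contributes
# roughly 2/(pi * 500) ~ 1.3e-3 to the integral.
lam = np.linspace(-500.0, 500.0, 1_000_001)
dlam = lam[1] - lam[0]
density = (1.0 / np.pi) / (1.0 + lam**2)   # Cauchy spectral density

def r_from_spectrum(t):
    # rho is symmetric, so the integral is real: int cos(lambda t) d rho.
    return float(np.sum(np.cos(lam * t) * density) * dlam)

for t in (0.0, 0.5, 1.0, 2.0):
    print(t, r_from_spectrum(t), np.exp(-abs(t)))
```

The printed pairs agree up to the truncation error of the frequency window.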
Persistence Probability

Persistence: the persistence probability of a stochastic process $f$ above a level $\ell \in \mathbb{R}$ in the time interval $(0, N]$ is
$$P_f(N) := \mathbb{P}\big(f(x) > \ell, \ \forall x \in (0, N]\big).$$
Persistence (above the mean): for a centered process we take $\ell = 0$:
$$P_f(N) := \mathbb{P}\big(f(x) > 0, \ \forall x \in (0, N]\big).$$

Question: for a GSP $f$, what is the behavior of $P_f(N)$ as $N \to \infty$?

Motivation: detection theory.

Toy Examples ($X_n$ i.i.d. $\mathcal{N}(0,1)$):
- $X_n$ i.i.d.: $P_X(N) = 2^{-N}$
- $Y_n = X_{n+1} - X_n$: $P_Y(N) = \frac{1}{(N+1)!} \asymp e^{-N \log N}$
- $Z_n \equiv Z_0$: $P_Z(N) = \mathbb{P}(Z_0 > 0) = \frac{1}{2}$.

Guess: with sufficient independence, $P(N) \asymp e^{-\theta N}$.
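The i.i.d. toy example is easy to check by Monte Carlo. Below is a small sketch (an illustration added here, not from the talk; the helper name `persistence_mc` is made up) estimating $P_X(N) = \mathbb{P}(X_n > 0 \text{ for } n = 1, \dots, N)$ against the exact value $2^{-N}$.

```python
import numpy as np

def persistence_mc(N, trials=200_000, seed=0):
    """Monte Carlo estimate of the persistence probability of an
    i.i.d. N(0,1) sequence over N steps (exact answer: 2^{-N})."""
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal((trials, N))     # i.i.d. Gaussian rows
    return np.mean(np.all(samples > 0, axis=1))    # fraction staying positive

for N in (1, 2, 5):
    print(N, persistence_mc(N), 2.0**-N)
```

The exponential decay in $N$ is exactly the behavior the guess above postulates for GSPs with sufficient independence.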
History and Motivation

Engineering and Applied Mathematics (1940–1970)

1944 Rice - "Mathematical Analysis of Random Noise"
- Mean number of level-crossings (Rice formula)
- Behavior of $P(t)$ for $t \ll 1$ (short range)

1962 Slepian - "One-sided barrier problem"
- Slepian's inequality: $r_1(x) \ge r_2(x) \Rightarrow P_1(N) \ge P_2(N)$
- Specific cases

1962 Newell & Rosenblatt
- If $r(x) \to 0$ as $x \to \infty$, then $P(N) = o(N^{-\alpha})$ for any $\alpha > 0$.
- If $|r(x)| < a x^{-\alpha}$, then
$$\log P(N) \le \begin{cases} -CN & \text{if } \alpha > 1, \\ -CN/\log N & \text{if } \alpha = 1, \\ -CN^{\alpha} & \text{if } 0 < \alpha < 1. \end{cases}$$
- Examples for which $\log P(N) \ge -C\sqrt{N \log N} \gg -CN$ (with $r(x) \asymp x^{-1/2}$).

There are parallel independent results from the Soviet Union (Piterbarg, Kolmogorov and others).