Persistence of Gaussian Stationary Processes

Ohad Feldheim (Stanford)
Joint work with Naomi Feldheim (Stanford) and Shahaf Nitzan (Georgia Tech)

UBC, Vancouver, January 2017
Gaussian stationary processes (GSP)

For T ∈ {R, Z}, a random function f : T → R is a GSP if it is

Gaussian: (f(x_1), ..., f(x_N)) ∼ N_{R^N}(0, Σ_{x_1,...,x_N}),

Stationary (shift-invariant): (f(x_1 + s), ..., f(x_N + s)) ∼_d (f(x_1), ..., f(x_N)), for all N ∈ N, x_1, ..., x_N, s ∈ T.

Motivation: background noise in radio / cellular transmissions, ocean waves, vibrations of bridge strings / membranes, brain transmissions, internet / car traffic, ...

Covariance function: r(s, t) = E(f(s)f(t)) = r(s − t), for s, t ∈ T.

Spectral measure: by Bochner's theorem there exists a finite, non-negative, symmetric measure ρ on T* (Z* ≃ [−π, π] and R* ≃ R) such that

  r(t) = ρ̂(t) = ∫_{T*} e^{−iλt} dρ(λ).

Assumption: ∫ |λ|^δ dρ(λ) < ∞ for some δ > 0 ("finite polynomial moment" ⇒ r is Hölder continuous).
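One concrete way to simulate a GSP on a finite grid, directly from its covariance function r, is to build the matrix Σ_{ij} = r(t_i − t_j) and apply a Cholesky factor to i.i.d. Gaussians. A minimal Python sketch, assuming NumPy; the helper name `sample_gsp` and the jitter constant are illustrative, not from the talk:

```python
import numpy as np

def sample_gsp(r, ts, n_paths=1, rng=None):
    """Sample paths of a centered GSP with covariance function r on the
    grid ts, via a Cholesky factor of the covariance matrix."""
    rng = np.random.default_rng() if rng is None else rng
    ts = np.asarray(ts, dtype=float)
    # Stationarity: Sigma[i, j] = r(t_i - t_j).
    Sigma = r(ts[:, None] - ts[None, :])
    # Small jitter keeps the factorization stable for near-singular kernels.
    L = np.linalg.cholesky(Sigma + 1e-8 * np.eye(len(ts)))
    return L @ rng.standard_normal((len(ts), n_paths))

# e.g. the Gaussian covariance r(x) = e^{-x^2/2} on [0, 10]:
paths = sample_gsp(lambda x: np.exp(-x**2 / 2), np.linspace(0, 10, 200), n_paths=3)
```

This is an O(n³) generic method; for the specific examples below, the series expansions give cheaper samplers.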
Toy-Example Ia - Gaussian wave

ζ_j i.i.d. N(0, 1),
  f(x) = ζ_0 sin(x) + ζ_1 cos(x),
  r(x) = cos(x),
  ρ = (1/2)(δ_1 + δ_{−1}).

[Figures: covariance kernel, three sample paths, spectral measure]
Toy-Example Ib - Almost periodic wave

f(x) = ζ_0 sin(x) + ζ_1 cos(x) + ζ_2 sin(√2 x) + ζ_3 cos(√2 x),
  r(x) = cos(x) + cos(√2 x),
  ρ = (1/2)(δ_1 + δ_{−1} + δ_{√2} + δ_{−√2}).

[Figures: covariance kernel, three sample paths, spectral measure]
Example II - i.i.d. sequence

f(n) = ζ_n,
  r(n) = δ_{n,0},
  dρ(λ) = (1/2π) 1_{[−π,π]}(λ) dλ.

[Figures: covariance kernel, three sample paths, spectral measure]
Example IIb - Sinc kernel

f(x) = Σ_{n∈Z} ζ_n sinc(x − n),
  r(x) = sin(πx)/(πx) = sinc(x),
  dρ(λ) = (1/2π) 1_{[−π,π]}(λ) dλ.

[Figures: covariance kernel, three sample paths, spectral measure]
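A quick numerical sanity check for the sinc example: since sinc(k) = 0 for every nonzero integer k, evaluating the series at an integer point recovers the corresponding i.i.d. coefficient — this is the sense in which Example IIb interpolates Example II. A sketch, truncating the series to |n| ≤ 100 (an arbitrary cutoff):

```python
import numpy as np

rng = np.random.default_rng(0)
ns = np.arange(-100, 101)            # truncation |n| <= 100 (arbitrary cutoff)
zeta = rng.standard_normal(ns.size)  # i.i.d. N(0,1) coefficients zeta_n

def f(x):
    # np.sinc(t) = sin(pi t)/(pi t), exactly the kernel in the example
    return float(np.sum(zeta * np.sinc(x - ns)))

# At integer times the interpolation reproduces the i.i.d. sequence:
assert abs(f(7) - zeta[ns == 7][0]) < 1e-12
```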
Example III - Gaussian Covariance (Fock-Bargmann)

f(x) = e^{−x²/2} Σ_{n≥0} ζ_n x^n / √(n!),
  r(x) = e^{−x²/2},
  dρ(λ) = (1/√(2π)) e^{−λ²/2} dλ.

[Figures: covariance kernel, three sample paths, spectral measure]
Example IV - Exponential Covariance (Ornstein-Uhlenbeck)

r(x) = e^{−|x|},
  dρ(λ) = (1/π) · 1/(λ² + 1) dλ.

[Figures: covariance kernel, three sample paths, spectral measure]
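The OU process is Markov, so it can be simulated exactly on an equispaced grid by the AR(1) recursion f(t + Δ) = e^{−Δ} f(t) + √(1 − e^{−2Δ}) ξ_t with ξ_t i.i.d. N(0, 1). A sketch, assuming NumPy; the helper name `sample_ou` is illustrative:

```python
import numpy as np

def sample_ou(step, n_steps, rng=None):
    """Exact simulation of the stationary OU process (r(x) = e^{-|x|})
    on an equispaced grid of spacing `step`, via its AR(1) recursion."""
    rng = np.random.default_rng() if rng is None else rng
    a = np.exp(-step)               # one-step correlation e^{-step}
    x = np.empty(n_steps)
    x[0] = rng.standard_normal()    # stationary N(0, 1) start
    for k in range(1, n_steps):
        x[k] = a * x[k - 1] + np.sqrt(1.0 - a * a) * rng.standard_normal()
    return x

x = sample_ou(0.1, 200_000, rng=np.random.default_rng(1))
# empirical lag-1 correlation should be close to e^{-0.1} ≈ 0.905
```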
Persistence Probability

Persistence: the persistence probability of a stochastic process f over a level ℓ ∈ R in the time interval (0, N] is

  P_f(N) := P(f(x) > ℓ, ∀x ∈ (0, N]).

[Figure: picture of persistence]
Persistence Probability

Persistence (above the mean): the persistence probability of a centered stochastic process f in the time interval (0, N] is

  P_f(N) := P(f(x) > 0, ∀x ∈ (0, N]).

[Figure: picture of persistence]

Question: For a GSP f, what is the behavior of P_f(N) as N → ∞?

Guess: "typically" P_f(N) ≍ e^{−θN}.

Toy Examples:
  (X_n)_{n∈Z} i.i.d.:   P_X(N) = 2^{−N}.
  Y_n = X_{n+1} − X_n:  P_Y(N) = 1/(N+1)! ≍ e^{−N log N}.
  Z_n ≡ Z_0:            P_Z(N) = P(Z_0 > 0) = 1/2.
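The toy examples are easy to check by Monte Carlo; e.g. for the i.i.d. sequence the empirical persistence frequency should be close to 2^{−N}. A sketch, assuming NumPy; the helper name `persistence_iid` is illustrative:

```python
import numpy as np

def persistence_iid(N, trials=200_000, seed=0):
    """Monte Carlo estimate of P_X(N) = P(X_1 > 0, ..., X_N > 0)
    for an i.i.d. standard Gaussian sequence (exact value: 2^{-N})."""
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal((trials, N))
    return np.all(samples > 0, axis=1).mean()

p = persistence_iid(5)   # exact answer: 2^{-5} = 0.03125
```

The same estimator with correlated samples (e.g. from a Cholesky-based GSP sampler) gives a numerical feel for the exponential-decay guess above.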
History and Motivation

Engineering and Applied Mathematics (1940–1970)

1944 Rice - "Mathematical Analysis of Random Noise".
  Mean number of level-crossings (Rice formula).
  Behavior of P(t) for t ≪ 1 (short range).

1962 Slepian - "One-sided barrier problem".
  Slepian's inequality: r_1(x) ≥ r_2(x) ⇒ P_1(N) ≥ P_2(N).
  Specific cases.

1962 Newell & Rosenblatt
  If r(x) → 0 as x → ∞, then P(N) = o(N^{−α}) for every α > 0.
  If |r(x)| < a x^{−α}, then
    P(N) ≤ e^{−CN}         if α > 1,
    P(N) ≤ e^{−CN/log N}   if α = 1,
    P(N) ≤ e^{−CN^α}       if 0 < α < 1.
  Examples with P(N) > e^{−C√(N log N)} ≫ e^{−CN} (for r(x) ≍ x^{−1/2}).

There are parallel independent results from the Soviet Union (e.g. Piterbarg, Kolmogorov).

Applicable mainly when r is non-negative or summable.