
Estimating the inter-arrival time density of semi-Markov processes - PowerPoint PPT Presentation



  1. Estimating the inter-arrival time density of semi-Markov processes under structural assumptions on the transition distribution. Priscilla (Cindy) Greenwood, School of Mathematics and Statistics, and Mathematical and Computational Sciences Modeling Center, Arizona State University; jointly with Anton Schick (Binghamton University) and Wolfgang Wefelmeyer (University of Cologne).

  2. Let $(X_0, T_0), \dots, (X_n, T_n)$ be observations of a Markov renewal process with real state space. A nonparametric estimator for the stationary density $\varrho(v)$ at $v$ of the inter-arrival times $T_j - T_{j-1}$ is
\[
\hat\varrho(v) = \frac{1}{n}\sum_{j=1}^n k_b\bigl(v - (T_j - T_{j-1})\bigr), \qquad k_b(v) = k(v/b)/b.
\]
Suppose that the inter-arrival times $T_j - T_{j-1}$ depend multiplicatively on the jump size of the embedded Markov chain:
\[
T_j - T_{j-1} = Z_j W_j \quad\text{with}\quad Z_j = |X_j - X_{j-1}|^\nu,
\]
where $\nu > 0$ and the $W_j$'s are i.i.d. and independent of the $X_j$'s. Then we can construct estimators for $\varrho(v)$ with rate $n^{-1/2}$. In the following we express rescalings by subscripts, $f_s(x) = f(x/s)/s$. Let $g$, $h$ denote the densities of $W_j$, $Z_j$. Then the density of $T_j - T_{j-1}$ is a scale mixture
\[
\varrho(v) = \int h_w(v)\, g(w)\, dw = \int h(z)\, g_z(v)\, dz.
\]
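The basic kernel estimator above can be sketched numerically. A minimal illustration with a Gaussian kernel and toy Exp(1) inter-arrival times (the kernel choice, data-generating distribution, and all names are illustrative, not from the talk):

```python
import numpy as np

# Minimal sketch of the kernel estimator rho_hat(v) = (1/n) sum_j k_b(v - (T_j - T_{j-1})).
# The Gaussian kernel and the Exp(1) toy data are illustrative choices, not from the talk.

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def rho_hat(v, T, b):
    """Kernel estimate at v of the inter-arrival density, from arrival times T_0,...,T_n."""
    gaps = np.diff(T)                                    # inter-arrival times T_j - T_{j-1}
    return np.mean(gaussian_kernel((v - gaps) / b) / b)  # k_b(u) = k(u/b)/b

rng = np.random.default_rng(0)
T = np.cumsum(rng.exponential(1.0, size=2001))           # toy renewal process with Exp(1) gaps
print(rho_hat(1.0, T, b=0.2))                            # close to the Exp(1) density e^{-1} at v = 1
```

This plain estimator has the usual nonparametric rate; the point of the talk is that the multiplicative structure below allows the faster rate $n^{-1/2}$.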

  3. The density $h$ of $Z_j = |X_j - X_{j-1}|^\nu$ is calculated as follows. Let $p_1(x)$ and $q(x,y)$ denote the stationary density and the transition density of the embedded chain. The conditional density at $y$ of $|X_j - X_{j-1}|$ given $X_{j-1} = x$ is
\[
\gamma(x,y) = \bigl(q(x, x+y) + q(x, x-y)\bigr)\,\mathbf{1}(y > 0).
\]
Then the conditional density at $y$ of $Z_j = |X_j - X_{j-1}|^\nu$ given $X_{j-1} = x$ is
\[
\zeta(x,y) = \frac{1}{\nu}\, y^{1/\nu - 1}\, \gamma\bigl(x, y^{1/\nu}\bigr).
\]
Hence the stationary density at $y$ of $Z_j$ is
\[
h(y) = \frac{1}{\nu}\, y^{1/\nu - 1} \int p_1(x)\, \gamma\bigl(x, y^{1/\nu}\bigr)\, dx.
\]
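The change-of-variables step can be checked by simulation. A sketch with $\gamma$ taken to be the Exp(1) density and $\nu = 2$ (illustrative choices only):

```python
import numpy as np

# If Y = |X_j - X_{j-1}| has density gamma(y) on (0, inf), then Z = Y^nu has density
# zeta(z) = (1/nu) z^(1/nu - 1) gamma(z^(1/nu)). Toy check with gamma = Exp(1), nu = 2.

nu = 2.0
rng = np.random.default_rng(1)
Y = rng.exponential(1.0, size=200_000)
Z = Y**nu

def zeta(z):
    return (1 / nu) * z**(1 / nu - 1) * np.exp(-z**(1 / nu))

# Compare the empirical mass of Z on [a, b] with the integral of zeta over [a, b].
a, b = 0.5, 2.0
empirical = np.mean((Z >= a) & (Z <= b))
grid = np.linspace(a, b, 1001)
vals = zeta(grid)
theoretical = np.sum((vals[1:] + vals[:-1]) / 2) * (grid[1] - grid[0])  # trapezoid rule
print(empirical, theoretical)  # both close to 0.25
```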

  4. A ("kernel") estimator of the density $\varrho(v)$ of the inter-arrival times $T_j - T_{j-1} = Z_j W_j$ at $v$ can be based on the $n^2$ "observations" $Z_i W_j$; this gives the local U-statistic
\[
\hat\varrho(v) = \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n k_b(v - Z_i W_j)
\]
with $k_b(v) = k(v/b)/b$ a kernel $k$ scaled by a bandwidth $b$. Similar local U-statistics for i.i.d. observations are studied by Frees (1994) and Giné and Mason (2007). Those results are not applicable here because (a) the $Z_i$'s are not independent, and (b) an integrability condition fails. Nevertheless, we show that our density estimator $\hat\varrho(v)$ has rate $n^{-1/2}$ pointwise, but that a functional central limit theorem does not hold, in general.
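The local U-statistic averages the kernel over all $n^2$ products. A minimal sketch (Gaussian kernel; the independent toy draws of $Z$ and $W$ below are illustrative stand-ins, not the dependent data of the talk):

```python
import numpy as np

# Sketch of the local U-statistic over all n^2 products Z_i W_j (Gaussian kernel;
# the independent toy draws of Z and W below are illustrative, not from the talk).

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def rho_hat_U(v, Z, W, b):
    """(1/n^2) sum_i sum_j k_b(v - Z_i W_j)."""
    prod = np.outer(Z, W)                                # all n^2 products Z_i W_j
    return np.mean(gaussian_kernel((v - prod) / b) / b)

rng = np.random.default_rng(2)
n = 1000
Z = rng.uniform(0.5, 1.5, size=n)                        # stand-in for |X_j - X_{j-1}|^nu
W = rng.exponential(1.0, size=n)
print(rho_hat_U(1.0, Z, W, b=0.1))                       # estimates the density of ZW at v = 1
```

Forming the full outer product costs $O(n^2)$ memory; for large $n$ one would loop over blocks instead.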

  5. We apply the Hoeffding decomposition to our local U-statistic
\[
\hat\varrho(v) = \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n k_b(v - Z_i W_j).
\]
The conditional mean of $k_b(v - ZW)$ given $W$ is (change variables)
\[
H(W) = \int h_W(v - bu)\, k(u)\, du;
\]
the conditional mean given $Z$ is
\[
G(Z) = \int g_Z(v - bu)\, k(u)\, du.
\]
Hence (by the Hoeffding decomposition) $\hat\varrho(v)$ has the linear approximation
\[
\hat\varrho(v) - E k_b(v - ZW) = \frac{1}{n}\sum_{j=1}^n \bigl(G(Z_j) - EG(Z) + H(W_j) - EH(W)\bigr) + o_P(n^{-1/2}).
\]
The linear approximation is a smoothed empirical process.
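The decomposition can be illustrated numerically: with $G$ and $H$ approximated by Monte Carlo over large independent samples, the U-statistic and its linear approximation agree up to a smaller-order remainder (all distributional choices below are illustrative toys):

```python
import numpy as np

# Toy check of the Hoeffding linearization: U is approximately E k_b(v - ZW)
# + (1/n) sum_j [G(Z_j) - EG] + (1/n) sum_j [H(W_j) - EH], where G(z) = E k_b(v - z W)
# and H(w) = E k_b(v - Z w). G, H, and E k_b are approximated here by Monte Carlo.

def kb(u, b):
    return np.exp(-0.5 * (u / b) ** 2) / (b * np.sqrt(2 * np.pi))

rng = np.random.default_rng(3)
n, b, v = 400, 0.3, 1.0
Z = rng.uniform(0.5, 1.5, size=n)
W = rng.exponential(1.0, size=n)

U = np.mean(kb(v - np.outer(Z, W), b))                  # the local U-statistic

Zbig = rng.uniform(0.5, 1.5, size=100_000)              # fresh samples for the projections
Wbig = rng.exponential(1.0, size=100_000)
G = lambda z: np.mean(kb(v - z * Wbig, b))
H = lambda w: np.mean(kb(v - Zbig * w, b))
EK = np.mean(kb(v - Zbig[:1000, None] * Wbig[:1000], b))  # ~ E k_b(v - ZW) = EG = EH

linear = np.mean([G(z) for z in Z]) + np.mean([H(w) for w in W]) - EK
print(U, linear)  # agree up to a remainder of smaller order than n^(-1/2)
```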

  6. Assume that $bn \to \infty$ and $b^4 n \to 0$. Then the smoothing can be removed, the bias is negligible, and our local U-statistic is approximated by a linear process:
\[
\hat\varrho(v) - \varrho(v) = \frac{1}{n}\sum_{j=1}^n \bigl(g_{Z_j}(v) - \varrho(v) + h_{W_j}(v) - \varrho(v)\bigr) + o_P(n^{-1/2})
\]
\[
= \frac{1}{n}\sum_{j=1}^n \Bigl(\frac{1}{Z_j}\, g\Bigl(\frac{v}{Z_j}\Bigr) - \varrho(v) + \frac{1}{W_j}\, h\Bigl(\frac{v}{W_j}\Bigr) - \varrho(v)\Bigr) + o_P(n^{-1/2}).
\]
Assume that the embedded chain is exponentially ergodic. Then our estimator $\hat\varrho(v)$ for the inter-arrival density has rate $n^{-1/2}$ and is asymptotically normal. (We can also show that $\hat\varrho(v)$ is asymptotically efficient.) A functional central limit theorem usually does not hold. For example, in $L_2$ we need finiteness of
\[
E\int \frac{1}{Z^2}\, g^2\Bigl(\frac{v}{Z}\Bigr)\, dv = E\Bigl[\frac{1}{Z}\Bigr] \int g^2(v)\, dv,
\]
but $E[1/Z]$ is typically infinite.
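A toy simulation of the unsmoothed linear form, taking $h$ to be the Uniform(0.5, 1.5) density and $g$ the Exp(1) density so that $E[1/Z]$ is finite and both summands have mean $\varrho(v)$ (all choices are illustrative):

```python
import numpy as np

# Toy version of (1/n) sum_j [ g_{Z_j}(v) + h_{W_j}(v) - rho(v) ], with f_s(x) = f(x/s)/s.
# Here Z ~ Uniform(0.5, 1.5) (so E[1/Z] is finite) and W ~ Exp(1); each summand has
# mean rho(v), so the average estimates rho(v) at rate n^(-1/2).

rng = np.random.default_rng(4)
n, v = 5000, 1.0
Z = rng.uniform(0.5, 1.5, size=n)
W = rng.exponential(1.0, size=n)

g = lambda x: np.exp(-x) * (x >= 0)                      # density of W
h = lambda x: ((x >= 0.5) & (x <= 1.5)).astype(float)    # density of Z

grid = np.linspace(0.5, 1.5, 2001)                       # rho(v) = int h(z) g_z(v) dz
vals = np.exp(-v / grid) / grid
rho_v = np.sum((vals[1:] + vals[:-1]) / 2) * (grid[1] - grid[0])

lin = np.mean(g(v / Z) / Z) + np.mean(h(v / W) / W) - rho_v
print(lin, rho_v)  # lin is a root-n consistent estimate of rho(v)
```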

  7. A nonparametric estimator for the conditional density $\kappa(x,v)$ at $v$ of $T_j - T_{j-1}$ given $X_{j-1} = x$ is the Nadaraya–Watson estimator
\[
\hat\kappa(x,v) = \frac{\sum_{j=1}^n k_b(x - X_{j-1})\, k_b\bigl(v - (T_j - T_{j-1})\bigr)}{\sum_{j=1}^n k_b(x - X_{j-1})}.
\]
Assume as above that
\[
T_j - T_{j-1} = Z_j W_j \quad\text{with}\quad Z_j = |X_j - X_{j-1}|^\nu.
\]
Assume, in addition, that the embedded chain is autoregressive:
\[
X_j = \vartheta X_{j-1} + \varepsilon_j
\]
with $|\vartheta| < 1$ and the $\varepsilon_j$'s i.i.d. with mean zero, finite variance, and positive density $f$. Then we can construct estimators for $\kappa(x,v)$ with rate $n^{-1/2}$. Write
\[
Z_j = |X_j - X_{j-1}|^\nu = |\varepsilon_j - (1 - \vartheta) X_{j-1}|^\nu.
\]
The variables $|\varepsilon_j - (1 - \vartheta)x|^\nu$ are i.i.d., follow the conditional distribution of $Z_j$ given $X_{j-1} = x$, and are independent of the $W_j$'s.
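The Nadaraya–Watson benchmark can be sketched directly. A minimal version with a Gaussian kernel on toy AR(1) data with Exp(1) waiting factors and $\nu = 1$ (all choices illustrative):

```python
import numpy as np

# Minimal sketch of the Nadaraya-Watson estimator of the conditional density kappa(x, v)
# of T_j - T_{j-1} given X_{j-1} = x. Gaussian kernel; AR(1) chain with Exp(1) W_j and
# nu = 1 as toy data (all choices illustrative).

def kb(u, b):
    return np.exp(-0.5 * (u / b) ** 2) / (b * np.sqrt(2 * np.pi))

def kappa_hat(x, v, X, T, b):
    """sum_j k_b(x - X_{j-1}) k_b(v - (T_j - T_{j-1})) / sum_j k_b(x - X_{j-1})."""
    weights = kb(x - X[:-1], b)
    gaps = np.diff(T)
    return np.sum(weights * kb(v - gaps, b)) / np.sum(weights)

rng = np.random.default_rng(5)
n, theta = 5000, 0.5
eps = rng.normal(0.0, 1.0, size=n)
X = np.zeros(n + 1)
for j in range(n):
    X[j + 1] = theta * X[j] + eps[j]
W = rng.exponential(1.0, size=n)
T = np.concatenate([[0.0], np.cumsum(np.abs(np.diff(X)) * W)])  # T_j - T_{j-1} = Z_j W_j, nu = 1
print(kappa_hat(0.0, 1.0, X, T, b=0.3))
```

This estimator only uses observations with $X_{j-1}$ near $x$, hence its slower nonparametric rate; the structured estimator of the next slide uses all $n^2$ pairs.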

  8. Note that the variables
\[
\varepsilon_j - (1 - \vartheta)x = X_j - x - \vartheta(X_{j-1} - x) = \varepsilon_j(x), \text{ say},
\]
are innovations of the autoregressive process shifted by $x$. Estimate $\vartheta$ by the (say, least squares) estimator $\hat\vartheta$. Estimate $\varepsilon_j(x)$ by the residual
\[
\hat\varepsilon_j(x) = X_j - x - \hat\vartheta(X_{j-1} - x).
\]
Then the conditional density of $T_j - T_{j-1}$ at $v$ given $X_{j-1} = x$ can be estimated by the local U-statistic
\[
\hat\kappa(x,v) = \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n k_b\bigl(v - |\hat\varepsilon_i(x)|^\nu W_j\bigr)
\]
with $k_b(v) = k(v/b)/b$ a kernel $k$ scaled by a bandwidth $b$. The conditional density estimator $\hat\kappa(x,v)$ can be shown to have rate $n^{-1/2}$: expand about $\vartheta$ first, then proceed similarly as for $\hat\varrho(v)$.
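The residual-based construction can be sketched end to end: estimate $\vartheta$ by least squares, form the shifted residuals, and average the kernel over all $n^2$ products (Gaussian kernel, toy AR(1) data, $\nu = 1$; all choices illustrative):

```python
import numpy as np

# Sketch of the residual-based local U-statistic: estimate theta by least squares,
# form shifted residuals eps_hat_j(x) = X_j - x - theta_hat (X_{j-1} - x), and average
# the kernel over all n^2 products |eps_hat_i(x)|^nu W_j. Toy AR(1) data, nu = 1.

def kb(u, b):
    return np.exp(-0.5 * (u / b) ** 2) / (b * np.sqrt(2 * np.pi))

def kappa_hat_U(x, v, X, W, nu, b):
    theta_hat = np.sum(X[1:] * X[:-1]) / np.sum(X[:-1] ** 2)  # least squares for AR(1)
    eps_hat = X[1:] - x - theta_hat * (X[:-1] - x)            # shifted residuals
    prod = np.outer(np.abs(eps_hat) ** nu, W)                 # n^2 "observations"
    return np.mean(kb(v - prod, b))

rng = np.random.default_rng(6)
n, theta, nu = 2000, 0.5, 1.0
eps = rng.normal(0.0, 1.0, size=n)
X = np.zeros(n + 1)
for j in range(n):
    X[j + 1] = theta * X[j] + eps[j]
W = rng.exponential(1.0, size=n)
print(kappa_hat_U(0.0, 1.0, X, W, nu, b=0.2))
```

In practice the $W_j$ are recovered as $(T_j - T_{j-1}) / Z_j$; here they are drawn directly for brevity.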

  9. Expansion of $\hat\kappa(x,v)$ about $\vartheta$ gives
\[
\hat\kappa(x,v) = \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n k_b\bigl(v - |\varepsilon_i(x)|^\nu W_j\bigr) + (\hat\vartheta - \vartheta)\, K + o_P(n^{-1/2}) \tag{1}
\]
with
\[
K = \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n (X_{i-1} - x)\, s(\varepsilon_i(x))\, W_j\, (k_b)'\bigl(v - |\varepsilon_i(x)|^\nu W_j\bigr)
\to x v \int \frac{1}{t}\, g'_{|t|^\nu}(v)\, f\bigl(t + (1 - \vartheta)x\bigr)\, dt \quad \text{in probability},
\]
where $s(x) = \operatorname{sign}(x)\, \nu |x|^{\nu - 1}$. For the first right-hand term of (1), Hoeffding decomposition and unsmoothing give
\[
\frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n k_b\bigl(v - |\varepsilon_i(x)|^\nu W_j\bigr)
= \kappa(x,v) + \frac{1}{n}\sum_{j=1}^n \bigl(\eta_{W_j}(x,v) - \kappa(x,v) + g_{|\varepsilon_j(x)|^\nu}(v) - \kappa(x,v)\bigr) + o_P(n^{-1/2}),
\]
where $\eta_w(x,v) = \eta(x, v/w)/w$.
