

  1. Joint Parameter Estimation of the Ornstein-Uhlenbeck SDE driven by Fractional Brownian Motion. Luis Barboza, October 23, 2012. Department of Statistics, Purdue University. Probability Seminar.

  2. Introduction
Main objective: to study the GMM (Generalized Method of Moments) joint estimator of the drift and memory parameters of the Ornstein-Uhlenbeck SDE driven by fBm.
Joint work with Prof. Frederi Viens (Department of Statistics, Purdue University).

  3. Outline
1 Preliminaries
2 Joint estimation of Gaussian stationary processes
3 fOU Case
4 Simulation

  5. Fractional Brownian Motion
The fractional Brownian motion (fBm) with Hurst parameter $H \in (0, 1)$ is the centered Gaussian process $B^H_t$, continuous a.s., with
    $\mathrm{Cov}(B^H_t, B^H_s) = \frac{1}{2}\left(|t|^{2H} + |s|^{2H} - |t - s|^{2H}\right), \quad t, s \in \mathbb{R}.$
Some properties:
Self-similarity: for each $a > 0$, $B^H_{at} \stackrel{d}{=} a^H B^H_t$.
It admits an integral representation with respect to the standard Brownian motion over a finite interval:
    $B^H_t = \int_0^t K_H(t, s)\, dW(s).$

  6. Fractional Gaussian noise (fGn)
Definition: $N^H_t := B^H_t - B^H_{t - \alpha}$, where $\alpha > 0$.
Gaussian and stationary process.
Long-memory behavior of the increments when $H > \frac{1}{2}$ (in the sense that $\sum_n \rho(n) = \infty$).
Ergodicity.

  7. Fractional Gaussian noise (fGn)
Autocovariance function:
    $\rho_\theta(t) = \frac{1}{2}\left(|t + \alpha|^{2H} + |t - \alpha|^{2H} - 2|t|^{2H}\right)$
Spectral density:
    $f_\theta(t) = 2 c_H (1 - \cos t)\, |t|^{-1-2H}$, where $c_H = \frac{\sin(\pi H)\,\Gamma(2H + 1)}{2\pi}$ (Beran, 1994).
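The autocovariance formula can be evaluated directly; for $H > 1/2$ it decays like $H(2H-1)t^{2H-2}$, which is the long-memory behavior noted on the previous slide. A small sketch (the helper name is illustrative):

```python
import numpy as np

def fgn_autocov(t, H, alpha=1.0):
    """fGn autocovariance:
    rho(t) = (|t + a|^{2H} + |t - a|^{2H} - 2|t|^{2H}) / 2, a = alpha."""
    t = np.asarray(t, dtype=float)
    return 0.5 * (np.abs(t + alpha) ** (2 * H)
                  + np.abs(t - alpha) ** (2 * H)
                  - 2.0 * np.abs(t) ** (2 * H))
```

For $H = 1/2$ this vanishes at every nonzero lag (independent increments); for $H > 1/2$ it is positive and decays so slowly that $\sum_n \rho(n) = \infty$.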

  8. Estimation of H (fBm and fGn)
Classical methods: R/S statistic (Hurst, 1951); variance plot (heuristic method).
Robinson (1992, 1995): semiparametric Gaussian estimation (spectral information).
MLE methods: Whittle's estimator.

  9. Estimation of H
Methods using variations (filters):
Coeurjolly (2001): consistent estimator of $H \in (0, 1)$ based on the asymptotic behavior of discrete variations of the fBm. Asymptotic normality for $H < 3/4$.
Tudor and Viens (2008): proved consistency and asymptotics of the second-order variational estimator (convergence to a Rosenblatt random variable when $H > 3/4$) using Malliavin calculus.

  10. Ornstein-Uhlenbeck SDE driven by fBm
Cheridito (2003): take $\lambda, \sigma > 0$ and $\zeta$ an a.s. bounded random variable. The Langevin equation
    $X_t = \zeta - \lambda \int_0^t X_s\, ds + \sigma B^H_t, \quad t \geq 0$
has a unique strong solution, called the fractional Ornstein-Uhlenbeck process:
    $X_t = e^{-\lambda t}\left(\zeta + \sigma \int_0^t e^{\lambda u}\, dB^H_u\right), \quad 0 \leq t \leq T,$
where the integral exists in the Riemann-Stieltjes sense.
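The Langevin equation can be integrated numerically once an fBm path is available. Below is a minimal Euler sketch; the Cholesky-based fBm sampler, step count, and seed are illustrative assumptions, not the talk's method.

```python
import numpy as np

def simulate_fou(n, H, lam, sigma=1.0, T=10.0, x0=0.0, seed=0):
    """Euler scheme for dX = -lam * X dt + sigma dB^H:
    X_{k+1} = X_k - lam * X_k * dt + sigma * (B^H_{t_{k+1}} - B^H_{t_k})."""
    dt = T / n
    t = dt * np.arange(1, n + 1)
    # fBm path on the grid via Cholesky factorization of its covariance
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    bh = np.linalg.cholesky(cov) @ np.random.default_rng(seed).standard_normal(n)
    db = np.diff(bh, prepend=0.0)        # fBm increments, using B^H_0 = 0
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] - lam * x[k] * dt + sigma * db[k]
    return x
```

Starting far from zero, the path reverts toward the stationary regime at rate $\lambda$, mirroring the $e^{-\lambda t}$ factor in the explicit solution.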

  11. Properties
Cheridito (2003): stationary solution (fOU process):
    $X_t = \sigma \int_{-\infty}^t e^{-\lambda(t - u)}\, dB^H_u, \quad t > 0$
Autocovariance function (Pipiras and Taqqu, 2000):
    $\rho_\theta(t) = 2\sigma^2 c_H \int_0^\infty \frac{\cos(tx)\, x^{1 - 2H}}{\lambda^2 + x^2}\, dx$
where $c_H = \frac{\Gamma(2H + 1)\sin(\pi H)}{2\pi}$.
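The spectral integral can be evaluated numerically; at $t = 0$ it must reduce to the known stationary variance $\sigma^2 \lambda^{-2H} H \Gamma(2H)$, which gives a quick sanity check. A sketch using SciPy (plain quadrature is adequate near $t = 0$; for large $t$ the oscillatory integrand would call for a dedicated rule such as `quad`'s `weight='cos'` option):

```python
import numpy as np
from math import gamma, sin, pi
from scipy.integrate import quad

def fou_autocov(t, H, lam, sigma=1.0):
    """rho(t) = 2 sigma^2 c_H * Int_0^inf cos(t x) x^{1-2H} / (lam^2 + x^2) dx,
    with c_H = Gamma(2H+1) sin(pi H) / (2 pi)  (Pipiras and Taqqu, 2000)."""
    c_H = gamma(2 * H + 1) * sin(pi * H) / (2 * pi)
    integrand = lambda x: np.cos(t * x) * x ** (1 - 2 * H) / (lam ** 2 + x ** 2)
    val, _ = quad(integrand, 0.0, np.inf, limit=200)
    return 2.0 * sigma ** 2 * c_H * val
```
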

  12. Properties
Cheridito (2003): $X_t$ has long memory when $H > \frac{1}{2}$, due to the following approximation for large $x$:
    $\rho_\theta(x) = \frac{H(2H - 1)}{\lambda^2}\, x^{2H - 2} + O(x^{2H - 4}).$
$X_t$ is ergodic.
$X_t$ is not self-similar, but it exhibits asymptotic self-similarity (Bonami and Estrade, 2003):
    $f_\theta(x) = c_H |x|^{-1 - 2H} + O(|x|^{-3 - 2H}).$

  13. Estimation of λ given H (OU-fBm)
MLE estimators:
Kleptsyna and Le Breton (2002): MLE based on the Girsanov formula for fBm. Strong consistency when $H > 1/2$.
Tudor and Viens (2006): extended the K&L result to more general drift conditions; strong consistency of the MLE when $H < \frac{1}{2}$, using Malliavin calculus. They work with the non-stationary case.

  14. Estimation of λ given H (least-squares methods)
Hu and Nualart (2010): an estimator of $\lambda$ which is strongly consistent for $H \geq \frac{1}{2}$:
    $\tilde{\lambda}_T = \left( \frac{1}{H\,\Gamma(2H)\, T} \int_0^T X_t^2\, dt \right)^{-\frac{1}{2H}}$
$\tilde{\lambda}_T$ is asymptotically normal if $H \in \left(\frac{1}{2}, \frac{3}{4}\right)$.
The proofs of these results rely mostly on Malliavin calculus techniques.
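The estimator above is straightforward to evaluate from a discretely observed path by replacing the time integral with a Riemann sum. A sketch with $\sigma = 1$ (function name hypothetical):

```python
import numpy as np
from math import gamma

def hu_nualart_lambda(x, dt, H):
    """Hu-Nualart (2010) estimator, sigma = 1:
    lambda_T = ( (1/(H Gamma(2H) T)) * Int_0^T X_t^2 dt )^{-1/(2H)},
    with the integral approximated by a left-endpoint Riemann sum."""
    x = np.asarray(x, dtype=float)
    T = dt * len(x)
    integral = np.sum(x ** 2) * dt
    return (integral / (H * gamma(2 * H) * T)) ** (-1.0 / (2 * H))
```

The formula inverts the stationary second-moment identity $E[X_t^2] = \sigma^2 \lambda^{-2H} H \Gamma(2H)$, so a path whose average square equals $H\Gamma(2H)$ yields $\tilde{\lambda}_T = 1$.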

  15. Estimation of H and λ
Methods based on variations:
Biermé et al. (2011): for fixed $T$, they use the results in Biermé and Richard (2006) to prove consistency and asymptotic normality of the joint estimator of $(H, \sigma^2)$ for any stationary Gaussian process with asymptotic self-similarity (infill-asymptotics case).
Brouste and Iacus (2012): for $T \to \infty$ and $\alpha \to 0$, they proved consistency and asymptotic normality when $\frac{1}{2} < H < \frac{3}{4}$ for the pair $(H, \sigma^2)$ (non-stationary case).

  16. Outline
1 Preliminaries
2 Joint estimation of Gaussian stationary processes
3 fOU Case
4 Simulation

  17. Preliminaries
$X_t$: real-valued centered Gaussian stationary process with spectral density $f_{\theta_0}(x)$.
$f_\theta(x)$: continuous with respect to $x$, continuously differentiable with respect to $\theta$.
$\theta$ belongs to a compact set $\Theta \subset \mathbb{R}^p$.
Bochner's theorem:
    $\rho_\theta(s) := \mathrm{Cov}(X_{t+s}, X_t) = \int_{\mathbb{R}} \cos(sx)\, f_\theta(x)\, dx$

  18. Preliminaries
If $\rho_\theta(s)$ is a continuous function of $s$, then the process $X_t$ is ergodic.
Assumption 1: Take $\alpha > 0$ and $L$ a positive integer. Then there exists $k \in \{0, 1, \ldots, L\}$ such that $\rho_\theta(\alpha k)$ is an injective function of $\theta$.

  19. Preliminaries
Recall: $a(l) := (a_0(l), \ldots, a_L(l))$ is a discrete filter of length $L + 1$ and order $l$, for $L \in \mathbb{Z}^+$ and $l \in \{0, \ldots, L\}$, if
    $\sum_{k=0}^{L} a_k(l)\, k^p = 0$ for $0 \leq p \leq l - 1$,
    $\sum_{k=0}^{L} a_k(l)\, k^p \neq 0$ if $p = l$.
Examples: finite-difference filters, Daubechies filters (wavelets).
Assume that we can choose $L$ filters with orders $l_i \in \{1, \ldots, L\}$ for $i = 1, \ldots, L$, and an extra filter with $l_0 = 0$.
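The vanishing-moment conditions are easy to verify numerically. The finite-difference filter of order $l$, with $a_k = (-1)^k \binom{l}{k}$, is the standard example; the helper names below are illustrative.

```python
import numpy as np
from math import comb

def difference_filter(l):
    """Finite-difference filter of order l: a_k = (-1)^k * C(l, k), length l + 1."""
    return np.array([(-1) ** k * comb(l, k) for k in range(l + 1)], dtype=float)

def filter_order(a, max_p=10):
    """Smallest p with sum_k a_k * k^p != 0 (convention 0^0 = 1)."""
    k = np.arange(len(a), dtype=float)
    for p in range(max_p):
        if not np.isclose(np.sum(a * k ** p), 0.0):
            return p
    return None
```

For example, the order-2 filter $(1, -2, 1)$ annihilates constants and linear trends but not quadratics.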

  20. Preliminaries
Define the filtered process of order $l_i$ and step size $\Delta_f > 0$ at $t \geq 0$ as:
    $\varphi_i(X_t) := \sum_{q=0}^{L} a_q(l_i)\, X_{t - \Delta_f q}$
Its second moment is $V_i(\theta_0) := E[\varphi_i(X_t)^2] = \sum_{k=0}^{L} b_k(l_i)\, \rho_{\theta_0}(\Delta_f k)$.
Define the set of moment equations by:
    $g(X_t, \theta) := (g_0(X_t, \theta), \ldots, g_L(X_t, \theta))'$
where $g_i(X_t, \theta) = \varphi_i(X_t)^2 - V_i(\theta)$, for $0 \leq i \leq L$.
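Since $\varphi_i(X_t)$ is a fixed linear combination of a centered stationary process, $V_i(\theta)$ is a quadratic form in the filter coefficients, $\sum_{q,r} a_q a_r\, \rho_\theta(\Delta_f (q - r))$, which matches the slide's $\sum_k b_k(l_i)\, \rho_\theta(\Delta_f k)$ after collecting equal lags. A sketch (hypothetical helper name):

```python
import numpy as np

def filtered_variance(a, rho, dt):
    """V = E[phi(X_t)^2] = sum_{q,r} a_q a_r rho(dt * (q - r)) for a centered
    stationary process with autocovariance function rho (vectorized over lags)."""
    a = np.asarray(a, dtype=float)
    q = np.arange(len(a))
    lags = dt * (q[:, None] - q[None, :])
    return float(a @ rho(lags) @ a)
```
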

  21. Preliminaries
Assume that we have observed the stationary process $X_t$ at times $0 = t_0 < t_1 < \cdots < t_{N-1} < t_N = T$, with $\alpha := t_i - t_{i-1} > 0$ (fixed).
Assume there exists a sequence of symmetric positive-definite random matrices $\{\hat{A}_N\}$ such that $\hat{A}_N \stackrel{p}{\to} A$, with $A > 0$.
Define:
    $\hat{g}_N(\theta) := \frac{1}{N - L + 1} \sum_{i=L}^{N} g(X_{t_i}, \theta)$ (sample moments)
    $\hat{Q}_N(\theta) := \hat{g}_N(\theta)'\, \hat{A}_N\, \hat{g}_N(\theta)$
    $Q_0(\theta) := E[g(X_t, \theta)]'\, A\, E[g(X_t, \theta)]$

  22. GMM estimation
Define the GMM estimator of $\theta_0$:
    $\hat{\theta}_N := \operatorname{argmin}_{\theta \in \Theta} \hat{Q}_N(\theta) = \operatorname{argmin}_{\theta \in \Theta} \left( \frac{1}{N} \sum_{i=1}^{N} g(X_{t_i}, \theta) \right)' \hat{A}_N \left( \frac{1}{N} \sum_{i=1}^{N} g(X_{t_i}, \theta) \right).$
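The whole pipeline can be sketched end to end: empirical filtered second moments, model-implied $V_i(\theta)$, and numerical minimization of $\hat{Q}_N$. The helper below is illustrative only; the names, identity weighting, and Nelder-Mead optimizer are assumptions, not the talk's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_estimate(x, filters, dt, rho_theta, theta0, A=None):
    """Minimize Q_N(theta) = g_N(theta)' A g_N(theta), where the i-th sample
    moment is mean_t phi_i(x_t)^2 minus the model value V_i(theta)."""
    # Empirical second moments of each filtered series.
    # (np.convolve flips the filter; with a symmetric Toeplitz covariance the
    #  resulting variance is identical, so this is harmless here.)
    emp = np.array([np.mean(np.convolve(x, a, mode='valid') ** 2)
                    for a in filters])
    if A is None:
        A = np.eye(len(filters))         # identity weighting for the sketch

    def model_moments(theta):
        out = []
        for a in filters:
            q = np.arange(len(a))
            lags = dt * (q[:, None] - q[None, :])
            out.append(a @ rho_theta(lags, theta) @ a)   # V_i(theta)
        return np.array(out)

    def Q(theta):
        g = emp - model_moments(theta)
        return g @ A @ g

    return minimize(Q, theta0, method='Nelder-Mead').x
```

As a toy check, white noise with variance $\theta$ and the filters $(1)$ and $(1, -1)$ give moment equations whose minimizer recovers the noise variance.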

  23. Consistency
Lemma 1: Under the above assumptions:
(i) $\sup_{\theta \in \Theta} |\hat{Q}_N(\theta) - Q_0(\theta)| \stackrel{a.s.}{\to} 0$.
(ii) $Q_0(\theta) = 0$ if and only if $\theta = \theta_0$.

  24. Consistency
Key aspects of the proof:
Ergodicity of $X_t$.
Continuity of $\rho_\theta(\cdot)$ over the compact set $\Theta$.
Injectivity of:
    $\rho_\theta(\alpha\,\cdot) := \left(\rho_\theta(\alpha \cdot 0), \ldots, \rho_\theta(\alpha \cdot L)\right)'$
Results of Newey and McFadden (1994) on GMM.

  25. Consistency
We have all the conditions to apply:
Theorem 1 (Newey and McFadden, 1994): Under the above assumptions, it holds that $\hat{\theta}_N \stackrel{a.s.}{\to} \theta_0$.
