


  1. Estimation of the self-similarity and the stability indices through negative power variations. Thi To Nhu DANG, joint work with Jacques ISTAS, Laboratoire Jean Kuntzmann, Université Grenoble Alpes. Colloque JPS, 2016.

  2. Outline
     1. Introduction: State of the art; Preliminary
     2. Main results: H-sssi, SαS random processes (settings and assumptions; estimation of H and α; examples); H-sssi, SαS random fields (settings; results and examples); multifractional stable processes
     3. Conclusion

  4. State of the art. Self-similar processes are important in probability: they connect to limit theorems, are of great interest in modeling, and appear in geophysics, hydrology, turbulence, and economics. Stable distributions are the only distributions that can be obtained as limits of normalized sums of i.i.d. random variables.

  7. State of the art. Let a = (a_0, ..., a_K), with K, L ∈ ℕ, be such that for q = 0, ..., L:
     ∑_{k=0}^{K} k^q a_k = 0,   ∑_{k=0}^{K} k^{L+1} a_k ≠ 0.
     E.g. K = 2, L = 1: (a_0, a_1, a_2) = (−1, 2, −1). The increments of the process X with respect to a are defined by
     △_{p,n} X = ∑_{k=0}^{K} a_k X((k+p)/n)   (1)
     A usual statistical tool is the φ-variations:
     V_n(φ, X) = (1/(n−K+1)) ∑_{p=0}^{n−K} φ(|△_{p,n} X|)
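The filter a and the φ-variations above translate directly into code. A minimal sketch in numpy (the helper names `increments` and `phi_variations` are illustrative, not from the talk):

```python
import numpy as np

def increments(x, a):
    # Eq. (1): Delta_{p,n} X = sum_k a_k X((k+p)/n),
    # where x[j] = X(j/n), j = 0, ..., n (so len(x) = n + 1).
    a = np.asarray(a, dtype=float)
    K = len(a) - 1
    return np.array([np.dot(a, x[p:p + K + 1]) for p in range(len(x) - K)])

def phi_variations(x, a, phi):
    # V_n(phi, X): average of phi(|Delta_{p,n} X|) over p.
    return np.mean(phi(np.abs(increments(x, a))))

# The filter a = (-1, 2, -1) (K = 2, L = 1) kills affine trends:
x_lin = np.linspace(0.0, 1.0, 11)          # X(t) = t sampled at j/10
d = increments(x_lin, (-1.0, 2.0, -1.0))   # all increments vanish
```

The vanishing-moment conditions on a are exactly what make the increments insensitive to polynomial trends of degree up to L, which is why (−1, 2, −1) (a discrete second difference) annihilates linear paths.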

  12. State of the art.
      For a fBm with finite variance, generalized quadratic variations (φ(x) = x²) are used [Istas1997].
      Wavelets: the increments of the process X are replaced by wavelet coefficients ([Bardet2010], [Lacaux2007], [Cohen2013]).
      p-variations (φ(x) = x^p, 0 < p < α) are used for fBm and for other H-sssi processes with infinite variance (e.g. α-stable processes).
      Log-variations (φ(x) = log|x|) [Istas2012b] ⇒ require the existence of logarithmic moments; the rate of convergence is slow.
      Complex variations (φ(x) = x^{iM}, M ∈ ℝ) [Istas2012a].

  14. State of the art. For estimating α: [LeGuével2013] used p-variations (p ∈ (0, c), c = min_{u ∈ U} α(u)) to estimate the stability functions of multistable processes.
      Objective: estimate both H and α, using β-variations, β ∈ (−1/2, 0).

  17. H-sssi process. A real-valued process X is H-self-similar (H-ss) if for all a > 0,
      {X(at), t ∈ ℝ} (d)= a^H {X(t), t ∈ ℝ},
      and has stationary increments (si) if, for all s ∈ ℝ,
      {X(t+s) − X(s), t ∈ ℝ} (d)= {X(t) − X(0), t ∈ ℝ}.
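Brownian motion (H = 1/2, α = 2) gives a quick numerical illustration of the H-ss property: in the Gaussian case the marginal law is determined by its standard deviation, so it suffices to compare the empirical standard deviations of X(at) and a^H X(t). A minimal Monte-Carlo sketch (grid size, sample count, and seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
H, a = 0.5, 4.0          # Brownian motion is H-sssi with H = 1/2
n, m = 1000, 20000       # grid size and number of Monte-Carlo paths

# m Brownian paths sampled on {1/n, 2/n, ..., 1}
paths = np.cumsum(rng.normal(scale=np.sqrt(1.0 / n), size=(m, n)), axis=1)

t_idx = n // 8                         # t = 1/8, so a*t = 1/2
std_at = paths[:, int(a * t_idx) - 1].std()
std_scaled = (a ** H) * paths[:, t_idx - 1].std()
# H-self-similarity predicts std_at ≈ std_scaled, up to Monte-Carlo error
```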

  21. α-stable process. A r.v. X is said to have a symmetric α-stable distribution (SαS) if there are parameters 0 < α ≤ 2, σ > 0 such that its characteristic function has the following form:
      E e^{iθX} = exp(−σ^α |θ|^α).
      We can write X ∼ S_α(σ, 0, 0). If σ = 1, the SαS r.v. is said to be standard.
      X = (X_1, ..., X_n) is a symmetric stable random vector if any linear combination of the components of X is symmetric α-stable (α ∈ (0, 2]).
      {X(t), t ∈ T} is symmetric stable if all of its finite-dimensional distributions are symmetric stable.
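SαS samples can be drawn with scipy.stats.levy_stable (skewness parameter 0 gives the symmetric case), and the empirical characteristic function compared against exp(−σ^α |θ|^α). A small sketch, with sample size, θ, and seed chosen only for illustration:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)
alpha, sigma = 1.5, 1.0   # X ~ S_alpha(sigma, 0, 0); sigma = 1 is standard

# Symmetric alpha-stable draws: skewness 0, location 0, scale sigma
x = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=sigma, size=200_000,
                    random_state=rng)

theta = 0.7
ecf = np.mean(np.exp(1j * theta * x)).real        # empirical E e^{i theta X}
cf = np.exp(-sigma**alpha * abs(theta)**alpha)    # exp(-sigma^a |theta|^a)
```

Note that for the symmetric case (skewness 0) scipy's parameterizations coincide, so the scale argument matches σ in the characteristic function above.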

  24. Settings and assumptions. Let X be an H-sssi, SαS random process (α ∈ (0, 2]). The increments of X with respect to a are defined by
      △_{p,n} X = ∑_{k=0}^{K} a_k X((k+p)/n)   (2)
      Let β ∈ ℝ, −1/2 < β < 0, and set
      V_n(β) = (1/(n−K+1)) ∑_{p=0}^{n−K} |△_{p,n} X|^β   (3)
      W_n(β) = n^{βH} V_n(β)   (4)
      H_n = (1/β) log_2 ( V_{n/2}(β) / V_n(β) )   (5)
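A quick sanity check of the estimator (5), run on Brownian motion (H = 1/2, α = 2), which is easy to simulate exactly on a grid; the filter, β, sample size, and seed below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
beta = -0.25                      # beta in (-1/2, 0)
a = np.array([-1.0, 2.0, -1.0])   # filter with K = 2, L = 1

def V(x, a, beta):
    # Eq. (3): negative-power variations of the sampled path x[j] = X(j/n)
    K = len(a) - 1
    d = np.array([np.dot(a, x[p:p + K + 1]) for p in range(len(x) - K)])
    return np.mean(np.abs(d) ** beta)

# Brownian motion sampled at j/n, j = 0, ..., n
n = 2**14
bm = np.concatenate([[0.0],
                     np.cumsum(rng.normal(scale=np.sqrt(1.0 / n), size=n))])

# Eq. (5): H_n = (1/beta) * log2( V_{n/2}(beta) / V_n(beta) );
# bm[::2] is the same path on the coarser grid j/(n/2)
H_hat = (1.0 / beta) * np.log2(V(bm[::2], a, beta) / V(bm, a, beta))
```

The scaling behind (5): self-similarity gives |△_{p,n} X| ≈ n^{−H} |Z| in law, so V_n(β) ≈ n^{−βH} E|Z|^β; halving n multiplies V by 2^{βH}, and taking log₂ and dividing by β recovers H.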

  26. An estimator of α. Let u, v ∈ ℝ be such that 0 < v < u. Define
      g_{u,v} : (0, +∞) → ℝ,   g_{u,v}(x) = u ln Γ(1 + vx) − v ln Γ(1 + ux),
      h_{u,v} : (0, +∞) → (−∞, 0),   h_{u,v}(x) = g_{u,v}(1/x).

  27. An estimator of α.
      ψ_{u,v} : ℝ₊ × ℝ₊ → ℝ,   ψ_{u,v}(x, y) = −v ln x + u ln y + C(u, v),
      C(u, v) = ((u−v)/2) ln π + u ln Γ(1 + v/2) + v ln Γ((1−u)/2) − v ln Γ(1 + u/2) − u ln Γ((1−v)/2).
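The auxiliary functions g_{u,v} and h_{u,v} translate directly into code via scipy.special.gammaln (the log-gamma function); the numeric example below just illustrates that h_{u,v} indeed lands in (−∞, 0), and the function names are illustrative:

```python
import numpy as np
from scipy.special import gammaln

def g(u, v, x):
    # g_{u,v}(x) = u ln Gamma(1 + v x) - v ln Gamma(1 + u x), for 0 < v < u
    return u * gammaln(1.0 + v * x) - v * gammaln(1.0 + u * x)

def h(u, v, x):
    # h_{u,v}(x) = g_{u,v}(1/x), mapping (0, +inf) into (-inf, 0)
    return g(u, v, 1.0 / x)

# e.g. u = 1, v = 1/2: h(1, 0.5, 0.5) = g(1, 0.5, 2)
#      = ln Gamma(2) - 0.5 * ln Gamma(3) = -0.5 * ln 2 < 0
val = h(1.0, 0.5, 0.5)
```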
