One Step Studentized M-Estimator


1. One Step Studentized M-Estimator
Marek Omelka, Department of Probability and Mathematical Statistics, Charles University in Prague (omelka@karlin.mff.cuni.cz)
EYSM 2005, August 22–26, 2005, Debrecen

2. Outline
1. Motivation
2. Consistency of the one-step estimator
3. Hadamard differentiability and asymptotic normality
4. Bootstrapping
5. Numerical results

3. Motivation
Let $X_1, \dots, X_n$ be independent random variables, identically distributed according to a distribution function $F$. In the robust setting we usually suppose that $F \in \mathcal{H}_\varepsilon(F_0)$, where
$$\mathcal{H}_\varepsilon(F_0) = \Big\{ F : F(x) = (1 - \varepsilon)\, F_0\Big(\frac{x - \mu}{\sigma}\Big) + \varepsilon H(x) \Big\}.$$
Our aim is to estimate the location parameter $\mu$.

4. Motivation (continued)
The studentized M-estimator is defined as
$$M_n = \arg\min_{t \in \mathbb{R}} \sum_{i=1}^n \rho\Big(\frac{X_i - t}{S_n}\Big),$$
where $S_n$ is an appropriate estimate of the (nuisance) scale parameter. In the following we will suppose that the function $\rho$ is symmetric.

5. Motivation (continued)
We usually find the M-estimator as the root of the equation
$$\sum_{i=1}^n \psi\Big(\frac{X_i - M_n}{S_n}\Big) = 0, \qquad (1)$$
where $\psi = \rho'$.

6. Motivation (continued)
As a quick approximation to the solution of (1), the so-called one-step estimator is used, defined as
$$M_n^{(1)} = M_n^{(0)} + S_n\, \frac{\frac{1}{n} \sum_{i=1}^n \psi\big(\frac{X_i - M_n^{(0)}}{S_n}\big)}{\frac{1}{n} \sum_{i=1}^n \psi'\big(\frac{X_i - M_n^{(0)}}{S_n}\big)}, \qquad (2)$$
where $M_n^{(0)}$ is an initial (scale-equivariant) estimate of location.
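The slides contain no code, but formula (2) is simple to implement. Below is a minimal Python sketch; the Huber ψ with tuning constant k = 1.345, the sample median as the initial estimate, and the normalized MAD as the scale are my own choices, not specified in the talk:

```python
import numpy as np

def huber_psi(x, k=1.345):
    """Huber psi function: clips x to the interval [-k, k]."""
    return np.clip(x, -k, k)

def huber_psi_prime(x, k=1.345):
    """Derivative of the Huber psi: 1 on [-k, k], 0 outside."""
    return (np.abs(x) <= k).astype(float)

def one_step_m_estimator(x, k=1.345):
    """One-step studentized M-estimator of location, formula (2):
    one Newton-type step from the median, studentized by the normalized MAD."""
    m0 = np.median(x)                              # initial location estimate
    s_n = 1.4826 * np.median(np.abs(x - m0))       # MAD, rescaled for consistency at N(0, 1)
    r = (x - m0) / s_n                             # studentized residuals
    return m0 + s_n * huber_psi(r, k).mean() / huber_psi_prime(r, k).mean()

# Example: a normal sample with 10% gross outliers
rng = np.random.default_rng(0)
x = np.where(rng.random(500) < 0.1, rng.normal(10.0, 1.0, 500), rng.normal(0.0, 1.0, 500))
print(one_step_m_estimator(x))                     # close to 0, unlike the sample mean
```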

7. Motivation (continued)
Reasons for using the one-step estimator $M_n^{(1)}$:
• it is easy to compute
• simulation studies show good properties (e.g. Andrews et al. (1972))
• in the case of a symmetric distribution function $F$, it has the same asymptotic efficiency as $M_n$
• at least asymptotically, it avoids the problem of multiple roots of the defining equation for the M-estimator
• it usually has a lower bias than $M_n$ (e.g. Rousseeuw and Croux (1994))

8. Motivation (continued)
[Figure.] One step with proper length in the right direction is often enough.

9. Consistency
To study asymptotic properties of the one-step estimator, it is convenient to view the estimator as a statistical functional $T : \mathcal{D}(F) \to \mathbb{R}$.

10. Consistency (continued)
Some examples of statistical functionals:
• median: $\mathrm{med}(F) = F^{-1}(\tfrac{1}{2})$
• MAD: $\mathrm{MAD}(F) = \inf\{ t : F(F^{-1}(\tfrac{1}{2}) + t) - F(F^{-1}(\tfrac{1}{2}) - t) > \tfrac{1}{2} \}$
• studentized M-estimator: $M(F)$ defined by $\mathrm{E}_F\, \psi\big(\frac{X_1 - M(F)}{S(F)}\big) = 0$
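As an aside not taken from the slides, evaluating these functionals at the empirical distribution function $F_n$ recovers the familiar sample quantities (up to tie and interpolation conventions); a quick Python illustration:

```python
import numpy as np

# Plug-in principle: the estimator is the functional evaluated at the
# empirical distribution function F_n of the sample X_1, ..., X_n.
x = np.random.default_rng(1).standard_cauchy(200)

med_Fn = np.median(x)                      # med(F_n) = F_n^{-1}(1/2)
mad_Fn = np.median(np.abs(x - med_Fn))     # MAD(F_n)
print(med_Fn, mad_Fn)
```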

11. Consistency (continued)
The statistical functional for the one-step estimator is
$$T_1(F) = T_0(F) + S(F)\, \frac{\mathrm{E}_F\, \psi\big(\frac{X_1 - T_0(F)}{S(F)}\big)}{\mathrm{E}_F\, \psi'\big(\frac{X_1 - T_0(F)}{S(F)}\big)}.$$
Note: if $F$ is symmetric, then usually $T_0(F) = M(F)$ and so also $T_1(F) = M(F)$.

12. Consistency (continued)
By consistency of the one-step estimator we will understand that
$$T_1(F_n) \xrightarrow[n \to \infty]{} T_1(F) \quad [F]\text{-a.s.}$$

13. Consistency (continued)
As
$$T_1(F_n) = T_0(F_n) + S(F_n)\, \frac{\frac{1}{n} \sum_{i=1}^n \psi\big(\frac{X_i - T_0(F_n)}{S(F_n)}\big)}{\frac{1}{n} \sum_{i=1}^n \psi'\big(\frac{X_i - T_0(F_n)}{S(F_n)}\big)},$$
for consistency of $T_1(F_n)$ we need at least consistency of the estimators $T_0(F_n)$ and $S(F_n)$, and also continuity of the functions
$$\lambda_F(s, t) = \mathrm{E}_F\, \psi\Big(\frac{X_1 - t}{s}\Big), \qquad \lambda'_F(s, t) = \mathrm{E}_F\, \psi'\Big(\frac{X_1 - t}{s}\Big)$$
at the point $(S(F), T_0(F))$.

14. Consistency - artificial example
A very popular ψ-function is the Huber function, defined as
$$\psi_k(x) = \begin{cases} -k, & x < -k, \\ x, & |x| \le k, \\ k, & x > k. \end{cases}$$
[Figure: plot of $\psi_k$ for $x \in [-4, 4]$, with values in $[-1.5, 1.5]$.]
The problem with this function is its nondifferentiability at the points $\pm k$.
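The following observation is implicit here rather than stated on the slide: since $\psi'_k(x) = \mathbf{1}\{|x| \le k\}$ (under one convention for the boundary points),
$$\lambda'_F(s, t) = \mathrm{E}_F\, \psi'_k\Big(\frac{X_1 - t}{s}\Big) = P_F\big(|X_1 - t| \le k s\big),$$
so $\lambda'_F$ is discontinuous at any point $(s, t)$ for which $F$ has an atom exactly at $t - ks$ or $t + ks$. The artificial example on the next slides is built around precisely this situation.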

15. Consistency - artificial example (continued)
Let the distribution function be $F(x) = 0.7\, \Phi(x) + 0.3\, \delta_2$, where $\Phi$ is the d.f. of $N(0, 1)$ and $\delta_2$ is the Dirac measure at the point 2. Let the initial estimate be the median, that is, $T_0(F) = \Phi^{-1}\big(\frac{1}{2(1 - 0.3)}\big) \doteq 0.566$, and for simplicity fix the scale as $S(F) = 1$. Then for $k = 2 - T_0(F)$ the function $\lambda'_F(t) = \mathrm{E}_F\, \psi'_k(X_1 - t)$ is not continuous at the point $T_0(F)$, which causes inconsistency of the one-step estimator.
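The slides report only the resulting histograms, but the example is easy to reproduce by simulation. A sketch under the setup above (Huber ψ with k = 2 − T_0(F), the sample median as the initial estimate, scale fixed at 1); the sample size and number of replications are my own choices:

```python
import numpy as np
from scipy.stats import norm

# Assumed reconstruction of the example: F = 0.7 * N(0, 1) + 0.3 * (point mass at 2),
# Huber psi with k = 2 - T_0(F), initial estimate = sample median, scale fixed at 1.
T0_F = norm.ppf(1 / (2 * (1 - 0.3)))        # ≈ 0.566
k = 2 - T0_F

def psi(x):
    return np.clip(x, -k, k)

def psi_prime(x):
    return (np.abs(x) <= k).astype(float)

def one_step(x):
    m0 = np.median(x)                        # initial location estimate T_0(F_n)
    r = x - m0                               # scale fixed at S(F) = 1
    return m0 + psi(r).mean() / psi_prime(r).mean()

rng = np.random.default_rng(2005)
n, n_rep = 10_000, 500
estimates = np.array([
    one_step(np.where(rng.random(n) < 0.3, 2.0, rng.standard_normal(n)))
    for _ in range(n_rep)
])
# The replications do not settle on one value: they split into two clusters
# (near 0.69 and 0.75), depending on whether the sample median falls below
# or above T_0(F) = 2 - k.
print(estimates.min(), estimates.max())
```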

16. Consistency - artificial example (continued)
The function $\lambda'_F(t) = \mathrm{E}_F\, \psi'_k(X_1 - t)$:
[Figure: plot of $\lambda'_F(t)$ for $t \in [0, 1]$ (values between 0.5 and 1.0), with a jump at $t = T_0(F)$.]

17. Consistency - artificial example (continued)
[Figure: the same plot of $\lambda'_F(t)$ as on the previous slide.]
A simple calculation shows that the one-step estimator should oscillate between the values 0.69 and 0.75.
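The "simple calculation" itself is not shown on the slides. One way to reproduce it (a sketch, not the author's derivation) is to evaluate the population version of formula (2) started at T_0(F), once without and once with the atom at 2 counted in λ'_F:

```python
from scipy.stats import norm

T0 = norm.ppf(1 / 1.4)                     # T_0(F) ≈ 0.566
k = 2 - T0
lo, hi = T0 - k, T0 + k                    # interval where psi_k is linear

# Numerator: lambda_F(T0) = E_F psi_k(X_1 - T0)
normal_part = (-k * norm.cdf(lo) + k * norm.sf(hi)
               + norm.pdf(lo) - norm.pdf(hi)
               - T0 * (norm.cdf(hi) - norm.cdf(lo)))
num = 0.7 * normal_part + 0.3 * k          # the atom at 2 contributes psi_k(2 - T0) = k

# Denominator: lambda'_F(t) = 0.7 * P(|Z - t| <= k) + 0.3 * 1{|2 - t| <= k},
# evaluated just below and just above its jump at t = T_0(F)
den_no_atom = 0.7 * (norm.cdf(hi) - norm.cdf(lo))   # median just below T_0(F)
den_with_atom = den_no_atom + 0.3                   # median just above T_0(F)

print(T0 + num / den_no_atom)              # ≈ 0.75
print(T0 + num / den_with_atom)            # ≈ 0.69
```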

18. Consistency - artificial example (continued)
[Figure: histogram of the one-step estimator over simulated samples, n = 1 000; the values spread roughly over the interval (0.5, 0.8).]

19. Consistency - artificial example (continued)
[Figure: histogram of the one-step estimator, n = 10 000; the values lie roughly between 0.66 and 0.80.]

20. Consistency - artificial example (continued)
[Figure: the same histogram as on the previous slide, n = 10 000.]

21. Asymptotic normality
It is quite well known that if the d.f. $F$ is symmetric, then under some mild conditions the one-step estimator $T_1(F_n)$ has the same asymptotic distribution as the (fully iterated) estimator $M(F_n)$. Under slightly more restrictive conditions we can even prove that
$$\sqrt{n}\, \big( T_1(F_n) - M(F_n) \big) = O_P(n^{-1/2}).$$

22. Asymptotic normality (continued)
To prove the asymptotic normality and some other asymptotic results, we will show that the functional
$$T_1(G) = T_0(G) + S(G)\, \frac{\mathrm{E}_G\, \psi\big(\frac{X_1 - T_0(G)}{S(G)}\big)}{\mathrm{E}_G\, \psi'\big(\frac{X_1 - T_0(G)}{S(G)}\big)}$$
is Hadamard differentiable at the point $F$.
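Not part of the talk, but the closeness of T_1(F_n) and the fully iterated M(F_n) can be checked by a small Monte Carlo experiment. A rough sketch for a symmetric, heavy-tailed F (Student t with 3 degrees of freedom); the Huber constant, the MAD scale, and obtaining M(F_n) by iterating the one-step map to a fixed point are all my own choices:

```python
import numpy as np

k = 1.345                                   # assumed Huber tuning constant

def psi(x):
    return np.clip(x, -k, k)

def psi_prime(x):
    return (np.abs(x) <= k).astype(float)

def one_step(x, m0, s):
    """One Newton-type step, formula (2), from the initial location m0."""
    r = (x - m0) / s
    return m0 + s * psi(r).mean() / psi_prime(r).mean()

def fully_iterated(x, s, tol=1e-10, max_iter=200):
    """Iterate the one-step map from the median until it numerically stabilizes."""
    m = np.median(x)
    for _ in range(max_iter):
        m_new = one_step(x, m, s)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

rng = np.random.default_rng(0)
for n in (100, 1_000, 10_000):
    diffs = []
    for _ in range(100):
        x = rng.standard_t(df=3, size=n)             # symmetric, heavy-tailed F
        s = 1.4826 * np.median(np.abs(x - np.median(x)))
        diffs.append(abs(one_step(x, np.median(x), s) - fully_iterated(x, s)))
    # If sqrt(n) (T_1(F_n) - M(F_n)) = O_P(n^{-1/2}), then n * |T_1 - M|
    # should stay stochastically bounded as n grows.
    print(n, np.median(diffs) * n)
```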
