Lecture 8: ARMA Models (Colin Rundel, 02/13/2017)


  1. Lecture 8: ARMA Models. Colin Rundel, 02/13/2017.

  2. AR(p)

  3. AR(p). From last time,

        AR(p): y_t = δ + ϕ_1 y_{t−1} + ϕ_2 y_{t−2} + ⋯ + ϕ_p y_{t−p} + w_t
                   = δ + w_t + ∑_{i=1}^{p} ϕ_i y_{t−i}

     What are the properties of AR(p)?
     1. Expected value?
     2. Covariance / correlation?
     3. Stationarity?

  4-5. Lag operator. The lag operator is convenience notation for writing out AR (and
     other) time series models. We define the lag operator L as follows,

        L y_t = y_{t−1}

     therefore,

        L² y_t = L (L y_t) = L y_{t−1} = y_{t−2}

     and this can be generalized where,

        L^k y_t = y_{t−k}

  6-7. Lag polynomial. An AR(p) model can be rewritten as

        y_t = δ + ϕ_1 y_{t−1} + ϕ_2 y_{t−2} + ⋯ + ϕ_p y_{t−p} + w_t
        y_t = δ + ϕ_1 L y_t + ϕ_2 L² y_t + ⋯ + ϕ_p L^p y_t + w_t
        y_t − ϕ_1 L y_t − ϕ_2 L² y_t − ⋯ − ϕ_p L^p y_t = δ + w_t
        (1 − ϕ_1 L − ϕ_2 L² − ⋯ − ϕ_p L^p) y_t = δ + w_t

     This polynomial of the lags,

        ϕ_p(L) = (1 − ϕ_1 L − ϕ_2 L² − ⋯ − ϕ_p L^p),

     is called the lag or characteristic polynomial of the AR process.
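
A minimal sketch (not from the slides, Python, made-up coefficients): simulate an AR(2) and check numerically that applying the lag polynomial ϕ_2(L) = 1 − ϕ_1 L − ϕ_2 L², i.e. computing y_t − ϕ_1 y_{t−1} − ϕ_2 y_{t−2}, recovers δ + w_t.

    import numpy as np

    rng = np.random.default_rng(3)
    phi1, phi2, delta, n = 0.5, 0.3, 2.0, 1_000
    w = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = delta + phi1 * y[t - 1] + phi2 * y[t - 2] + w[t]

    # Apply phi_2(L) = 1 - phi_1 L - phi_2 L^2 to y_t for t >= 2
    lhs = y[2:] - phi1 * y[1:-1] - phi2 * y[:-2]
    print(np.allclose(lhs, delta + w[2:]))  # True: phi_p(L) y_t = delta + w_t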

  8-9. Stationarity of AR(p) processes. An AR(p) process is stationary if the roots of
     the characteristic polynomial lie outside the complex unit circle. Example AR(1):
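
The AR(1) example on this slide did not survive extraction; a standard sketch of it (my reconstruction, not verbatim from the slides): for AR(1), y_t = δ + ϕ y_{t−1} + w_t, the characteristic polynomial is ϕ_1(L) = 1 − ϕ L, whose single root is L = 1/ϕ. That root lies outside the unit circle, |1/ϕ| > 1, exactly when |ϕ| < 1, recovering the usual AR(1) stationarity condition.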

  10. Example AR(2)

  11. AR(2) Stationarity Conditions. From http://www.sfu.ca/~baa7/Teaching/econ818/StationarityAR2.pdf
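
The standard "triangle" conditions for AR(2) stationarity are ϕ_1 + ϕ_2 < 1, ϕ_2 − ϕ_1 < 1, and |ϕ_2| < 1. A minimal sketch (not from the slides, Python, made-up coefficients) checking a specific AR(2) both by the roots of the lag polynomial and by these conditions:

    import numpy as np

    def ar_roots(phi):
        # roots of 1 - phi_1 L - ... - phi_p L^p (np.roots wants highest degree first)
        phi = np.asarray(phi, dtype=float)
        return np.roots(np.r_[-phi[::-1], 1.0])

    def ar2_triangle(phi1, phi2):
        return (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

    phi1, phi2 = 0.5, 0.3
    print(np.all(np.abs(ar_roots([phi1, phi2])) > 1))  # True -> stationary
    print(ar2_triangle(phi1, phi2))                    # True -> agrees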

  12. Proof. We can rewrite the AR(p) model into an AR(1) form using matrix notation:

        y_t = δ + ϕ_1 y_{t−1} + ϕ_2 y_{t−2} + ⋯ + ϕ_p y_{t−p} + w_t
        ξ_t = δ + F ξ_{t−1} + w_t

     where

        ξ_t = (y_t, y_{t−1}, …, y_{t−p+1})′
        δ   = (δ, 0, …, 0)′
        w_t = (w_t, 0, …, 0)′

        F = [ ϕ_1  ϕ_2  ϕ_3  ⋯  ϕ_{p−1}  ϕ_p ]
            [  1    0    0   ⋯     0      0  ]
            [  0    1    0   ⋯     0      0  ]
            [  ⋮    ⋮    ⋮   ⋱     ⋮      ⋮  ]
            [  0    0    0   ⋯     1      0  ]

     so the first row of ξ_t = δ + F ξ_{t−1} + w_t reads
     y_t = δ + w_t + ∑_{i=1}^{p} ϕ_i y_{t−i}, the original AR(p) equation.
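
A minimal sketch (not from the slides, Python, made-up coefficients): build the companion matrix F for an AR(3) and check that iterating ξ_t = δ + F ξ_{t−1} + w_t reproduces the scalar AR(3) recursion.

    import numpy as np

    rng = np.random.default_rng(1)
    phi = np.array([0.5, -0.3, 0.1])  # phi_1, phi_2, phi_3 (hypothetical)
    delta, p, n = 2.0, 3, 200

    F = np.zeros((p, p))
    F[0, :] = phi                     # first row holds the AR coefficients
    F[1:, :-1] = np.eye(p - 1)        # sub-diagonal of ones shifts the lags down

    w = rng.normal(size=n)
    y = np.zeros(n)                   # scalar AR(p) recursion
    ys = np.zeros(n)                  # first component of the companion recursion
    xi = np.zeros(p)                  # xi_{p-1} = (y_{p-1}, ..., y_0) = 0
    for t in range(p, n):
        y[t] = delta + phi @ y[t - p:t][::-1] + w[t]
        xi = np.r_[delta + w[t], np.zeros(p - 1)] + F @ xi
        ys[t] = xi[0]

    print(np.allclose(y, ys))         # True: the two forms agree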

  13. Proof sketch (cont.). So just like the original AR(1) we can expand out the
     autoregressive equation,

        ξ_t = δ + w_t + F ξ_{t−1}
            = δ + w_t + F (δ + w_{t−1}) + F² (δ + w_{t−2}) + ⋯ + F^{t−1} (δ + w_1) + F^t (δ + w_0)
            = (∑_{i=0}^{t} F^i) δ + ∑_{i=0}^{t} F^i w_{t−i}

     and therefore we need lim_{t→∞} F^t → 0.

  14-15. Proof sketch (cont.). We can find the eigen decomposition such that F = Q Λ Q⁻¹,
     where the columns of Q are the eigenvectors of F and Λ is a diagonal matrix of the
     corresponding eigenvalues. A useful property of the eigen decomposition is that

        F^i = Q Λ^i Q⁻¹

     Using this property we can rewrite our equation from the previous slide as

        ξ_t = (∑_{i=0}^{t} F^i) δ + ∑_{i=0}^{t} F^i w_{t−i}
            = (∑_{i=0}^{t} Q Λ^i Q⁻¹) δ + ∑_{i=0}^{t} Q Λ^i Q⁻¹ w_{t−i}
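
A minimal sketch (not from the slides, Python): numerically confirm F^i = Q Λ^i Q⁻¹ for the companion matrix of a hypothetical AR(2).

    import numpy as np

    phi1, phi2 = 0.5, 0.3
    F = np.array([[phi1, phi2],
                  [1.0,  0.0]])

    lam, Q = np.linalg.eig(F)                # eigenvalues and eigenvectors of F
    i = 5
    lhs = np.linalg.matrix_power(F, i)
    rhs = Q @ np.diag(lam ** i) @ np.linalg.inv(Q)
    print(np.allclose(lhs, rhs))             # True (up to floating point error)
    print(np.abs(lam))                       # both < 1 here, so F^t -> 0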

  16. Proof sketch (cont.). Since Λ is diagonal,

        Λ^i = diag(λ_1^i, λ_2^i, …, λ_p^i)

     Therefore, lim_{t→∞} F^t → 0 when lim_{t→∞} Λ^t → 0, which requires that |λ_i| < 1
     for all i.

  17-18. Proof sketch (cont.). Eigenvalues are defined such that for λ,

        det(F − λ I) = 0

     Based on our definition of F our eigenvalues will therefore be the roots of

        λ^p − ϕ_1 λ^{p−1} − ϕ_2 λ^{p−2} − ⋯ − ϕ_{p−1} λ − ϕ_p = 0

     which, if we multiply by 1/λ^p where L = 1/λ, gives

        1 − ϕ_1 L − ϕ_2 L² − ⋯ − ϕ_{p−1} L^{p−1} − ϕ_p L^p = 0

     So requiring |λ_i| < 1 for all i is the same as requiring every root L = 1/λ_i of the
     characteristic polynomial to lie outside the unit circle, the stationarity condition
     stated earlier.
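
A minimal sketch (not from the slides, Python): verify numerically that the eigenvalues of the companion matrix F are the reciprocals of the roots of the lag polynomial 1 − ϕ_1 L − ϕ_2 L² for a hypothetical AR(2).

    import numpy as np

    phi1, phi2 = 0.5, 0.3
    F = np.array([[phi1, phi2],
                  [1.0,  0.0]])
    eigvals = np.linalg.eigvals(F)
    roots = np.roots([-phi2, -phi1, 1.0])    # roots of 1 - phi_1 L - phi_2 L^2
    print(np.sort(np.abs(eigvals)))          # approx [0.352, 0.852]
    print(np.sort(np.abs(1 / roots)))        # same values: lambda = 1 / L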

  19. Properties of AR(p). For a stationary AR(p) process where w_t has E(w_t) = 0 and
     Var(w_t) = σ_w²,

        E(Y_t) = δ / (1 − ϕ_1 − ϕ_2 − ⋯ − ϕ_p)
        Var(Y_t) = γ_0 = ϕ_1 γ_1 + ϕ_2 γ_2 + ⋯ + ϕ_p γ_p + σ_w²
        Cov(Y_t, Y_{t−j}) = γ_j = ϕ_1 γ_{j−1} + ϕ_2 γ_{j−2} + ⋯ + ϕ_p γ_{j−p}
        Corr(Y_t, Y_{t−j}) = ρ_j = ϕ_1 ρ_{j−1} + ϕ_2 ρ_{j−2} + ⋯ + ϕ_p ρ_{j−p}
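
A minimal sketch (not from the slides, Python, made-up coefficients): simulate a long AR(2) path and compare the sample mean with δ / (1 − ϕ_1 − ϕ_2), and the lag-1 sample autocorrelation with the value implied by the recursion above (ρ_1 = ϕ_1 ρ_0 + ϕ_2 ρ_{−1} = ϕ_1 + ϕ_2 ρ_1, so ρ_1 = ϕ_1 / (1 − ϕ_2)).

    import numpy as np

    rng = np.random.default_rng(42)
    phi1, phi2, delta, n = 0.5, 0.3, 2.0, 200_000
    w = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = delta + phi1 * y[t - 1] + phi2 * y[t - 2] + w[t]
    y = y[1000:]                                   # drop burn-in

    print(y.mean(), delta / (1 - phi1 - phi2))     # both close to 10
    yc = y - y.mean()
    rho1 = (yc[1:] * yc[:-1]).mean() / yc.var()
    print(rho1, phi1 / (1 - phi2))                 # both close to 0.714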

  20. Moving Average (MA) Processes

  21. MA(1). A moving average process is similar to an AR process, except that the
     autoregression is on the error term.

        MA(1): y_t = δ + w_t + θ w_{t−1}

     Properties:
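
The property list on this slide did not survive extraction; the standard MA(1) results are E(y_t) = δ, Var(y_t) = σ_w² (1 + θ²), ρ_1 = θ / (1 + θ²), and ρ_j = 0 for j > 1. A minimal simulation sketch (not from the slides, Python, made-up θ) checking the variance and lag-1 autocorrelation:

    import numpy as np

    rng = np.random.default_rng(0)
    theta, delta, n = 0.8, 1.0, 200_000
    w = rng.normal(size=n + 1)                     # sigma_w = 1
    y = delta + w[1:] + theta * w[:-1]             # y_t = delta + w_t + theta w_{t-1}

    yc = y - y.mean()
    rho1 = (yc[1:] * yc[:-1]).mean() / yc.var()
    print(y.var(), 1 + theta ** 2)                 # both close to 1.64
    print(rho1, theta / (1 + theta ** 2))          # both close to 0.488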

  22. Time series (figure)

  23. ACF (figure: sample ACF plots for MA(1) processes with θ = −0.1, 0.1, −0.8, 0.8, −2.0, 2.0; x-axis Lag, lags 0 to 10)

  24. MA(q).

        MA(q): y_t = δ + w_t + θ_1 w_{t−1} + θ_2 w_{t−2} + ⋯ + θ_q w_{t−q}

     Properties:
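
A minimal sketch (not from the slides, Python, made-up coefficients): the autocorrelations of an MA(q) process are zero beyond lag q, which shows up clearly in a simulated sample ACF (here q = 3, matching one of the θ sets plotted two slides below).

    import numpy as np

    rng = np.random.default_rng(0)
    theta = np.array([-1.5, -1.0, 2.0])            # theta_1, theta_2, theta_3
    q, n = len(theta), 200_000
    w = rng.normal(size=n + q)
    y = np.convolve(w, np.r_[1.0, theta], mode="valid")  # y_t = w_t + sum_i theta_i w_{t-i}

    yc = y - y.mean()
    acf = np.array([(yc[k:] * yc[:len(yc) - k]).mean() / yc.var() for k in range(7)])
    print(np.round(acf, 3))   # clearly nonzero through lag 3, approximately 0 after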

  25. Time series (figure)

  26. ACF (figure: sample ACF plots for MA(q) processes with θ = {−1.5}, {−1.5, −1}, {−1.5, −1, 2}, {−1.5, −1, 2, 3}; x-axis Lag, lags 0 to 10)

  27. ARMA Model

  28. ARMA Model. An ARMA model is a composite of AR and MA processes,

        ARMA(p, q): y_t = δ + ϕ_1 y_{t−1} + ⋯ + ϕ_p y_{t−p} + w_t + θ_1 w_{t−1} + ⋯ + θ_q w_{t−q}
        ϕ_p(L) y_t = δ + θ_q(L) w_t

     Since all MA processes are stationary, we only need to examine the AR aspect to
     determine stationarity (roots of ϕ_p(L) lie outside the complex unit circle).
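
A minimal sketch (not from the slides, Python, made-up coefficients): simulate an ARMA(1,1) and confirm stationarity by checking the single root of the AR lag polynomial ϕ_1(L) = 1 − ϕ L.

    import numpy as np

    rng = np.random.default_rng(7)
    phi, theta, delta, n = 0.9, 0.5, 0.0, 100_000

    # Stationarity depends only on the AR part: the root of 1 - phi L is 1/phi.
    print(abs(1 / phi) > 1)                        # True -> stationary

    w = rng.normal(size=n + 1)
    y = np.zeros(n + 1)
    for t in range(1, n + 1):
        y[t] = delta + phi * y[t - 1] + w[t] + theta * w[t - 1]
    y = y[1:]

    yc = y - y.mean()
    acf = [(yc[k:] * yc[:len(yc) - k]).mean() / yc.var() for k in range(4)]
    print(np.round(acf, 2))                        # geometric-looking decay after lag 1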

  29. Time series (figure)

  30. ACF (figure: sample ACF plots for ARMA combinations of φ = {0.9}, {−0.9}, or none with θ = {0.9}, {−0.9}, or none, including the pure AR and pure MA cases; x-axis Lag, lags 0 to 10)
