Vector Autoregressive Modelling: A brief introduction to the vars package




1. Vector Autoregressive Modelling: A brief introduction to the vars package
Jilber Urbina
R User Group Barcelona, May 2012

2. Outline
1 Introduction to VAR models
2 Estimation
3 Restricted VARs
4 Diagnostic testing
5 Causality Analysis
6 Forecasting
7 Impulse Response Function
8 Forecast Error Variance Decomposition
This presentation is entirely based on «Using the vars package», downloadable from http://rss.acs.unt.edu/Rdoc/library/vars/doc/vars.pdf

3. Introduction
VAR stands for Vector Autoregressive, a natural extension of the univariate autoregressive (AR) model. In its basic form, a VAR consists of a set of K endogenous variables y_t = (y_1t, ..., y_kt, ..., y_Kt)' for k = 1, ..., K. The VAR(p) process is then defined as
y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + u_t = sum_{i=1}^{p} A_i y_{t-i} + u_t
where the A_i are (K × K) coefficient matrices for i = 1, ..., p and u_t is a K-dimensional white noise process with time-invariant positive definite covariance matrix E(u_t u_t') = Σ_u.
A VAR process is stable if and only if
det(I_K − A_1 z − ... − A_p z^p) ≠ 0   for |z| ≤ 1,
i.e. the characteristic polynomial has no root on or inside the complex unit circle; equivalently, all eigenvalues of the companion matrix have modulus less than one.
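
To make the stability condition concrete, here is a minimal base R sketch for a hypothetical 2-variable VAR(1); the coefficient matrix A1 is made up for illustration and is not part of the original slides.
A1 <- matrix(c(0.5, 0.1,
               0.2, 0.3), nrow = 2, byrow = TRUE)  # made-up VAR(1) coefficient matrix
# For p = 1 the companion matrix is A1 itself; stability requires that all
# of its eigenvalues have modulus strictly below one.
Mod(eigen(A1)$values)            # approx. 0.57 and 0.23
all(Mod(eigen(A1)$values) < 1)   # TRUE, so this VAR(1) would be stable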

4. Data
Before estimating the VAR model, a few words about the data set we are going to work with.
Data set: Canada consists of Canadian macroeconomic time series published by the OECD. The sample range is from the first quarter of 1980 until the fourth quarter of 2000.
Variables:
e: Employment
prod: Productivity
rw: Real Wage
U: Unemployment Rate

5. Data
Canada is available from the vars package. In order to access the data we have to execute the following lines:
install.packages("vars")
library(vars)
data(Canada)
?Canada
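
Before modelling, it can help to take a quick look at the series. The following lines are not on the original slides; they are just a standard first inspection of the Canada data.
head(Canada)      # first observations of the quarterly series
summary(Canada)   # summary statistics for e, prod, rw and U
plot(Canada)      # time series plot, one panel per variable, 1980Q1-2000Q4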

6. Estimation
Empirically speaking, the first step when dealing with VARs is to determine the lag length (p) to be included in the estimation process. This is a decision problem whose solution is based on an information criterion. An optimal lag order can be determined according to an information criterion or the final prediction error of a VAR(p) with the function VARselect().
> args(VARselect)
function (y, lag.max = 10, type = c("const", "trend", "both", "none"))

7. Lag length selection
> VARselect(Canada, lag.max = 5, type = "const")
$selection
AIC(n)  HQ(n)  SC(n) FPE(n)
     3      2      2      3

$criteria
                  1           2            3            4            5
AIC(n) -5.919117819 -6.45220283 -6.499021907 -6.247207996 -6.027766024
HQ(n)  -5.726859935 -6.06768706 -5.922248254 -5.478176460 -5.066476603
SC(n)  -5.439229647 -5.49242648 -5.059357389 -4.327655306 -3.628325161
FPE(n)  0.002976003  0.00175206  0.001685528  0.002201523  0.002811116

We'll work with 2 lags.

8. Estimation
Now we are ready to estimate the four-variable second-order VAR, VAR(2). The VAR() function estimates the model; its arguments are:
> args(VAR)
function (y, p = 1, type = c("const", "trend", "both", "none"))
The VAR(2) is estimated with the function VAR(), with a constant included as deterministic regressor:
> var.2c <- VAR(Canada, p = 2, type = "const")

9. Summary
The function VAR() returns a list object of class varest, and the list elements are explained in detail in the function's help file. Let us now focus on two methods, namely summary and plot. The summary method simply applies the summary.lm method to the lm objects contained in varresult.
> summary(var.2c)
VAR Estimation Results:
=========================
Endogenous variables: e, prod, rw, U
Deterministic variables: const
Sample size: 82
Log Likelihood: -175.819
Roots of the characteristic polynomial:
0.995 0.9081 0.9081 0.7381 0.7381 0.1856 0.1429 0.1429
Call:
VAR(y = Canada, p = 2, type = "const")
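
Since each equation of the VAR is stored as an ordinary lm fit in the varresult element, the usual extractor functions work on them. A short illustration, not taken from the original slides:
names(var.2c$varresult)                 # "e" "prod" "rw" "U"
coef(var.2c$varresult$prod)             # coefficients of the prod equation
head(residuals(var.2c$varresult$prod))  # residuals, reused in the diagnostics below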

10. Estimation: Summary
Estimation results for equation prod:
=====================================
prod = e.l1 + prod.l1 + rw.l1 + U.l1 + e.l2 + prod.l2 + rw.l2 + U.l2 + const

           Estimate Std. Error t value Pr(>|t|)
e.l1       -0.17277    0.26977  -0.640  0.52390
prod.l1     1.15043    0.10995  10.464 3.57e-16 ***
rw.l1       0.05130    0.09934   0.516  0.60710
U.l1       -0.47850    0.36470  -1.312  0.19362
e.l2        0.38526    0.25688   1.343  0.18346
prod.l2    -0.17241    0.11881  -1.451  0.15104
rw.l2      -0.11885    0.09985  -1.190  0.23778
U.l2        1.01592    0.37285   2.725  0.00808 **
const    -166.77552  100.43388  -1.661  0.10109

Residual standard error: 0.6525 on 73 degrees of freedom
Multiple R-Squared: 0.9787, Adjusted R-squared: 0.9764
F-statistic: 419.3 on 8 and 73 DF, p-value: < 2.2e-16

11. Plot
plot(var.2c)
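
The plot method draws, for each equation, a diagram of fit together with the residuals and their autocorrelations, so with four variables it produces four diagnostic plots. To look at a single equation, the names argument can be used, as in the following sketch (not on the original slides):
plot(var.2c, names = "prod")  # restrict the output to the prod equation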

12. roots(): VAR stability
As noted before, a VAR model is stable (stationary) if and only if the eigenvalues of its companion matrix all have modulus less than one. Their moduli are returned by roots():
> summary(var.2c)$roots
> roots(var.2c)
[1] 0.9950338 0.9081062 0.9081062 0.7380565 0.7380565 0.1856381 0.1428889
[8] 0.1428889
> all(roots(var.2c) <= 1)
[1] TRUE
Although the first eigenvalue is pretty close to unity, for the sake of simplicity we assume a stable VAR(2) process with a constant as deterministic regressor. Therefore, the estimated VAR model is stable and hence stationary.

13. restrict(): Restricted VARs
It is obvious that not all regressors enter significantly. With the function restrict() the user has the option to re-estimate the VAR either by significance (argument method = "ser") or by imposing zero restrictions manually (argument method = "manual"). In the former case, each equation is re-estimated separately as long as there are t-values that are in absolute value below the threshold value set by the function's argument thresh. In the latter case, a restriction matrix consisting of 0/1 values has to be provided, thereby selecting the coefficients to be retained in the model (a manual example is sketched below). The function's arguments are therefore:
> args(restrict)
function (x, method = c("ser", "manual"), thresh = 2, resmat = NULL)
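
As a sketch of the manual method (the restriction matrix below is a made-up example, not taken from the slides): rows correspond to the equations e, prod, rw, U and columns to the regressors in the order e.l1, prod.l1, rw.l1, U.l1, e.l2, prod.l2, rw.l2, U.l2, const.
res <- matrix(1, nrow = 4, ncol = 9)   # start by keeping every coefficient
res[, 7] <- 0                          # drop rw.l2 from all four equations
var.2c.man <- restrict(var.2c, method = "manual", resmat = res)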

14. restrict(): Restricted VARs
Let's re-estimate the model leaving out all coefficients that are not significant at the 5% level (see the sketch below).
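
The slide's own code and output are not reproduced in this extract. With the default threshold of 2 (roughly the 5% level) the call would presumably look like the sketch below; the object name var.2c.ser is an assumption.
var.2c.ser <- restrict(var.2c, method = "ser", thresh = 2.0)  # drop regressors with |t| < 2
summary(var.2c.ser)   # the dropped coefficients no longer appear in the equations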

15. B(): Coefficients of the restricted VAR
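
The coefficient listing itself is missing from this extract. In recent versions of the vars package the coefficient matrix of a (restricted) VAR can be retrieved with Bcoef(); the sketch below assumes the restricted object var.2c.ser from the previous step.
Bcoef(var.2c.ser)   # coefficient matrix of the restricted VAR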

16. Normality
Jarque-Bera test for the univariate case:
JB = (n/6) [ s² + (k − 3)² / 4 ] ~ χ²(2)
where s is the skewness, k stands for the kurtosis, and n is the sample size.
Jarque-Bera test for the multivariate case:
JB_mv = s3² + s4² ~ χ²(2K)
where s3² and s4² are computed according to
s3² = T b3' b3 / 6
s4² = T (b4 − 3_K)' (b4 − 3_K) / 24
with b3 and b4 the third and fourth non-central moment vectors of the standardized residuals û_t^s = P̃⁻¹ (û_t − ū_t), and P̃ the lower triangular Choleski decomposition of the residual covariance matrix.
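
To connect the univariate formula to the package output on the next slides, the following by-hand sketch computes JB for the residuals of the e equation; it follows the formula above and may differ slightly from the package's internal computation.
u <- residuals(var.2c$varresult$e)   # residuals of the e equation
n <- length(u)
z <- (u - mean(u)) / sd(u)           # standardized residuals
s <- mean(z^3)                       # skewness
k <- mean(z^4)                       # kurtosis
JB <- n / 6 * (s^2 + (k - 3)^2 / 4)
JB                                   # should be close to the Chi-squared value reported below
1 - pchisq(JB, df = 2)               # asymptotic p-value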

17. normality.test(): Jarque-Bera test for normality. Univariate
In the vars package the functions for diagnostic testing are normality.test(), serial.test() and stability().
> args(normality.test)
function (x, multivariate.only = TRUE)
> normality.test(var.2c, multivariate.only = FALSE)
$e
JB-Test (univariate)
data: Residual of e equation
Chi-squared = 0.1535, df = 2, p-value = 0.9261

$prod
JB-Test (univariate)
data: Residual of prod equation
Chi-squared = 4.2651, df = 2, p-value = 0.1185

18. normality.test(): Jarque-Bera test for normality. Univariate
$rw
JB-Test (univariate)
data: Residual of rw equation
Chi-squared = 0.3348, df = 2, p-value = 0.8459

$U
JB-Test (univariate)
data: Residual of U equation
Chi-squared = 0.5664, df = 2, p-value = 0.7534

19. normality.test(): Jarque-Bera test for normality. Multivariate
> normality.test(var.2c, multivariate.only = TRUE)
$JB
JB-Test (multivariate)
data: Residuals of VAR object var.2c
Chi-squared = 5.094, df = 8, p-value = 0.7475

20. Serial autocorrelation
To test for multivariate serial correlation we use the Breusch-Godfrey LM statistic, which is based upon the following auxiliary regression:
û_t = A_1 y_{t−1} + ... + A_p y_{t−p} + B_1 û_{t−1} + ... + B_h û_{t−h} + ε_t
The null hypothesis is H0: B_1 = ... = B_h = 0 against H1: B_i ≠ 0 for at least one i = 1, 2, ..., h. The test statistic is defined as
LM_h = T ( K − tr( Σ̃_R⁻¹ Σ̃_e ) ) ~ χ²(hK²)
where Σ̃_R and Σ̃_e denote the residual covariance matrices of the restricted and unrestricted model, respectively.
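
In the vars package this LM test is available through serial.test() with type = "BG". A short sketch (the number of lags, lags.bg = 5, is just an illustrative choice, not taken from the slides):
serial.test(var.2c, lags.bg = 5, type = "BG")   # Breusch-Godfrey LM test up to lag 5
# The default, type = "PT.asymptotic", would return the Portmanteau test instead.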
