  1. Summary Magnus Wiktorsson

  2. Stylized facts in returns
     ▶ No autocorrelation in returns
     ▶ Unconditional heavy tails
     ▶ Gain/loss asymmetry
     ▶ Aggregational Gaussianity
     ▶ Volatility clustering
     ▶ Conditional heavy tails
     ▶ Significant autocorrelation for abs. returns - long range dependence? (checked in the sketch below)
     ▶ Leverage effects
     ▶ Volume/volatility correlation
     ▶ Asymmetry in time scales
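
A minimal MATLAB sketch of how two of these facts can be checked empirically: the sample ACF of the returns themselves should be negligible, while the ACF of absolute returns decays slowly. The series r below is an iid placeholder (it will not show the effects); replace it with real log-returns.

  % Sketch: sample ACF of returns vs. absolute returns (placeholder data)
  T = 2000;
  r = randn(T,1);                     % placeholder; use real log-returns here
  maxLag = 50;
  acf1 = @(z,k) sum((z(1:end-k)-mean(z)).*(z(1+k:end)-mean(z)))/sum((z-mean(z)).^2);
  rho_r   = arrayfun(@(k) acf1(r,k),      1:maxLag);   % ACF of returns
  rho_abs = arrayfun(@(k) acf1(abs(r),k), 1:maxLag);   % ACF of |returns|
  subplot(2,1,1), stem(rho_r),   title('ACF of returns')
  subplot(2,1,2), stem(rho_abs), title('ACF of absolute returns')
  % approximate 95% band for an iid series: +/- 1.96/sqrt(T)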

  3. Linear Gaussian models
     ▶ Model: X_t + a_1 X_{t-1} + ... + a_p X_{t-p} = e_t + c_1 e_{t-1} + ... + c_q e_{t-q}
     ▶ Properties
     ▶ Estimation (OLS/LS2/MLE) (OLS sketch below)
     ▶ Identification and model validation
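
To make the estimation bullet concrete, a small sketch (my own illustration, base MATLAB only) that simulates an AR(2) process and recovers the coefficients by OLS, i.e. regressing X_t on its two lags. Note that the regression form X_t = a_1 X_{t-1} + a_2 X_{t-2} + e_t has the opposite sign convention to the model written above.

  % Sketch: simulate an AR(2) and estimate its coefficients by OLS
  a = [0.5 -0.3];                     % true coefficients in X_t = a1 X_{t-1} + a2 X_{t-2} + e_t
  T = 1000;
  X = filter(1, [1 -a], randn(T,1));  % simulate the AR(2)
  Y   = X(3:end);
  Phi = [X(2:end-1) X(1:end-2)];      % lagged regressors
  a_hat = Phi \ Y;                    % OLS estimate of [a1; a2]
  disp(a_hat')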

  4. Non-linear models
     ▶ Can generate many (too many?) new features
     ▶ Larger model space!
     ▶ Difficult to identify - use prior knowledge!
     ▶ Model selection via absolute tests (e.g. residuals)
     ▶ and relative tests (LR) or AIC/BIC (see the sketch below)
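
A sketch of the relative-test machinery for two nested models. The log-likelihoods and parameter counts below are hypothetical placeholders, not results from the course.

  % Sketch: LR test and AIC/BIC for two nested models (placeholder numbers)
  logL1 = -1520.3; k1 = 2;               % smaller model: log-likelihood, #parameters
  logL2 = -1512.8; k2 = 4;               % larger model
  T = 1000;                              % number of observations
  LR   = 2*(logL2 - logL1);              % LR statistic, chi2(k2-k1) under H0
  pval = 1 - gammainc(LR/2,(k2-k1)/2);   % chi-square tail prob. via gammainc (no toolbox needed)
  AIC  = -2*[logL1 logL2] + 2*[k1 k2];
  BIC  = -2*[logL1 logL2] + log(T)*[k1 k2];
  fprintf('LR = %.2f (p = %.3f)\nAIC: %.1f %.1f\nBIC: %.1f %.1f\n', LR, pval, AIC, BIC)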

  5. Variance models
     ▶ Need to transform data
     ▶ GARCH family (simulation sketch below)
     ▶ GAS models
     ▶ Stochastic volatility
     ▶ Realized volatility/quadratic variation
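
A sketch of the first family on the list: simulating a GARCH(1,1) process (parameter values are hypothetical) to see the volatility clustering it produces.

  % Sketch: simulate a GARCH(1,1):  r_t = sqrt(s2_t)*z_t,
  %         s2_t = omega + alpha*r_{t-1}^2 + beta*s2_{t-1}
  omega = 0.05; alpha = 0.1; beta = 0.85;    % hypothetical values, alpha + beta < 1
  T = 2000;
  z  = randn(T,1);
  r  = zeros(T,1);  s2 = zeros(T,1);
  s2(1) = omega/(1-alpha-beta);              % start at the unconditional variance
  r(1)  = sqrt(s2(1))*z(1);
  for t = 2:T
      s2(t) = omega + alpha*r(t-1)^2 + beta*s2(t-1);
      r(t)  = sqrt(s2(t))*z(t);
  end
  plot(r), title('Simulated GARCH(1,1) returns')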

  6. Continuous time I
     We restricted most of the course to continuous semimartingales (no jumps).
     ▶ Itô calculus
     ▶ Valuation using the RNVF, π_t = p(t,T) E^Q[ φ(S_T) | F_t ] (Monte Carlo sketch below).
     ▶ Connections to PDEs.
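
As a concrete instance of the RNVF, a Monte Carlo sketch pricing a European call under Black-Scholes dynamics, where p(t,T) = exp(-r(T-t)) and S_T is simulated under Q. The contract and parameter values are hypothetical.

  % Sketch: pi_0 = exp(-rf*Tmat) * E^Q[ max(S_T - K, 0) ] by Monte Carlo (Black-Scholes)
  S0 = 100; K = 100; rf = 0.03; sigma = 0.2; Tmat = 1;        % hypothetical inputs
  N  = 1e6;
  Z  = randn(N,1);
  ST = S0*exp((rf - 0.5*sigma^2)*Tmat + sigma*sqrt(Tmat)*Z);  % S_T under Q
  price = exp(-rf*Tmat)*mean(max(ST - K, 0));                 % p(0,T) * E^Q[phi(S_T)]
  fprintf('MC call price: %.3f\n', price)
  % Compare with blsprice(S0,K,rf,Tmat,sigma) if the Financial Toolbox is available.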

  7. Continuous time II
     Estimation of parameters for SDEs
     ▶ Likelihood function generally unknown.
     ▶ Likelihood approximations (several methods; one sketched below)
     ▶ GMM
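
One of the likelihood-approximation methods, sketched on a simulated Ornstein-Uhlenbeck process dX = kappa*(mu - X) dt + sigma dW: the Euler scheme gives Gaussian transition densities whose pseudo-likelihood can be maximized numerically. The model and numbers are my own illustration; normpdf and fminsearch are assumed available, as in the filter examples below.

  % Sketch: Euler pseudo-likelihood for an OU process dX = kappa*(mu - X)dt + sigma dW
  dt = 1/252;
  kappa = 2; mu = 0.05; sigma = 0.1;       % true values used to simulate the data
  n = 2000;
  X = zeros(n,1); X(1) = mu;
  for t = 2:n                              % Euler-Maruyama simulation
      X(t) = X(t-1) + kappa*(mu - X(t-1))*dt + sigma*sqrt(dt)*randn;
  end
  % Euler approximation: X_{t+dt} | X_t ~ N(X_t + kappa*(mu - X_t)*dt, sigma^2*dt)
  negLL = @(p) -sum(log(normpdf(X(2:end), ...
            X(1:end-1) + p(1)*(p(2) - X(1:end-1))*dt, abs(p(3))*sqrt(dt))));
  p_hat = fminsearch(negLL, [1 0 0.2]);    % estimates of [kappa mu sigma]
  disp(p_hat)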

  8. Partially observed models
     In continuous or discrete time
     ▶ Linear methods - Kalman filters
     ▶ Approximate non-linear methods - EKFs, UKFs, IEKFs, ...
     ▶ Monte Carlo methods - particle filters

  9. Ex: 1D Linear Model in MATLAB
  % Model
  % y = c*x + eta
  % x = a*x + e
  a=0.8; c=0.9; T=100; q=0.8; r=0.5;
  x=filter(1,[1 -a],sqrt(q)*randn(T,1));   % simulate the AR(1) state
  y=c*x+sqrt(r)*randn(T,1);                % noisy observations
  plot(1:T,x,1:T,y,'o')

  10. Ex: Kalman filter in MATLAB
  % Kalman filter
  mf=0; Pf=1;                              % initial filter mean and variance
  for t=1:T
    % prediction
    mpr=a*mf;
    Ppr=a*Pf*a'+q;
    % update
    KalmanGain=Ppr*c'*inv(c*Ppr*c'+r);
    mf=mpr+KalmanGain*(y(t)-c*mpr);
    Pf=(eye(size(Pf))-KalmanGain*c)*Ppr;
    E_KF(t)=mf;                            % store the filtered mean
  end
  plot(1:T,E_KF,'kp')

  11. Ex: Particle filter in MATLAB
  % Particle filter (bootstrap filter: propagate, weight, resample)
  K=10;
  % initialize
  xif=randn(K,1);
  for t=1:T
    % prediction
    xipr=a*xif+sqrt(q)*randn(K,1);
    % update: likelihood weights
    lambdat=normpdf(y(t)*ones(K,1),c*xipr,sqrt(r));
    % resampling
    I=randsample(1:K,K,true,lambdat);
    xif=xipr(I);
    E_PF(t)=mean(xif);
  end
  plot(1:T,E_PF,'g--')
