Lecture 4, Non-linear Time Series



  1. Lecture 4, Non-linear Time Series Erik Lindström

  3. “It’s not a bug, it’s a feature!” ◮ Why are we using linear models? ◮ Properties ◮ Limitations ◮ Properties of non-linear systems. ◮ Limit cycles ◮ Jumps ◮ Non-symmetric distributions ◮ Bifurcations ◮ Chaos ◮ Non-linear dependence

  5. General properties ◮ Assume causal system f(Y_n, Y_{n−1}, …, Y_1) = ε_n ◮ Invertible system Y_n = f*(ε_n, …, ε_1) ◮ Volterra series. Suppose that f* is sufficiently well-behaved; then there exists a sequence of bounded functions with Σ_{k=0}^∞ |ψ_k| < ∞, Σ_{k=0}^∞ Σ_{l=0}^∞ |ψ_{kl}| < ∞, Σ_{k=0}^∞ Σ_{l=0}^∞ Σ_{m=0}^∞ |ψ_{klm}| < ∞, …

  8. Volterra series Approximate the general model by
  Y_t = μ + Σ_{k=0}^∞ ψ_k ε_{t−k} + Σ_{k=0}^∞ Σ_{l=0}^∞ ψ_{kl} ε_{t−k} ε_{t−l} + Σ_{k=0}^∞ Σ_{l=0}^∞ Σ_{m=0}^∞ ψ_{klm} ε_{t−k} ε_{t−l} ε_{t−m} + …  (1)
  where
  μ = f*(0), ψ_k = ∂f*/∂ε_{t−k}, ψ_{kl} = ∂²f*/(∂ε_{t−k} ∂ε_{t−l}), …  (2)
  This results in generalized transfer functions. NOTE that superposition is lost! These transfer functions do not care whether {ε} is deterministic or stochastic!
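The loss of superposition can be checked numerically with a truncated second-order Volterra expansion of Eq. (1). The kernels ψ_k, ψ_kl, the lag depth and the noise below are illustrative choices, not values from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def volterra2(eps, mu, psi1, psi2):
    """Truncated second-order Volterra expansion of Eq. (1):
    Y_t = mu + sum_k psi1[k] eps_{t-k} + sum_{k,l} psi2[k,l] eps_{t-k} eps_{t-l}."""
    K = len(psi1)
    y = np.full(len(eps), mu, dtype=float)
    for t in range(K, len(eps)):
        past = eps[t - K + 1:t + 1][::-1]   # eps_t, eps_{t-1}, ..., eps_{t-K+1}
        y[t] += psi1 @ past + past @ psi2 @ past
    return y

# Illustrative kernels (assumed for the demo)
psi1 = np.array([0.5, 0.3])
psi2 = np.array([[0.1, 0.0], [0.2, 0.05]])

e1 = rng.normal(size=500)
e2 = rng.normal(size=500)

# Superposition is lost: the response to e1 + e2 differs from the
# sum of the individual responses, because of the quadratic term.
lhs = volterra2(e1 + e2, 0.0, psi1, psi2)
rhs = volterra2(e1, 0.0, psi1, psi2) + volterra2(e2, 0.0, psi1, psi2)
print(np.allclose(lhs, rhs))  # False
```

Setting psi2 to zeros recovers a linear MA filter, for which the two responses coincide.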

  12. Frequency doubling Now assume that we introduce a spectral representation of the noise. ◮ Let’s start with a single frequency, ε_k = A exp(iω*k) ◮ This results in frequency doubling ◮ Proof by inserting the signal in Eq (2). ◮ Question: What happens with a non-linear system if the noise ε_k is white noise? ◮ Conclusion: Black box non-linear system identification is far more complicated than linear system identification.
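The doubling effect can be verified numerically: pass a single sinusoid (the real-valued analogue of ε_k = A exp(iω*k)) through a purely quadratic, memoryless non-linearity and inspect the spectrum. Since cos²(x) = 1/2 + cos(2x)/2, the output peak sits at twice the input frequency:

```python
import numpy as np

n = 1024
k = np.arange(n)
omega_bin = 64                          # input frequency, in DFT bins
eps = np.cos(2 * np.pi * omega_bin * k / n)

y = eps**2                              # quadratic non-linearity

spec = np.abs(np.fft.rfft(y))
peak = spec[1:].argmax() + 1            # skip the DC component at bin 0
print(peak)                             # 128, i.e. 2 * omega_bin
```

The same experiment with white-noise input spreads energy across all frequencies, which is why black-box identification is hard.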

  13. Regime models The model is generated from a set of simple models ◮ SETAR ◮ STAR ◮ HMM

  14. SETAR - Self-Exciting Threshold AR The SETAR(l; d; k_1, k_2, …, k_l) model is given by
  Y_t = a_0^{(J_t)} + Σ_{i=1}^{k_{J_t}} a_i^{(J_t)} Y_{t−i} + ε_t^{(J_t)}  (3)
  where the index J_t is described by
  J_t = 1 for Y_{t−d} ∈ R_1, 2 for Y_{t−d} ∈ R_2, …, l for Y_{t−d} ∈ R_l.  (4)
  NOTE that it is difficult to estimate the boundaries for the regimes.

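A minimal SETAR(2; 1; 1, 1) simulation illustrates Eqs. (3)-(4); the two AR(1) regimes, the threshold at zero and the coefficients are assumed for the sketch, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two AR(1) regimes selected by the sign of Y_{t-1} (d = 1):
# R_1 = (-inf, 0), R_2 = [0, inf). Coefficients are illustrative.
a0 = {1: 0.5, 2: -0.5}   # regime intercepts a_0^{(J_t)}
a1 = {1: 0.8, 2: 0.3}    # regime AR(1) coefficients a_1^{(J_t)}

n = 1000
y = np.zeros(n)
for t in range(1, n):
    J = 1 if y[t - 1] < 0 else 2          # regime index J_t, Eq. (4)
    y[t] = a0[J] + a1[J] * y[t - 1] + rng.normal()   # Eq. (3)
```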
  16. SETARMA ◮ Similar ideas can be included in ARMA models, leading to SETARMA models. ◮ Often easy to add ’asymmetric’ terms in the AR or MA polynomials, e.g.
  y_n + a_1 y_{n−1} = e_n + (c_1 + c′_1 1{e_{n−1} ≤ 0}) e_{n−1}
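The asymmetric MA term above is straightforward to simulate; the numeric coefficients below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# y_n + a1*y_{n-1} = e_n + (c1 + c1p * 1{e_{n-1} <= 0}) * e_{n-1}
# with the MA coefficient depending on the sign of the previous shock.
a1, c1, c1p = -0.5, 0.4, 0.3     # illustrative values
n = 500
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    ma = (c1 + c1p * (e[t - 1] <= 0)) * e[t - 1]   # asymmetric MA term
    y[t] = -a1 * y[t - 1] + e[t] + ma
```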

  17. STAR - Smooth Threshold AR The STAR(k) model:
  Y_t = a_0 + Σ_{j=1}^k a_j Y_{t−j} + (b_0 + Σ_{j=1}^k b_j Y_{t−j}) G(Y_{t−d}) + ε_t  (5)
  where G(Y_{t−d}) now is the transition function lying between zero and one, as for instance the standard Gaussian distribution. In the literature two specifications for G(·) are commonly considered, namely the logistic and exponential functions:
  G(y) = (1 + exp(−γ_L(y − c_L)))^{−1}; γ_L > 0  (6)
  G(y) = 1 − exp(−γ_E(y − c_E)²); γ_E > 0  (7)
  where γ_L and γ_E are transition parameters, c_L and c_E are threshold parameters (location parameters).
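A STAR(1) simulation using the logistic transition of Eq. (6) looks as follows; the transition parameters γ_L, c_L and the AR coefficients are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def G_logistic(y, gamma_L=2.0, c_L=0.0):
    """Logistic transition function, Eq. (6); parameters are illustrative."""
    return 1.0 / (1.0 + np.exp(-gamma_L * (y - c_L)))

# STAR(1) following Eq. (5) with d = 1 and illustrative coefficients
a0, a1, b0, b1 = 0.2, 0.5, -0.4, 0.3
n = 1000
y = np.zeros(n)
for t in range(1, n):
    g = G_logistic(y[t - 1])                 # smooth regime weight in [0, 1]
    y[t] = a0 + a1 * y[t - 1] + (b0 + b1 * y[t - 1]) * g + rng.normal()
```

Unlike SETAR, the model blends the two regimes continuously instead of switching at a hard threshold.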

  18. PJM electricity market

  19. Prices at the PJM market

  20. Simple model of the power market ◮ Demand D(Q) = a + bQ + c cos(2πt/50) + ε  (8) ◮ Supply S(Q) = α_0 + β_0 Q + G(Q, Q_break)(α_1 + β_1 (Q − Q_break)_+)  (9) where G is a transition function. ◮ Solve numerically for t = 1, … to get the quantity Q and price P.
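The numerical solve can be sketched as a bisection on excess demand. All parameter values, and the logistic form chosen for G, are assumptions for this sketch; the lecture gives no numbers:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative parameters (assumed)
a, b, c = 900.0, -5.0, 50.0                  # downward-sloping inverse demand
alpha0, beta0 = 100.0, 2.0                   # baseline inverse supply
alpha1, beta1, Q_break = 0.0, 20.0, 80.0     # steeper segment above Q_break

def G(Q, Qb, gamma=1.0):
    """Smooth transition in quantity (logistic form, an assumption)."""
    return 1.0 / (1.0 + np.exp(-gamma * (Q - Qb)))

def demand(Q, t, eps):
    return a + b * Q + c * np.cos(2 * np.pi * t / 50) + eps          # Eq. (8)

def supply(Q):
    return alpha0 + beta0 * Q + G(Q, Q_break) * (alpha1 + beta1 * max(Q - Q_break, 0.0))  # Eq. (9)

def equilibrium(t, eps, lo=0.0, hi=200.0):
    """Bisection on D(Q) - S(Q), which is decreasing in Q."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if demand(mid, t, eps) > supply(mid):
            lo = mid
        else:
            hi = mid
    Q = 0.5 * (lo + hi)
    return Q, supply(Q)      # market quantity and price

eps = rng.normal()
Q, P = equilibrium(t=1, eps=eps)
```

Repeating this for t = 1, 2, … produces a seasonal, non-Gaussian price series like the one in the figures below.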

  21. Supply and Demand Figure: Supply and demand curves (vary across the season) for our artificial market, showing the Supply, MaxDemand and MinDemand curves.

  22. Prices Figure: Note the seasonality as well as the non-Gaussian distribution.

  23. Distribution of prices Figure: Same property, shown as a normal probability plot of the prices.

  24. HMM - Hidden Markov Models Another alternative is to let the regime shift stochastically, as in the Hidden Markov Model. Let
  Y_t = a_0^{(J_t)} + Σ_{i=1}^{k_{J_t}} a_i^{(J_t)} Y_{t−i} + ε_t^{(J_t)}  (10)
  where the state variable J_t follows a latent Markov chain. NOTE that parameter estimation is slightly more complicated than before.

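Simulating Eq. (10) with a two-state latent chain is straightforward (estimation is the hard part). The transition matrix and regime parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-state Markov chain driving the AR(1) parameters of Eq. (10).
# All numeric values are illustrative.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])     # transition probabilities
a0 = [0.0, 1.0]                  # regime intercepts
a1 = [0.9, 0.2]                  # regime AR(1) coefficients
sigma = [0.5, 2.0]               # regime noise levels

n = 1000
J = np.zeros(n, dtype=int)       # latent regime path
y = np.zeros(n)
for t in range(1, n):
    J[t] = rng.choice(2, p=P[J[t - 1]])    # latent Markov chain J_t
    y[t] = a0[J[t]] + a1[J[t]] * y[t - 1] + sigma[J[t]] * rng.normal()
```

In contrast to SETAR, the regime here is not determined by past observations, which is why it must be filtered out during estimation.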
  26. Case: Electricity spot price (Regland & Lindström, 2012) The electricity spot price is very non-Gaussian. Figure: The electricity spot price (left) and spread, defined as the difference between the logarithm of the spot and the logarithm of the forward (right). Data from the German EEX market.

  27. ◮ The spread accounts for virtually all seasonality, but there are still bursts of volatility. ◮ The logarithm of the spot, y_t, was modeled using an HMM regime switching model with three states: a normal state with mean-reverting dynamics, a spike (upward jumps) state and a drop (downward jumps) state.

  28. This is mathematically given by:
  Δy^{(B)}_{t+1} = α(μ_t − y^{(B)}_t) + σε_t
  y^{(S)}_{t+1} = Z_{S,t} + μ_t, Z_S ∼ F(μ_S, σ_S)
  y^{(D)}_{t+1} = −Z_{D,t} + μ_t, Z_D ∼ F(μ_D, σ_D)
  where μ_t is approximately the logarithm of the month-ahead forward price. The regimes switch according to a Markov chain R_t ∈ {B, S, D} governed by the transition matrix
  Π = [ 1 − π_BS − π_BD, π_BS, π_BD ; π_SB, 1 − π_SB, 0 ; π_DB, 0, 1 − π_DB ].
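A simulation sketch of this three-state model, taking F to be Gaussian and using made-up probabilities and dynamics parameters (not the fitted values from Regland & Lindström, 2012):

```python
import numpy as np

rng = np.random.default_rng(6)

# States: 0 = B (base), 1 = S (spike), 2 = D (drop). All values illustrative.
pi_BS, pi_BD, pi_SB, pi_DB = 0.02, 0.02, 0.5, 0.5
Pi = np.array([[1 - pi_BS - pi_BD, pi_BS,     pi_BD],
               [pi_SB,             1 - pi_SB, 0.0],
               [pi_DB,             0.0,       1 - pi_DB]])

alpha, sigma = 0.2, 0.1          # mean reversion speed and base volatility
mu_S, sig_S = 0.5, 0.2           # spike jump size, Z_S ~ F(mu_S, sigma_S)
mu_D, sig_D = 0.5, 0.2           # drop jump size, Z_D ~ F(mu_D, sigma_D)
mu = 3.5                         # stand-in for the log month-ahead forward

n = 2000
R = np.zeros(n, dtype=int)
y = np.full(n, mu)
for t in range(n - 1):
    R[t + 1] = rng.choice(3, p=Pi[R[t]])
    if R[t + 1] == 0:            # base: mean reversion towards mu_t
        y[t + 1] = y[t] + alpha * (mu - y[t]) + sigma * rng.normal()
    elif R[t + 1] == 1:          # spike: upward jump relative to mu_t
        y[t + 1] = mu + rng.normal(mu_S, sig_S)
    else:                        # drop: downward jump relative to mu_t
        y[t + 1] = mu - rng.normal(mu_D, sig_D)
```

Note the zero entries in Π: a spike or drop returns to the base state rather than switching directly into the opposite jump state.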
