Hawkes Processes with Stochastic Excitations


1. Hawkes Processes with Stochastic Excitations
Young Lee∗, Kar Wai Lim†, Cheng Soon Ong†
∗ National ICT of Australia & London School of Economics
† National ICT of Australia & Australian National University
June 21, 2016

2. Outline
1 Motivation for Stochastic Hawkes
2 Simulation and Inference
3 Experimental Result
4 Summary

3. Motivation for Stochastic Hawkes: Background
Simple point processes: a sequence $(T_i)_i$ of non-negative random variables such that $T_i < T_{i+1}$, also known as random times.
Counting processes: given a simple point process $(T_i)_i$,
$$N(t) = \sum_{i > 0} \mathbf{1}_{T_i \le t}$$
is called the counting process associated with $T$.
Interarrival times: the process $\Delta$ defined by $\Delta_i = T_i - T_{i-1}$ is called the interarrival times associated with $T$.
Intensity process: the intensity process is defined as
$$\lambda(t) = \lim_{h \to 0} \frac{1}{h}\, \mathbb{E}\left[ N(t+h) - N(t) \mid \mathcal{F}_t \right].$$
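To make these definitions concrete, here is a minimal Python sketch (the event times and the convention $T_0 = 0$ for the first interarrival are made up for the example):

```python
import numpy as np

# Illustrative event times T_1 < T_2 < ... (values invented for the example)
event_times = np.array([0.4, 1.1, 2.7, 3.0, 5.2])

def counting_process(t, times):
    """N(t) = sum_i 1{T_i <= t}: the number of events observed by time t."""
    return int(np.searchsorted(times, t, side="right"))

# Interarrival times Delta_i = T_i - T_{i-1}, taking T_0 = 0 by convention here
interarrivals = np.diff(np.concatenate(([0.0], event_times)))

print(counting_process(3.0, event_times))  # 4 events occur by t = 3.0
print(interarrivals)                       # [0.4 0.7 1.6 0.3 2.2]
```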

4. Motivation for Stochastic Hawkes: Recap, Poisson → Hawkes → Stochastic Hawkes
$N_t$ denotes the number of arrivals or events of the process by time $t$.
$\lambda = \text{const.}$ (Poisson) does not take the history of events into account. However, if an arrival causes the intensity function to increase, then the process is said to be self-exciting (Hawkes process).
Hawkes flavour:
$$\lambda(t) = \hat{\lambda}_0(t) + \sum_{i \,:\, t > T_i} Y(T_i)\, \nu(t - T_i), \qquad (1)$$
where the function $\nu$ takes the form $\nu(z) = e^{-\delta z}$.
There exist different formulations for $Y$:
1 Constant: Hawkes (1971), Hawkes & Oakes (1974)
2 Random excitations: Brémaud & Massoulié (2002), Dassios & Zhao (2013)
3 Stochastic differential equations

5. Motivation for Stochastic Hawkes: Illustration of Stochastic Hawkes
Figure: a sample path of the intensity function $\lambda(\cdot)$ over event times $T_1, \dots, T_7$, with branching labels $Z_{10} = 1$, $Z_{20} = 1$, $Z_{32} = 1$ and excitation levels $Y_5$, $Y_6$. Note the variation of heights, with $\mathrm{Cov}(Y_5, Y_6) \neq 0$.

6. Motivation for Stochastic Hawkes: Our model
The intensity function is
$$\lambda(t) = \underbrace{\hat{\lambda}_0(t)}_{\text{base intensity}} + \sum_{i \,:\, t > T_i} \underbrace{Y(T_i)}_{\text{contagion process / levels of excitation}} \nu(t - T_i),$$
where $\hat{\lambda}_0 : \mathbb{R} \mapsto \mathbb{R}_+$ is a deterministic base intensity, $Y$ is a stochastic process, and $\nu : \mathbb{R}_+ \mapsto \mathbb{R}_+$ conveys the positive influence of the past events $T_i$ on the current value of the intensity process.
The contagion process, or levels of excitation, $(Y_i)_{i = 1, 2, \dots, N_T}$ measures the impact of clustering of the event times.
We take $\nu$ to be the exponential kernel $\nu(t) = e^{-\delta t}$.
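A minimal sketch of evaluating this intensity numerically, assuming a user-supplied base intensity function and one excitation level per event (the function names and parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def hawkes_intensity(t, base_fn, delta, event_times, excitations):
    """lambda(t) = base_fn(t) + sum_{i: T_i < t} Y_i * exp(-delta * (t - T_i)),
    using the exponential kernel nu(z) = exp(-delta * z)."""
    event_times = np.asarray(event_times, dtype=float)
    excitations = np.asarray(excitations, dtype=float)
    past = event_times < t
    excite = np.sum(excitations[past] * np.exp(-delta * (t - event_times[past])))
    return base_fn(t) + excite

# usage with a constant base intensity and made-up events / excitation levels
lam = hawkes_intensity(t=4.0, base_fn=lambda t: 0.5, delta=1.2,
                       event_times=[1.0, 2.5, 3.7], excitations=[0.8, 1.1, 0.6])
```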

7. Motivation for Stochastic Hawkes: Stochastic differential equations to describe the evolution of $Y$
Changes in the levels of excitation: $Y$ is assumed to satisfy
$$Y_\cdot = \int_0^{\cdot} \hat{\mu}(t, Y_t)\, dt + \int_0^{\cdot} \hat{\sigma}(t, Y_t)\, dB_t,$$
where $B$ is a standard Brownian motion and $t \in [0, T]$ with $T < \infty$.
Standing assumption: $Y_t > 0$ for all $t \ge 0$.
Two examples considered: geometric Brownian motion (GBM) and exponential Langevin.
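A minimal Euler-Maruyama sketch for simulating a path of such an SDE, assuming user-supplied drift $\hat{\mu}$ and volatility $\hat{\sigma}$; the GBM coefficients below are illustrative, and the discretisation only approximately preserves positivity:

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(y0, mu_fn, sigma_fn, horizon, n_steps):
    """Euler-Maruyama discretisation of dY_t = mu(t, Y_t) dt + sigma(t, Y_t) dB_t."""
    dt = horizon / n_steps
    y = np.empty(n_steps + 1)
    y[0] = y0
    for k in range(n_steps):
        t = k * dt
        d_b = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over [t, t + dt]
        y[k + 1] = y[k] + mu_fn(t, y[k]) * dt + sigma_fn(t, y[k]) * d_b
    return y

# GBM example: dY = mu * Y dt + sigma * Y dB (coefficient values are illustrative)
gbm_path = euler_maruyama(y0=1.0,
                          mu_fn=lambda t, y: 0.05 * y,
                          sigma_fn=lambda t, y: 0.20 * y,
                          horizon=1.0, n_steps=1000)
```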

8. Motivation for Stochastic Hawkes: Two representations for Stochastic Hawkes
Intensity based:
$$\lambda_t = a + (\lambda_0 - a)\, e^{-\delta t} + \sum_{i \,:\, T_i < t} Y_i\, e^{-\delta (t - T_i)}. \qquad (2)$$
Cluster based: immigrants and offspring.
1 We say an event time $T_i$ is an immigrant if it is generated from the base intensity $a + (\lambda_0 - a)\, e^{-\delta t}$;
2 otherwise we say $T_i$ is an offspring.
It is natural to introduce a variable that describes the specific process to which each event time $T_i$ corresponds: $Z_{i0} = 1$ if event $i$ is an immigrant, and $Z_{ij} = 1$ if event $i$ is an offspring of event $j$.
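A minimal sketch of the cluster-based view as a branching simulation; for brevity, a constant immigrant rate stands in for the time-varying base intensity $a + (\lambda_0 - a)e^{-\delta t}$, and an iid Gamma sampler stands in for the $Y$ process (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_clusters(base_rate, delta, horizon, sample_y):
    """Branching simulation on [0, horizon]: immigrants arrive at a constant
    base rate, and each event at time T_i spawns offspring from an
    inhomogeneous Poisson process with rate Y_i * exp(-delta * (t - T_i))."""
    n_immigrants = rng.poisson(base_rate * horizon)
    queue = list(rng.uniform(0.0, horizon, n_immigrants))
    events = []
    while queue:
        parent = queue.pop()
        events.append(parent)
        y = sample_y()                        # excitation level of this event
        n_children = rng.poisson(y / delta)   # expected offspring count is Y / delta
        waits = rng.exponential(1.0 / delta, n_children)
        queue.extend(parent + w for w in waits if parent + w < horizon)
    return np.sort(np.array(events))

# usage: constant immigrant rate with iid Gamma excitation levels
times = simulate_clusters(base_rate=0.5, delta=1.0, horizon=100.0,
                          sample_y=lambda: rng.gamma(2.0, 0.25))
```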

9. Motivation for Stochastic Hawkes: Quick recap of Stochastic Hawkes
Figure: the same sample path of the intensity function $\lambda(\cdot)$ as before, over event times $T_1, \dots, T_7$, with branching labels $Z_{10} = 1$, $Z_{20} = 1$, $Z_{32} = 1$ and excitation levels $Y_5$, $Y_6$; note the variation of heights, with $\mathrm{Cov}(Y_5, Y_6) \neq 0$.

10. Simulation and Inference: Outline
1 Motivation for Stochastic Hawkes
2 Simulation and Inference
3 Experimental Result
4 Summary

11. Simulation and Inference: Simulation & Inference
The simulation framework of Dassios & Zhao (2011) is adopted: decompose the inter-arrival event times into two independent, simpler random variables $S^{(1)}$ and $S^{(2)}$. Here $S_{j+1}$ is the inter-arrival time for the $(j+1)$-th jump: $S_{j+1} = T_{j+1} - T_j$.
Given the intensity function, we can derive the cumulative distribution function of $S_{j+1}$ as
$$F_{S_{j+1}}(s) = 1 - \exp\!\left( -\left(\lambda_{T_j^+} - a\right) \frac{1 - e^{-\delta s}}{\delta} - a s \right).$$
Decompose $S_{j+1}$ into $S^{(1)}_{j+1}$ and $S^{(2)}_{j+1}$:
$$\mathbb{P}(S_{j+1} > s) = \exp\!\left( -\left(\lambda_{T_j^+} - a\right) \frac{1 - e^{-\delta s}}{\delta} \right) \times e^{-a s} = \mathbb{P}\!\left(S^{(1)}_{j+1} > s\right) \mathbb{P}\!\left(S^{(2)}_{j+1} > s\right) = \mathbb{P}\!\left( \min\!\left( S^{(1)}_{j+1}, S^{(2)}_{j+1} \right) > s \right).$$

12. Simulation and Inference: Simulation & Inference (continued)
$$F_{S^{(1)}_{j+1}}(s) = \mathbb{P}\!\left(S^{(1)}_{j+1} \le s\right) = 1 - \exp\!\left( -\left(\lambda_{T_j^+} - a\right) \frac{1 - e^{-\delta s}}{\delta} \right), \qquad F_{S^{(2)}_{j+1}}(s) = \mathbb{P}\!\left(S^{(2)}_{j+1} \le s\right) = 1 - e^{-a s},$$
for $0 \le s < \infty$.
To simulate $S_{j+1}$, we simply need to simulate $S^{(1)}_{j+1}$ and $S^{(2)}_{j+1}$ independently. Simulating $S^{(2)}_{j+1}$ is trivial since it follows an exponential distribution with rate parameter $a$. To simulate $S^{(1)}_{j+1}$, we use the inverse CDF approach:
$$S^*_{j+1} = -\frac{1}{\delta} \ln\!\left( 1 + \frac{\delta \ln v}{\lambda_{T_j^+} - a} \right) \quad \text{if } \exp\!\left( -\frac{\lambda_{T_j^+} - a}{\delta} \right) \le v < 1,$$
and we discard $S^*_{j+1}$ otherwise, that is, when $v < \exp\!\left( -\frac{\lambda_{T_j^+} - a}{\delta} \right)$ (this corresponds to the defective part), where $v$ is simulated from a standard uniform distribution $V \sim U(0, 1)$.
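A minimal code sketch of this decomposition for a single inter-arrival draw, treating a discarded (defective) draw as $S^{(1)}_{j+1} = \infty$ so that the minimum then comes from $S^{(2)}_{j+1}$; updating the intensity after each jump is not shown, and $\lambda_{T_j^+} > a$ is assumed:

```python
import numpy as np

rng = np.random.default_rng(2)

def next_interarrival(lam_plus, a, delta):
    """Draw S_{j+1} = min(S1, S2); lam_plus is the intensity at T_j^+,
    i.e. just after the j-th jump (assumed to satisfy lam_plus > a)."""
    s2 = rng.exponential(1.0 / a)                # S2 ~ Exponential(a)
    v = rng.uniform()                            # V ~ U(0, 1)
    if v >= np.exp(-(lam_plus - a) / delta):     # non-defective part of S1
        s1 = -np.log(1.0 + delta * np.log(v) / (lam_plus - a)) / delta
    else:                                        # defective part: no S1 draw
        s1 = np.inf
    return min(s1, s2)
```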

13. Simulation and Inference: Simulation & Inference
Inference: a hybrid of MH and Gibbs.
Employing the branching representation enables the use of Gibbs sampling to learn $Z$, $\mu$ and $\sigma$.
The other parameters $a$, $\lambda_0$, $k$ and $Y$ are learned with the vanilla Metropolis-Hastings (MH) algorithm.
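A minimal sketch of the Gibbs step for the branching variables only, under the intensity of Eq. (2); the MH updates for the remaining parameters and the Gibbs updates for $\mu$ and $\sigma$ are not shown, and all function and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_branching(event_times, excitations, a, lam0, delta):
    """One Gibbs-style draw of the branching structure: event i is attributed to
    the base intensity (Z_i0 = 1) or to an earlier event j (Z_ij = 1), with
    probability proportional to that term's contribution to lambda(T_i)."""
    parents = []
    for i, t in enumerate(event_times):
        weights = [a + (lam0 - a) * np.exp(-delta * t)]           # base intensity term
        weights += [excitations[j] * np.exp(-delta * (t - event_times[j]))
                    for j in range(i)]                            # earlier events only
        probs = np.array(weights) / np.sum(weights)
        choice = int(rng.choice(len(weights), p=probs))
        parents.append(choice)   # 0 means immigrant; k >= 1 means parent is event k - 1
    return parents
```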

14. Experimental Result: Outline
1 Motivation for Stochastic Hawkes
2 Simulation and Inference
3 Experimental Result
4 Summary

15. Experimental Result: Synthetic validation
The inference algorithm is first tested on synthetic data generated from Stochastic Hawkes.
Event times are generated assuming $Y$ follows iid Gamma, GBM or exponential Langevin dynamics.
Recalibrating the parameters and subsequently sampling the posterior $Y$ gives the following interesting results.

16. Experimental Result: Inference learns the Gamma ground truth
Figure: posterior samples of $Y_t$ against time $t$ under the Gamma, GBM and exponential Langevin models, shown alongside the Gamma ground truth $Y$. All seems good.

17. Experimental Result: Inference learns GBM
Figure: posterior samples of $Y_t$ against time $t$ under the Gamma, GBM and exponential Langevin models, shown alongside the GBM ground truth $Y$.
The iid Gamma model fails, although a posteriori it tries to capture a downward trend. GBM learns well, and so does the exponential Langevin.

18. Experimental Result: Japanese Earthquakes Data (Di Giacomo et al. 2015)
Figure: plot of $Y$ against time for the Japanese earthquakes data, together with its sample autocorrelation function against lag.
$Y$ might not be iid, as earthquake occurrences tend to be correlated. Geophysical time series are frequently autocorrelated because of inertia or carry-over processes in the physical system. For a random series the sample autocorrelations should be near zero; otherwise they will be significantly non-zero.
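For reference, a minimal sketch of the sample autocorrelation behind such plots (the standard estimator on the mean-centred series; the exact estimator used for the figures is an assumption):

```python
import numpy as np

def sample_acf(series, max_lag=20):
    """Sample autocorrelation at lags 0..max_lag of a one-dimensional series."""
    y = np.asarray(series, dtype=float)
    y = y - y.mean()
    denom = np.sum(y * y)
    return np.array([np.sum(y[: len(y) - k] * y[k:]) / denom
                     for k in range(max_lag + 1)])
```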

19. Experimental Result: Autocorrelation functions, the SDE models retrieve a correlated Y
Figure: sample autocorrelation functions against lag for the ground truth $Y$ and for the posterior $Y$ under the Gamma, GBM and exponential Langevin models.
