Spiking neural models: from point processes to partial differential equations
Julien Chevallier
Co-workers: M. J. Cáceres, M. Doumic and P. Reynaud-Bouret
LJAD, University of Nice, and INRIA Sophia-Antipolis
2016/06/09
Outline
1 Introduction (neurobiological context, microscopic modelling, macroscopic modelling)
2 A key tool: the thinning procedure
3 First approach: mathematical expectation
4 Second approach: mean-field interactions
Biological context: from the microscopic to the macroscopic scale

[Figure: action potential trace, voltage (mV) versus time (ms), showing depolarization, repolarization, the threshold (−55 mV), the resting state (−70 mV), failed initiations, the stimulus and the refractory period.]

Neurons = electrically excitable cells.
Action potential = spike of the electrical potential.
Physiological constraint: refractory period.
Microscopic modelling of spike trains

Time point processes = random countable sets of times (points of $\mathbb{R}$ or $\mathbb{R}_+$).
Point process: $N = \{T_i,\, i \in \mathbb{Z}\}$ such that $\cdots < T_0 \le 0 < T_1 < \cdots$.
Point measure: $N(dt) = \sum_{i \in \mathbb{Z}} \delta_{T_i}(dt)$. Hence $\int f(t)\, N(dt) = \sum_{i \in \mathbb{Z}} f(T_i)$.
Age process: $(S_{t-})_{t \ge 0}$, where the age is the delay since the last spike.

Stochastic intensity
Heuristically,
$$\lambda_t = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, \mathbb{P}\big( N([t, t+\Delta t]) = 1 \,\big|\, \mathcal{F}^N_{t-} \big),$$
where $\mathcal{F}^N_{t-}$ denotes the history of $N$ before time $t$.
Local behaviour: probability of finding a new spike. May depend on the past (e.g. refractory period, aftershocks).
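To make the stochastic intensity concrete, here is a minimal simulation sketch: a point process whose intensity depends only on the age, $\lambda_t = f(S_{t-})$, is obtained by thinning a homogeneous Poisson process whose rate dominates the intensity. The function f, the refractory length, the bound LAMBDA_MAX and the empty past before time 0 are illustrative assumptions, not the specification used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(age, refractory=1.0, rate=2.0):
    # hypothetical age-dependent intensity: silent during the refractory period,
    # then a constant firing rate
    return 0.0 if age < refractory else rate

T, LAMBDA_MAX = 50.0, 2.0                    # f never exceeds LAMBDA_MAX
spikes, last_spike, t = [], -np.inf, 0.0     # empty past before 0 (simplification)
while True:
    t += rng.exponential(1.0 / LAMBDA_MAX)   # candidate of a rate-LAMBDA_MAX Poisson process
    if t > T:
        break
    age = t - last_spike                     # age S_{t-}: delay since the last spike
    if rng.uniform() < f(age) / LAMBDA_MAX:  # keep the candidate with probability lambda_t / LAMBDA_MAX
        spikes.append(t)
        last_spike = t
```

The same rejection step works for any intensity with a known upper bound, which is the usual form of a thinning construction.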
Some classical point processes in neuroscience

Poisson process: $\lambda_t = \lambda(t)$ (deterministic, no refractory period).
Renewal process: $\lambda_t = f(S_{t-})$ $\Leftrightarrow$ i.i.d. ISIs (refractory period).
Linear Hawkes process: $\lambda_t = \mu + \int_0^{t-} h(t-z)\, N(dz) = \mu + \sum_{T \in N,\, T < t} h(t-T)$, with $h \ge 0$.
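As a worked illustration of the linear Hawkes intensity, the sketch below simulates such a process by Ogata-style thinning for the common exponential kernel $h(x) = \alpha e^{-\beta x}$; the kernel choice and the parameter values are assumptions made for this example, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed parameters: exponential kernel h(x) = alpha * exp(-beta * x),
# with alpha / beta < 1 so that the process is stable.
mu, alpha, beta, T = 1.0, 0.8, 1.2, 100.0

def lam(t, spikes):
    """Conditional intensity lambda_t = mu + sum over past spikes of h(t - T_i)."""
    return mu + alpha * sum(np.exp(-beta * (t - Ti)) for Ti in spikes if Ti < t)

spikes, t = [], 0.0
while True:
    M = lam(t, spikes) + alpha              # local bound: the intensity only jumps by alpha at a spike
    t += rng.exponential(1.0 / M)           # candidate point of a rate-M Poisson process
    if t > T:
        break
    if rng.uniform() < lam(t, spikes) / M:  # thinning step: accept with probability lambda_t / M
        spikes.append(t)
```

Between spikes the exponential kernel makes the intensity decrease, and an accepted spike increases it by at most $\alpha = h(0)$, so the current value plus $\alpha$ is a valid upper bound until the next candidate, which justifies the rejection step.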
Age structured equations (K. Pakdaman, B. Perthame, D. Salort, 2010)

Age = delay since the last spike.
$n(t,s)$ = probability density of finding a neuron with age $s$ at time $t$ (equivalently, the ratio of the neural population with age $s$ at time $t$).

(PPS)
$$\frac{\partial n(t,s)}{\partial t} + \frac{\partial n(t,s)}{\partial s} + p(s, X(t))\, n(t,s) = 0,$$
$$n(t,0) = \int_0^{+\infty} p(s, X(t))\, n(t,s)\, ds \qquad \text{(mean firing rate)}.$$

Parameters
Rate function $p$. For example, $p(s, X) = \mathbf{1}_{\{s > \sigma(X)\}}$.
Global neural activity $X(t) = \int_0^t d(t-x)\, n(x,0)\, dx$, where $d$ is a delay function accounting for the propagation time. For example, $d(x) = e^{-\tau x}$.
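To see how (PPS) behaves numerically, here is a minimal first-order upwind discretisation sketch. For simplicity it assumes a rate of the form $p(s, X) = \mathbf{1}_{\{s > \sigma_0\}}$ with a fixed threshold $\sigma_0$ and drops the dependence on the delayed activity $X(t)$; the grid sizes, the threshold and the initial density are illustrative choices, not values from the talk.

```python
import numpy as np

# First-order upwind sketch for (PPS), assuming p(s, X) = 1_{s > sigma0} with a
# fixed threshold and no dependence on the delayed activity X(t).
ds = dt = 0.01                      # transport speed in age is 1, so dt <= ds (CFL)
s = np.arange(0.0, 10.0, ds)        # age grid
sigma0 = 1.0                        # hypothetical refractory length
rate = (s > sigma0).astype(float)   # p(s, .) on the grid

n = np.exp(-s)                      # illustrative initial age density ...
n /= n.sum() * ds                   # ... normalised to total mass 1

for _ in range(int(20.0 / dt)):
    firing = (rate * n).sum() * ds                              # mean firing rate = n(t, 0)
    # transport towards larger ages + loss of the neurons that fire
    n[1:] = n[1:] - (dt / ds) * (n[1:] - n[:-1]) - dt * rate[1:] * n[1:]
    n[0] = firing                                               # firing neurons re-enter at age 0

# n now approximates n(T, .) at T = 20; with this simplified p it relaxes to a steady profile.
```

At each step the loss term removes the neurons that fire, and the same mass is reinjected at age 0 through the boundary condition, so the total mass is conserved up to the discretisation error.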