

  1. Combining biophysical and statistical methods for understanding neural codes. Liam Paninski, Department of Statistics and Center for Theoretical Neuroscience, Columbia University. http://www.stat.columbia.edu/∼liam, liam@stat.columbia.edu. October 22, 2007. Support: NIH CRCNS award, Sloan Research Fellowship, NSF CAREER award.

  2. The neural code. Input-output relationship between • External observables x (sensory stimuli, motor responses, ...) • Neural variables y (spike trains, population activity, ...). Probabilistic formulation: p(y | x)

  3. Basic goal: ...learning the neural code. Fundamental question: how to estimate p(y | x) from experimental data? General problem is too hard — not enough data, too many inputs x and spike trains y

  4. Avoiding the curse of insufficient data. Many approaches to make the problem tractable: 1: Estimate some functional f(p) instead, e.g., information-theoretic quantities (Paninski, 2003). 2: Select stimuli as efficiently as possible (Machens, 2002; Paninski, 2005; Lewi et al., 2006). 3: Fit a model with a small number of parameters

  5. Neural encoding models. “Encoding model”: p_θ(y | x) — fit the parameter θ instead of the full p(y | x). Main theme: we want the model to be flexible but not overly so. Flexibility vs. “fittability”

  6. Multiparameter HH-type model — highly biophysically plausible, flexible — but very difficult to estimate parameters given spike times alone (figure adapted from Fohlmeister and Miller, 1997)

  7. Cascade (“LNP”) model — easy to estimate via correlation-based methods (Simoncelli et al., 2004) — but not biophysically plausible (fails to capture spike timing details: refractoriness, burstiness, adaptation, etc.)
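For concreteness, a minimal sketch of the correlation-based estimate referred to here: with a white-noise stimulus, the spike-triggered average recovers the LNP filter up to a scale factor (Simoncelli et al., 2004). The function name, argument names, and binning convention below are illustrative assumptions, not taken from the original work.

```python
import numpy as np

def fit_lnp_filter_sta(stimulus, spike_counts, n_lags=20):
    """Spike-triggered-average estimate of an LNP model's linear filter.

    stimulus     : 1D array of white-noise stimulus values, one per time bin
    spike_counts : 1D array of spike counts in the same time bins
    """
    T = len(stimulus)
    # Matrix of stimulus histories: row t holds stimulus[t], stimulus[t-1], ...
    X = np.zeros((T, n_lags))
    for lag in range(n_lags):
        X[lag:, lag] = stimulus[:T - lag]
    # Spike-weighted average of stimulus histories = spike-triggered average.
    return X.T @ spike_counts / spike_counts.sum()
```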

  8. Two key ideas. 1. Use likelihood-based methods for fitting. — well-justified statistically — easy to incorporate prior knowledge, explicit noise models, etc. 2. Use models that are easy to fit via maximum likelihood. — concave (downward-curving) functions have no non-global local maxima ⇒ concave functions are easy to maximize by gradient ascent. Recurring theme: find flexible models whose loglikelihoods are guaranteed to be concave.
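A toy illustration of both ideas, assuming a Poisson spiking model with an exponential nonlinearity, which is one standard choice whose loglikelihood is provably concave in the filter; the learning rate, iteration count, and names below are placeholders.

```python
import numpy as np

def poisson_loglik(k, X, y, dt=1.0):
    """Concave loglikelihood of a Poisson model with rate exp(X @ k)."""
    linear = X @ k
    return np.sum(y * linear) - dt * np.sum(np.exp(linear))

def fit_by_gradient_ascent(X, y, dt=1.0, lr=1e-3, n_iter=5000):
    """Maximize the concave loglikelihood; there are no non-global maxima to get stuck in."""
    k = np.zeros(X.shape[1])
    for _ in range(n_iter):
        rate = np.exp(X @ k)
        grad = X.T @ (y - dt * rate)   # gradient of the Poisson loglikelihood
        k += lr * grad
    return k
```

Because the objective is concave, plain gradient ascent (or, in practice, Newton or conjugate-gradient methods) converges to the unique global maximum.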

  9. Filtered integrate-and-fire model: dV(t) = ( −g(t)V(t) + I_DC + k · x(t) + Σ_j h(t − t_j) ) dt + σ dN_t, where the sum runs over the previous spike times t_j (Paninski et al., 2004b)
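A forward-simulation sketch of these dynamics (a simple Euler step, with Gaussian increments standing in for the noise term σ dN_t); all parameter values and names below are placeholders rather than fitted quantities.

```python
import numpy as np

def simulate_filtered_if(x, k, h, g=0.5, I_dc=0.0, sigma=0.5,
                         V_th=1.0, V_reset=0.0, dt=1e-3, rng=None):
    """Euler simulation of dV = (-g V + I_dc + k.x(t) + sum_j h(t - t_j)) dt + sigma dN_t.

    x : stimulus, shape (T, d); k : filter, shape (d,); h : after-spike current, shape (n,)
    """
    rng = np.random.default_rng() if rng is None else rng
    T = x.shape[0]
    V = np.full(T, V_reset)
    after_spike = np.zeros(T)          # summed after-spike currents h(t - t_j)
    spikes = []
    for t in range(1, T):
        drift = -g * V[t - 1] + I_dc + k @ x[t - 1] + after_spike[t - 1]
        V[t] = V[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if V[t] >= V_th:               # threshold crossing: record spike, reset voltage
            spikes.append(t)
            V[t] = V_reset
            end = min(T, t + len(h))
            after_spike[t:end] += h[:end - t]
    return V, spikes
```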

  10. Model flexibility: Adaptation

  11. The estimation problem (Paninski et al., 2004b)

  12. First passage time likelihood. P(spike at t_i) = fraction of paths crossing threshold for the first time at t_i (via Fokker-Planck, integral equation, or EM; Paninski et al., 2004b; Paninski et al., 2007; Nikitchenko and Paninski, 2007)
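The papers compute this probability exactly via the Fokker-Planck or integral-equation approaches cited above; the sketch below merely illustrates the quantity with a crude Monte Carlo approximation under assumed leaky-integrator dynamics, with all names and defaults chosen for illustration.

```python
import numpy as np

def first_passage_prob(I_input, g, sigma, V_reset, V_th, spike_bin,
                       dt=1e-3, n_paths=10000, rng=None):
    """Monte Carlo estimate of P(first threshold crossing at bin spike_bin)
    for dV = (-g V + I_input(t)) dt + sigma dN_t, starting from V_reset."""
    rng = np.random.default_rng() if rng is None else rng
    V = np.full(n_paths, V_reset)
    alive = np.ones(n_paths, dtype=bool)   # paths that have not crossed threshold yet
    hits = 0
    for t in range(len(I_input)):
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        V = V + (-g * V + I_input[t]) * dt + noise
        crossed = alive & (V >= V_th)
        if t == spike_bin:
            hits = int(crossed.sum())      # first crossings in the observed spike bin
        alive &= ~crossed
    return hits / n_paths
```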

  13. Maximizing likelihood. Maximization seems difficult, even intractable: — high-dimensional parameter space — likelihood is a complex nonlinear function of the parameters. Main result: the loglikelihood is concave in the parameters, no matter what data {x(t), t_i} are observed ⇒ no non-global local maxima ⇒ maximization easy by ascent techniques.

  14. Application: retinal ganglion cells Preparation: dissociated macaque retina — extracellularly-recorded responses of populations of RGCs Stimulus: random “flicker” visual stimuli

  15. Spike timing precision in retina [Figure: comparison of RGC responses with LNP and IF model predictions; rate (sp/sec) and variance (sp²/bin) over the stimulus period] (Pillow et al., 2005)

  16. Linking spike reliability and subthreshold noise (Pillow et al., 2005)

  17. Likelihood-based discrimination. Given spike data, the optimal decoder chooses the stimulus x according to likelihood: p(spikes | x₁) vs. p(spikes | x₂). Using an accurate model is essential (Pillow et al., 2005)
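A minimal sketch of this decision rule: compute the model loglikelihood of the observed spikes under each candidate stimulus and choose the larger. A Poisson rate model is used below as an illustrative stand-in for the fitted IF model of Pillow et al. (2005); the names and bin size are assumptions.

```python
import numpy as np

def discriminate(spike_counts, rate_given_x1, rate_given_x2, dt=1e-3):
    """Return 1 or 2: which candidate stimulus makes the observed spikes more likely.

    rate_given_x1/2 : model-predicted firing rates (per bin) under each stimulus
    """
    def loglik(rate):
        rate = np.clip(rate, 1e-12, None)      # avoid log(0)
        # Poisson loglikelihood up to a constant that cancels in the comparison
        return np.sum(spike_counts * np.log(rate * dt) - rate * dt)
    return 1 if loglik(rate_given_x1) >= loglik(rate_given_x2) else 2
```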

  18. Example 2: decoding subthreshold activity. Given extracellular spikes, what is the most likely intracellular V(t)? [Figure: V (mV) over time (sec)]

  19. Computing V_ML(t). Loglikelihood of V(t) (given LIF parameters, white noise N_t): L({V(t)}_{0≤t≤T}) = −(1/2σ²) ∫₀ᵀ [ V̇(t) − ( −gV(t) + I(t) ) ]² dt. Constraints: • Reset at t = 0: V(0) = V_reset • Spike at t = T: V(T) = V_th • No spike for 0 < t < T: V(t) < V_th. Quadratic programming problem: optimize a quadratic function under linear constraints. Concave: unique global optimum.
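A discretized sketch of this quadratic program, written with the cvxpy modeling library purely for illustration (the choice of solver, the time step, and the small margin eps used in place of the strict inequality are assumptions, not details from the papers).

```python
import numpy as np
import cvxpy as cp

def decode_voltage_map(I, g, sigma, V_reset, V_th, dt=1e-3, eps=1e-3):
    """Most likely voltage path between a reset at t=0 and a spike at t=T,
    subject to the path staying below threshold in between."""
    T = len(I)
    V = cp.Variable(T + 1)
    # Residual between the discrete derivative and the deterministic drift.
    resid = (V[1:] - V[:-1]) / dt - (-g * V[:-1] + I)
    objective = cp.Minimize(cp.sum_squares(resid) * dt / (2 * sigma ** 2))
    constraints = [V[0] == V_reset,          # reset after the previous spike
                   V[T] == V_th,             # threshold crossing at the spike time
                   V[1:T] <= V_th - eps]     # subthreshold in between (strict < relaxed)
    cp.Problem(objective, constraints).solve()
    return V.value
```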

  20. Application: in vitro data. Recordings: rat sensorimotor cortical slice; dual-electrode whole-cell. Stimulus: Gaussian white noise current I(t). Analysis: fit IF model parameters {g, k, h(·), V_th, σ} by maximum likelihood (Paninski et al., 2003; Paninski et al., 2004a), then compute V_ML(t)

  21. Application: in vitro data [Figure: true V(t) and V_ML(t) (mV) over time (sec)] (Applications to the spike-triggered average (Paninski, 2006a; Paninski, 2006b).)

  22. Part 3: Back to detailed models Can we recover detailed biophysical properties? • Active: membrane channel densities • Passive: axial resistances, “leakiness” of membranes • Dynamic: spatiotemporal synaptic input

  23. Spatiotemporal voltage recordings (Djurisic et al., 2004)

  24. Conductance-based models. Key point: if we observe the full V_i(t) plus the cell geometry, the channel kinetics are known, and the current noise is log-concave, then the loglikelihood of the unknown parameters is concave. Gaussian noise ⇒ standard nonnegative regression (albeit high-dimensional).
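A sketch of that nonnegative regression, assuming the per-unit current time course of each candidate channel has already been computed from the observed V(t) and the known kinetics; scipy's nnls is used as one convenient solver, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import nnls

def estimate_channel_densities(V, channel_currents, dt=1e-4):
    """Nonnegative least-squares estimate of channel densities.

    V                : observed voltage trace, shape (T,)
    channel_currents : array (T-1, n_channels); column i is the current one unit
                       of channel i would produce, evaluated along the observed V(t)
    """
    dVdt = np.diff(V) / dt                         # observed voltage derivative
    densities, _ = nnls(channel_currents, dVdt)    # densities constrained >= 0
    return densities
```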

  25. Estimating channel densities from V(t) (Huys et al., 2006)

  26. Estimating channel densities from V(t) [Figure: voltage V and summed currents dV/dt over time (ms); true vs. inferred conductances (mS/cm²) for NaHH, KHH, Leak, NaM, KM, NaS, KAS]

  27. Estimating non-homogeneous channel densities and axial resistances from spatiotemporal voltage recordings

  28. Estimating synaptic inputs given V(t)

  29. Estimating synaptic inputs given V(t) [Figure A (synaptic conductances): voltage trace with true and inferred (MAP) excitatory and inhibitory spikes; B (channel conductances): true vs. inferred (ML) channel densities (mS/cm²) for HHNa, HHK, Leak, MNa, MK, SNa, SKA, SKDR]

  30. Estimating stimulus effects: dV/dt = I_channel + k · x(t) + σ N_t [Figure A–C: stimulus s₁(t), voltage V, and summed currents dV/dt over time (ms); D: true vs. inferred conductances (mS/cm²) for NaHH, KHH, Leak, NaM, KM, NaS, KAS; E: true vs. inferred stimulus filter]
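Under the same Gaussian-noise assumption, the stimulus filter k can simply be appended to the regression's design matrix. The sketch below uses ordinary least squares and, for brevity, ignores the nonnegativity constraint on the channel densities; the names are illustrative.

```python
import numpy as np

def estimate_channels_and_filter(V, channel_currents, stim_lags, dt=1e-4):
    """Least-squares fit of dV/dt ≈ channel_currents @ densities + stim_lags @ k.

    channel_currents : (T-1, n_channels) known per-unit channel currents
    stim_lags        : (T-1, n_lags) lagged copies of the stimulus x(t)
    """
    dVdt = np.diff(V) / dt
    X = np.hstack([channel_currents, stim_lags])      # joint design matrix
    coeffs, *_ = np.linalg.lstsq(X, dVdt, rcond=None)
    n_ch = channel_currents.shape[1]
    return coeffs[:n_ch], coeffs[n_ch:]               # channel densities, stimulus filter k
```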

  31. Dealing with incomplete observations: Kalman filter [Figure: V (mV); filtered estimate E[V(t) | Y(0:t)] vs. smoothed estimate E[V(t) | Y(1:T)]; estimate standard deviation (mV) over t (sec)]
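A scalar Kalman-filter sketch for this setting: linear voltage dynamics observed through noisy, possibly subsampled measurements. The dynamics model, noise variances, and names are assumptions for illustration only.

```python
import numpy as np

def kalman_filter_voltage(y, a=0.99, q=0.01, r=0.1, V_rest=-60.0, p0=1.0):
    """Scalar Kalman filter for V_t = V_rest + a*(V_{t-1} - V_rest) + w_t (var q),
    observed as y_t = V_t + e_t (var r); np.nan marks unobserved time bins."""
    T = len(y)
    means, variances = np.zeros(T), np.zeros(T)
    v, p = V_rest, p0
    for t in range(T):
        # Predict step: relax toward rest, inflate the uncertainty
        v = V_rest + a * (v - V_rest)
        p = a * a * p + q
        # Update step: only when this bin was actually observed
        if not np.isnan(y[t]):
            gain = p / (p + r)
            v += gain * (y[t] - v)
            p *= (1 - gain)
        means[t], variances[t] = v, p
    return means, variances
```

A backward (Rauch-Tung-Striebel) pass over these filtered estimates would yield the smoothed estimate E[V(t) | Y(1:T)] shown on the slide.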

  32. Smoothing given nonlinear dynamics — via particle filtering (Huys and Paninski, 2006)
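A bare-bones bootstrap particle filter sketch for the nonlinear case; the dynamics function f, noise levels, initial prior, and Gaussian observation model are all placeholder assumptions, not the model of Huys and Paninski (2006).

```python
import numpy as np

def bootstrap_particle_filter(y, f, sigma_dyn, sigma_obs, n_particles=1000, rng=None):
    """Bootstrap particle filter for V_t = f(V_{t-1}) + dynamics noise, y_t = V_t + obs. noise.

    f : arbitrary (nonlinear) dynamics function, applied elementwise to the particles.
    Returns the filtered posterior mean of V_t at each time step.
    """
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(-60.0, 5.0, n_particles)   # assumed initial prior over V
    means = np.zeros(len(y))
    for t, obs in enumerate(y):
        # Propagate particles through the nonlinear dynamics
        particles = f(particles) + sigma_dyn * rng.standard_normal(n_particles)
        # Weight by the Gaussian observation likelihood, then resample
        w = np.exp(-0.5 * ((obs - particles) / sigma_obs) ** 2)
        w = np.maximum(w, 1e-300)
        w /= w.sum()
        particles = rng.choice(particles, size=n_particles, p=w)
        means[t] = particles.mean()
    return means
```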

  33. Subsampling and noise

  34. Inferring spike rates from calcium observations (Vogelstein et al., 2007)
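One simple way to phrase this inference problem: treat fluorescence as a noisy, exponentially filtered version of the spike train and recover nonnegative spike counts by deconvolution. This is only a toy stand-in for the sequential Monte Carlo method of Vogelstein et al. (2007); the decay constant gamma and the nnls solver are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def infer_spikes_from_calcium(F, gamma=0.95):
    """Nonnegative deconvolution of a fluorescence trace F.

    Toy model: F_t = sum_{s<=t} gamma**(t-s) * n_s + noise, with spike counts n_s >= 0.
    """
    T = len(F)
    t_idx = np.arange(T)
    diffs = t_idx[:, None] - t_idx[None, :]
    # Convolution matrix of the exponentially decaying calcium impulse response
    A = np.where(diffs >= 0, gamma ** np.clip(diffs, 0, None), 0.0)
    n_hat, _ = nnls(A, np.asarray(F, dtype=float))
    return n_hat
```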

  35. Inferring spike rates from calcium observations

  36. Conclusions Advantages of model-based approach: • Flexibility of generative probabilistic framework • Direct biophysical interpretability of estimated parameters • Connections to statistical decoding methods, optimal experimental design (Paninski et al., 2008) • Direct quantification of uncertainty Next steps: • Further applications to data • Further relaxation of assumptions

  37. Collaborators Theory and numerical methods • Y. Ahmadian, S. Escola, G. Fudenberg, Q. Huys, J. Kulkarni, M. Nikitchenko, K. Rahnama, G. Szirtes, T. Toyoizumi, Columbia • E. Simoncelli, NYU • A. Haith, C. Williams, Edinburgh • M. Ahrens, J. Pillow, M. Sahani, Gatsby • J. Lewi, Georgia Tech • J. Vogelstein, Johns Hopkins Retinal physiology • E.J. Chichilnisky, J. Shlens, V. Uzzell, Salk Cortical in vitro physiology • B. Lau and A. Reyes, NYU
