
NEPR208 - Adaptation: properties and mechanisms



  1. NEPR208 - Adaptation: properties and mechanisms. Functional advantages in properties of a neural code, and changes in those properties. What is adaptation? Why do neural systems have a particular nonlinearity and filter? Why do the nonlinearity and filter change? A hierarchy of systems. What biophysical mechanisms can cause adaptation? [Figure: Linear-Nonlinear (LN) model, with a spatiotemporal receptive field filter and its preferred visual feature over time]

  2. What happens when stimulus statistics change? [Figure: stimulus intensity over time]

  3. What happens when stimulus statistics change? [Figure: stimulus intensity over time]

  4. Ganglion cell response curves shift to the mean light intensity. Sakmann and Creutzfeldt, Scotopic and mesopic light adaptation in the cat's retina (1969)

  5. Neurons have a limited dynamic range set by maximum and minimum output levels, and by noise

  6. Events with Poisson statistics: P(X = n | µ) = e^(−µ) µ^n / n!, where n = number of events in a time interval and µ = mean number of events in that interval. E(X) = µ, and variance = mean = µ. [Figure: expected frequency vs. number of events]

  7. How to model noise? Poisson distribution: independent events occurring at an average rate (photons, spiking); σ² = µ. Gaussian distribution: the sum of many independent processes, via the central limit theorem (membrane potential noise); σ² = a constant parameter. Binomial distribution: n independent outcomes, each with probability p (channel gating, vesicle fusion); σ² = np(1 − p); approximated by a Poisson distribution at low probability p.
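
To make the variance-mean relationships concrete, here is a minimal NumPy sketch (not from the slides; the sample sizes and parameters are arbitrary choices) that draws from each distribution and checks its variance against the formula above:

```python
# Compare the three noise models: Poisson, Gaussian, and binomial.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Poisson: independent events at an average rate (photons, spiking)
mu = 5.0
poisson = rng.poisson(mu, N)
print(f"Poisson:  mean={poisson.mean():.2f}  var={poisson.var():.2f}  (var = mu = {mu})")

# Gaussian: sum of many independent processes (membrane potential noise)
sigma = 2.0
gauss = rng.normal(0.0, sigma, N)
print(f"Gaussian: var={gauss.var():.2f}  (constant parameter sigma^2 = {sigma**2})")

# Binomial: n independent outcomes, each with probability p (channel gating)
n, p = 100, 0.2
binom = rng.binomial(n, p, N)
print(f"Binomial: mean={binom.mean():.2f}  var={binom.var():.2f}  (np(1-p) = {n*p*(1-p):.1f})")

# At low p the binomial is well approximated by a Poisson with mu = n*p
n, p = 1000, 0.005
low_p = rng.binomial(n, p, N)
print(f"Low-p binomial: mean={low_p.mean():.2f}  var={low_p.var():.2f}  (~Poisson, mu = {n*p})")
```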

  8. Nonlinearity

  9. Turtle cones: sensitivity and kinetics change with mean luminance. [Figure: responses on a dim vs. a bright background] Baylor & Hodgkin 1974

  10. Signal with Poisson distribution. [Figure: Poisson event rate and filtered signal vs. time, at increasing mean rates]

  11. What receptive field maximizes information transmission? Retinal bipolar cell receptive field. Baccus, Olveczky, Manu & Meister, 2008

  12. A Mathematical Theory of Communication, Claude Shannon (1948). Entropy: a measure of uncertainty, in bits. Bit: the unit of information. Mutual information: what you can learn about one signal by observing another. Bell System Technical Journal 27, 379-423
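
As a small illustration of entropy in bits (my example, not from the lecture), a few lines of Python:

```python
# Shannon entropy of a discrete distribution, in bits.
import numpy as np

def entropy_bits(p):
    """H = -sum p log2 p, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.5, 0.5]))        # 1.0 bit: one fair coin flip
print(entropy_bits([1.0]))             # 0.0 bits: no uncertainty
print(entropy_bits(np.full(8, 1/8)))   # 3.0 bits: 8 equally likely outcomes
```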

  13. Theory of maximizing information in a noisy neural system: 'Efficient Coding' (Horace Barlow). Natural visual scenes are dominated by low spatial and temporal frequencies. J.H. van Hateren. Real and optimal neural images in early vision. Nature 360:68-70 (1992). J.H. van Hateren. Spatiotemporal contrast sensitivity of early vision. Vision Res. 33:257-67 (1993)

  14. Linear filter and frequency response. [Figure: stimulus, filter, and response traces] Convolution theorem: h(t) = f(t) ∗ g(t) ⇔ h̃(ω) = f̃(ω) g̃(ω); a convolution in the time domain is a simple product in the frequency domain.
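
A minimal NumPy sketch of the convolution theorem (my example; the 256-sample signals and the exponential filter are arbitrary), verifying that a circular convolution in time matches a pointwise product of FFTs:

```python
# Convolution theorem: FFT product equals circular convolution.
import numpy as np

rng = np.random.default_rng(0)
N = 256
f = rng.standard_normal(N)            # stimulus
g = np.exp(-np.arange(N) / 10.0)      # exponential filter

# Frequency domain: h~(w) = f~(w) * g~(w)
h_freq = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

# Time domain: direct circular convolution h[k] = sum_m f[m] g[(k-m) mod N]
h_time = np.array([np.sum(f * np.roll(g[::-1], k + 1)) for k in range(N)])

print(np.allclose(h_freq, h_time))    # True
```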

  15. The optimal filter whitens, but also cuts out noise. [Figure: stimulus power spectrum ∝ 1/f, flat noise floor, the 'whitening' filter, the resulting filter, and the response]
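
One way to illustrate this (my construction, not the exact filter from the slide): combine a whitening term with a Wiener-style suppression term, so the filter's gain rises with frequency until noise dominates:

```python
# Whitening filter with a noise cutoff (illustrative construction).
import numpy as np

freqs = np.linspace(1, 500, 500)   # frequency axis (arbitrary units)
S = 1.0 / freqs                    # stimulus power spectrum ~ 1/f
N = 0.01                           # flat (white) noise power

whitening = 1.0 / np.sqrt(S)       # flattens the stimulus spectrum
wiener = S / (S + N)               # ~1 where signal >> noise, ~0 otherwise
W = whitening * wiener             # rises with f, then rolls off in noise

peak = freqs[np.argmax(W)]
print(f"filter gain peaks near f = {peak:.0f}, where noise suppression takes over")
```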

  16. Theory of maximizing information in a noisy neural system. Filter of fly Large Monopolar Cells (a 2nd-order visual neuron). At low background intensity the filter integrates over time (real and theoretical optimum); at high background intensity it emphasizes change and is more differentiating (real and theoretical optimum). Both are scaled in time to the first peak. J.H. van Hateren. Real and optimal neural images in early vision. Nature 360:68-70 (1992)

  17. Spatial adaptation in retinal ganglion cells. Barlow, FitzHugh & Kuffler (1957)

  18. Theories of efficient coding: An ideal encoder should use all output values with equal probability. Low frequencies dominate in natural scenes, so an efficient encoder should amplify high frequencies more than low frequencies. But when signals are noisier, such as when the signal is weak, high frequencies should be attenuated, as they carry little information.
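
The equal-probability idea can be sketched with histogram equalization: choose the nonlinearity to be the cumulative distribution of the inputs, so every output value is used equally often (a minimal sketch of my own; the Gaussian input is an arbitrary choice):

```python
# Histogram equalization: the empirical CDF as the input-output nonlinearity.
import numpy as np

rng = np.random.default_rng(0)
stimulus = rng.normal(0.0, 1.0, 100_000)   # Gaussian-distributed inputs

# Map each input to its quantile (empirical CDF)
order = np.argsort(stimulus)
output = np.empty_like(stimulus)
output[order] = np.arange(stimulus.size) / stimulus.size

# Outputs now fill [0, 1) with equal probability
counts, _ = np.histogram(output, bins=10, range=(0.0, 1.0))
print(counts)   # ~10,000 per bin
```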

  19. A simple model of visual responses: the Linear-Nonlinear (LN) model. [Figure: cones feed a bipolar cell through synaptic weights w1, w2, w3, and the bipolar cell feeds a ganglion cell; the spatiotemporal receptive field acts as a filter whose preferred visual feature unfolds over time]
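
A minimal sketch of an LN model in Python (the filter shape, the nonlinearity, and the rate scaling are my arbitrary choices, not fitted to data): a temporal filter, a static nonlinearity, and Poisson spiking.

```python
# Linear-Nonlinear (LN) model: filter -> nonlinearity -> Poisson spikes.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                                   # time step (s)
t_filter = np.arange(0, 0.5, dt)

# Biphasic temporal filter: the cell's preferred temporal feature
filt = np.sin(2 * np.pi * 4 * t_filter) * np.exp(-t_filter / 0.06)

# White-noise stimulus
stimulus = rng.standard_normal(5000)

# Linear stage: convolve stimulus with the filter
generator = np.convolve(stimulus, filt, mode="full")[: stimulus.size] * dt

# Nonlinear stage: rectifying static nonlinearity -> firing rate (Hz)
rate = 50.0 * np.maximum(generator, 0.0)

# Spiking stage: Poisson spike counts in each time bin
spikes = rng.poisson(rate * dt)
print(f"mean rate: {spikes.sum() / (stimulus.size * dt):.1f} Hz")
```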

  20. Deep Learning Object Recognition

  21. A “convolutional” layer: like a mosaic of retinal neurons

  22. Multiple cell types. [Figure: model units, each built from a filter, a scale factor, and a threshold] Like the multiple cell types in the retina.

  23. Multiple layers. [Figure: stacked layers of filter-scale-threshold units] Like the multiple cell types in the retina, and like the hierarchy of retinal circuitry.

  24. Object Recognition Deep Network

  25. Different sensory models for different questions: the LN model (a spatiotemporal filter and a nonlinearity), the LN-LN model, a minimal convolutional neural network (stimulus → convolutional layer 1 → convolutional layer 2 → dense layer), and a deep CNN. Along this spectrum the models become more expressive but less mechanistically and computationally interpretable.
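
A minimal sketch of such a "minimal CNN", assuming PyTorch (the layer sizes, kernel sizes, and the softplus output are my choices): stimulus movie → two convolutional layers → dense layer predicting firing rates for several cells.

```python
# Minimal CNN in the style named on the slide (PyTorch assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MinimalRetinaCNN(nn.Module):
    def __init__(self, n_frames=40, n_cells=5):
        super().__init__()
        # Temporal history enters as input channels; kernels are spatial
        self.conv1 = nn.Conv2d(n_frames, 8, kernel_size=15)
        self.conv2 = nn.Conv2d(8, 8, kernel_size=9)
        self.dense = nn.Linear(8 * 28 * 28, n_cells)   # one output per cell

    def forward(self, x):
        x = torch.relu(self.conv1(x))
        x = torch.relu(self.conv2(x))
        x = torch.flatten(x, start_dim=1)
        return F.softplus(self.dense(x))               # nonnegative rates

# A 50x50-pixel stimulus movie with 40 frames of history per sample
model = MinimalRetinaCNN()
movie = torch.randn(2, 40, 50, 50)
print(model(movie).shape)   # torch.Size([2, 5])
```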

  26. Adaptation to mean and variance. [Figure: stimulus intensity over time, with changes in both mean and variance]

  27. Why study biophysical mechanisms? Biophysics provides constraints, and biophysics provides a tool kit. [Figure: higher-level function, computations, and theory connected to lower-level mechanisms]

  28. Change in sensitivity by modulation: feedforward inhibition, feedback inhibition, spike-dependent conductances. Change in sensitivity by depletion: short-term synaptic plasticity (synaptic depression, pre- and postsynaptic), ion channel inactivation, receptor desensitization.

  29. ‘Equivalent’ circuit model of a neuron. [Figure: circuit between inside and outside of the membrane, with capacitance C, conductance G_K = 1/R_K in series with battery E_K, and injected current I_e] The currents balance, I_K + I_c = I_e, which gives the membrane equation C dV/dt = I_e − G_K (V − E_K).
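
A minimal sketch integrating this membrane equation with forward Euler (the conductance, capacitance, and current values are arbitrary choices):

```python
# Single-conductance membrane equation: C dV/dt = I_e - G_K (V - E_K).
C = 1.0       # membrane capacitance (nF)
G_K = 0.1     # potassium conductance (uS)
E_K = -77.0   # potassium reversal potential (mV)
dt = 0.1      # time step (ms)

V = E_K                       # start at rest
trace = []
for step in range(2000):
    I_e = 0.5 if 500 <= step < 1500 else 0.0   # current step (nA)
    dVdt = (I_e - G_K * (V - E_K)) / C
    V += dVdt * dt
    trace.append(V)

# Steady state during the step: V = E_K + I_e / G_K = -72 mV
print(f"V during step: {trace[1400]:.1f} mV")
```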

  30. A mathematical model of a neuron. [Figure: circuit with conductances g_Na, g_K, g_L, batteries E_Na, E_K, E_L, capacitance C, and injected current I_e] C dV/dt = I_e(t) − Σ_i g_i (V − E_i), where the sum runs over the individual ionic currents I_n(t). Alan Hodgkin and Andrew Huxley, 1952. [Figure: simulated membrane potential Vm and gating variable vs. time (ms)]

  31. Ionic currents (time dependence). [Figure: G_Na(t) shows activation then inactivation; G_K(t) shows activation only; both plotted vs. time after the start of a voltage pulse] I_K(t) = G_K(V, t) · (V − E_K); I_Na(t) = G_Na(V, t) · (V − E_Na).

  32. Rate constants. Kinetic model: Closed (R) ⇌ Open (A), with forward rate [L] k_on and backward rate k_off; the occupancies sum to 1. Change in activity = inflow − outflow: dA/dt = R [L] k_on − A k_off.

  33. Input and output in a kinetic model. Closed (R) ⇌ Open (A), with rate constants [L] k_on and k_off (1/s). [Figure: a step in [L] k_on drives the fractional occupancy of the open state from resting (0) toward active (1) over time]
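
A minimal sketch of this kinetic model driven by a ligand step (the rate constants and concentration are arbitrary choices):

```python
# Two-state kinetic model: dA/dt = R [L] k_on - A k_off, with R = 1 - A.
k_on = 10.0    # binding rate constant (1/(mM*s))
k_off = 2.0    # unbinding rate constant (1/s)
dt = 1e-3      # time step (s)

A = 0.0        # fractional occupancy of the open/active state
trace = []
for step in range(4000):
    L = 1.0 if 1000 <= step < 3000 else 0.0   # ligand step (mM)
    dAdt = (1.0 - A) * L * k_on - A * k_off   # inflow - outflow
    A += dAdt * dt
    trace.append(A)

# Steady state during the step: A = L*k_on / (L*k_on + k_off) = 10/12
print(f"occupancy during step: {trace[2900]:.3f}")
```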

  34. Steps for computing the model, focusing on the K+ current: C dV(t)/dt = I_e(t) − G_K n^4(t; α_n, β_n) · (V − E_K) − …, where G_K is the maximal conductance and n is the gating state variable. Closed ⇌ Open, with rates α(V) and β(V): dn/dt = α(V)(1 − n) − β(V) n.

  35. Steps for computing the model, focusing on the K+ current, C dV(t)/dt = I_e(t) − G_K n^4(t; α_n, β_n) · (V − E_K) − …, with gating dn/dt = α(V)(1 − n) − β(V) n: Start with V_m at time step t. Compute the rate constants as a function of V_m. Compute dn/dt and integrate one time step to get n(t). Compute the K current: I_K = G_K n^4 (V − E_K). Compute the total membrane current: I_m = I_K + I_Na + I_L. Integrate dV_m/dt to get V_m at the next time step.

  36. Hodgkin-Huxley model. Voltage state variable (membrane equation): C dV/dt = I(t) − g_K n^4 (V − E_K) − g_Na m^3 h (V − E_Na) − g_l (V − E_l). Conductance state variables: n, m, h. Rate "constants": α and β, each a function of voltage. Constants: E_K, E_Na, E_l. Note: these are given in the original form, relative to V_rest ≈ −66 mV.
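
A minimal sketch of the full model in Python, following the stepping recipe of slide 35. It uses the standard 1952 rate functions and constants in the common textbook form, with voltage v measured relative to rest and depolarization positive; the injection amplitude and spike-detection threshold are my choices.

```python
# Hodgkin-Huxley model, forward Euler integration.
import numpy as np

# Maximal conductances (mS/cm^2) and reversal potentials relative to rest (mV)
g_K, g_Na, g_L = 36.0, 120.0, 0.3
E_K, E_Na, E_L = -12.0, 115.0, 10.613
C = 1.0          # membrane capacitance (uF/cm^2)
dt = 0.01        # time step (ms)

# Voltage-dependent rate "constants" (1/ms)
def alpha_n(v): return 0.01 * (10.0 - v) / (np.exp((10.0 - v) / 10.0) - 1.0)
def beta_n(v):  return 0.125 * np.exp(-v / 80.0)
def alpha_m(v): return 0.1 * (25.0 - v) / (np.exp((25.0 - v) / 10.0) - 1.0)
def beta_m(v):  return 4.0 * np.exp(-v / 18.0)
def alpha_h(v): return 0.07 * np.exp(-v / 20.0)
def beta_h(v):  return 1.0 / (np.exp((30.0 - v) / 10.0) + 1.0)

v = 0.0
# Start gating variables at their steady-state values at rest
n = alpha_n(v) / (alpha_n(v) + beta_n(v))
m = alpha_m(v) / (alpha_m(v) + beta_m(v))
h = alpha_h(v) / (alpha_h(v) + beta_h(v))

spikes = 0
for step in range(30000):                        # 300 ms of simulated time
    I_e = 10.0 if step >= 5000 else 0.0          # current step (uA/cm^2)
    # Gating variables: dx/dt = alpha(v)(1 - x) - beta(v) x
    n += (alpha_n(v) * (1.0 - n) - beta_n(v) * n) * dt
    m += (alpha_m(v) * (1.0 - m) - beta_m(v) * m) * dt
    h += (alpha_h(v) * (1.0 - h) - beta_h(v) * h) * dt
    # Total ionic current, then integrate the membrane equation
    I_m = (g_K * n**4 * (v - E_K) + g_Na * m**3 * h * (v - E_Na)
           + g_L * (v - E_L))
    v_prev = v
    v += (I_e - I_m) / C * dt
    if v_prev < 50.0 <= v:                       # count upward 50 mV crossings
        spikes += 1

print(f"spikes during the current step: {spikes}")
```

With a sustained 10 uA/cm^2 injection this sketch fires repetitively, matching the qualitative behavior in the voltage trace of slide 30.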

  37. Adaptation to the mean and to the variance of signals is similar in a number of systems. The kinetics and gain of the response change when the stimulus statistics change. These adaptive properties can be interpreted as avoiding saturation and maximizing information in the presence of noise. Many mechanisms can contribute to these adaptive nonlinear properties, at different timescales.
