9.54, fall semester 2014
Shimon Ullman, Tomaso Poggio, Danny Harari, Daniel Zysman, Darren Seibert
9.54, Class 3. Biophysics of Computation: synapses, dendritic trees, computational primitives, including Hebb-like plasticity rules
Biophysics of Computation
Traditional view (McCulloch & Pitts, 1943, and neural nets, ~1980-2015): basic mechanism
• threshold mechanism of the spike
Dendritic computation (~1970): basic mechanisms (examples)
• passive: shunting inhibition
• active: V- and t-dependent channels in dendrites
• others
Threshold units
• Threshold units are universal
• The threshold mechanism can be identified with spike generation in the soma of a neuron
Threshold units are universal
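As a minimal sketch of the universality claim (illustrative Python, not from the slides; the weights and thresholds are hand-chosen assumptions), a single McCulloch-Pitts threshold unit can realize each of AND, OR, and NOT, and networks of such units can therefore compute any Boolean function:

```python
import numpy as np

# Illustrative sketch: a McCulloch-Pitts threshold unit. One unit per gate
# suffices for AND, OR, NOT, so networks of threshold units are Boolean-universal.

def threshold_unit(x, w, theta):
    """Fire (1) iff the weighted sum of inputs reaches the threshold theta."""
    return int(np.dot(w, x) >= theta)

AND = lambda a, b: threshold_unit([a, b], w=[1, 1], theta=2)
OR  = lambda a, b: threshold_unit([a, b], w=[1, 1], theta=1)
NOT = lambda a:    threshold_unit([a],    w=[-1],   theta=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```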
Threshold units can be identified with a neuron's spike mechanism
Hodgkin-Huxley equations
Leaky-integrator approximation
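A minimal sketch of the leaky-integrator approximation referred to above: a leaky integrate-and-fire neuron that integrates input current and emits a spike when a fixed threshold is crossed. All parameter values (time constant, threshold, input current) are assumptions chosen for illustration, not taken from the slides.

```python
import numpy as np

# Leaky integrate-and-fire sketch: membrane voltage integrates input with a
# leak; crossing the threshold produces a spike and a reset. Parameters are
# illustrative assumptions.
def lif(I, dt=0.1, tau=10.0, R=1.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    v, spike_times = v_rest, []
    for t, i_t in enumerate(I):
        v += dt / tau * (-(v - v_rest) + R * i_t)   # leaky integration
        if v >= v_thresh:                           # threshold mechanism = spike
            spike_times.append(t * dt)
            v = v_reset
    return spike_times

spikes = lif(I=np.full(1000, 1.5))  # constant suprathreshold input, 100 ms
print(len(spikes), "spikes; first at t =", spikes[0], "ms")
```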
Perceptrons and neural networks
• These systems, which are not Boolean, are also universal in the sense of being able to approximate any continuous function (polynomials are dense in the space of continuous functions)
• Active properties of neurons can also implement …
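As a rough illustration of this universal-approximation claim (not from the slides; the network size, random hidden weights, and target function are arbitrary choices for the sketch), a one-hidden-layer network of sigmoidal units fit by least squares can approximate a continuous target on an interval:

```python
import numpy as np

# Sketch: fixed random sigmoidal hidden units, output weights fit by least
# squares. With enough hidden units the fit error on a compact interval can
# be made small (universal approximation).
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)[:, None]
target = np.sin(3 * x).ravel()

n_hidden = 50
W = rng.normal(size=(1, n_hidden)) * 3
b = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(x @ W + b)))               # hidden-layer activations
w_out, *_ = np.linalg.lstsq(H, target, rcond=None)   # fit output weights

approx = H @ w_out
print("max |error| =", np.max(np.abs(approx - target)))
```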
Katz & Miledi, 1964
Relative motion
Towards the neural circuitry: Reichardt, Poggio & Hausen, 1983
Relative motion: feedforward model
Towards the neural circuitry: Reichardt, Poggio & Hausen, 1983
The circuit uses normalization (pre-Heeger), gain control, and a max-like operation:

$$y_i = \frac{(x_i)^r}{\beta + \sum_{j=1}^{N} (x_j)^q}$$

where the $y_i$ are the outputs after shunting inhibition, the $x_j$ are the inputs, and $r$, $q$ are approximations of pre- and postsynaptic nonlinearities.
Towards the neural circuitry: Reichardt, Poggio & Hausen, 1983
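A minimal numerical sketch of the normalization stage above (the function name, parameter values, and example inputs are illustrative assumptions, not from the slides); note how raising the exponents makes the operation max-like:

```python
import numpy as np

# Divisive normalization / gain control: y_i = x_i^r / (beta + sum_j x_j^q).
# For large r = q the strongest input dominates (max-like behavior).
def normalize(x, r=2.0, q=2.0, beta=0.1):
    x = np.asarray(x, dtype=float)
    return x**r / (beta + np.sum(x**q))

print(normalize([0.2, 0.5, 1.0]))             # graded, gain-controlled outputs
print(normalize([0.2, 0.5, 1.0], r=8, q=8))   # nearly winner-take-all (max-like)
```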
Katz & Miledi, 1964
Biophysics of Computation
Background on neurons and synapses (many slides from a course by Rao; see 9.40)
Dendritic Computation
Passive (linear) cable
General solution
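The figures for these slides are not reproduced here; for reference, the passive cable equation and its steady-state general solution take the following standard textbook form (with $\lambda$ the space constant and $\tau$ the membrane time constant; this is not copied from the original figures):

```latex
% Passive (linear) cable equation:
\lambda^2 \frac{\partial^2 V}{\partial x^2} \;=\; \tau \frac{\partial V}{\partial t} + V
% Steady state (\partial V / \partial t = 0), general solution:
V(x) = A\, e^{-x/\lambda} + B\, e^{\,x/\lambda}
```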
Biophysics of Computation
Dendritic computations
• passive nonlinearities: shunting inhibition
• active nonlinearities: V- and t-dependent channels in dendrites
[e3 ANDNOT (i1 OR i2 OR i3)] OR [e2 ANDNOT (i1 OR i2)] OR (e1 ANDNOT i1)
(e1 ANDNOT i1) OR (e2 ANDNOT i2) OR {[(e3 ANDNOT i3) OR (e4 ANDNOT i4) OR (e5 ANDNOT i5) OR (e6 ANDNOT i6)] ANDNOT i7}
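A minimal sketch of how the second expression can be read as Boolean logic, assuming ANDNOT(e, i) denotes an excitatory input e vetoed by an on-path (shunting) inhibitory input i; the helper names and the example input pattern are hypothetical, not from the slides:

```python
# Dendritic "ANDNOT" veto as Boolean logic (illustrative sketch).
def andnot(e: bool, i: bool) -> bool:
    """Shunting-inhibition veto: e drives the branch unless i silences it."""
    return e and not i

def branch_output(e1, e2, e3, e4, e5, e6, i1, i2, i3, i4, i5, i6, i7):
    """Second example expression from the slide, evaluated as Boolean logic."""
    distal = (andnot(e3, i3) or andnot(e4, i4) or
              andnot(e5, i5) or andnot(e6, i6))
    return andnot(e1, i1) or andnot(e2, i2) or andnot(distal, i7)

# Example: e3 is active, but the more proximal inhibition i7 vetoes the whole
# distal subtree, and e1/e2 are silent, so the output is False.
print(branch_output(False, False, True, False, False, False,
                    False, False, False, False, False, False, True))
```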
Biophysics of Computation
Background on active membranes and spikes
New model for CS cells (see later in class)
Traditional circuits for simple and complex cells since Hubel & Wiesel (HW)
How to compile into one cell
Plasticity and Learning: Adapting the Connections
We will see in the next few classes on supervised learning how synaptic weights can be modified during training to solve useful tasks.
But... how does the brain modify synaptic weights? What are the biophysical mechanisms?
Learning algorithms and biophysical mechanisms
Unsupervised Learning
- Synapses adapted based solely on inputs
- Network self-organizes in response to statistical patterns in the input
- Similar to probability density estimation in statistics
Supervised Learning
- Synapses adapted based on inputs and desired outputs
- External "teacher" provides the desired output for each input
- Goal: function approximation
Reinforcement Learning
- Synapses adapted based on inputs and (delayed) reward/punishment
- Goal: pick outputs that maximize total expected future reward
- Similar to optimization based on Markov decision processes
Biophysical mechanisms for learning
• Hebb rule for unsupervised learning
• Hebb rule + supervised modulation of neural threshold and gain for supervised learning
• Dopamine machinery for reinforcement learning (RL)
Hebb rule + supervised modulation of neural threshold and gain for supervised learning
• Hebb rule
• with normalization
• decreases error
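The equations on this slide did not survive extraction; as a stand-in, here is a minimal sketch of the standard Hebbian update with explicit weight normalization (learning rate, input statistics, and the number of steps are assumptions for illustration):

```python
import numpy as np

# Hebbian learning with normalization: dw_i = eta * x_i * y, then rescale w so
# it does not grow without bound. Under correlated inputs the weight vector
# drifts toward the dominant direction of the input correlation matrix.
def hebb_step(w, x, eta=0.05):
    y = np.dot(w, x)                 # postsynaptic activity
    w = w + eta * y * x              # Hebb: strengthen co-active connections
    return w / np.linalg.norm(w)     # normalization keeps ||w|| = 1

rng = np.random.default_rng(1)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(2000):
    x = rng.multivariate_normal([0, 0], [[3.0, 1.0], [1.0, 1.0]])
    w = hebb_step(w, x)
print(w)   # approximately the first eigenvector of the input covariance
```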
LTP and LTD
LTP
STDP
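These LTP/LTD and STDP slides are figure-only; as a minimal sketch of the STDP idea they illustrate, here is the standard exponential timing window (amplitudes and time constants are assumed illustrative values, not taken from the slides):

```python
import numpy as np

# STDP window: weight change as a function of dt = t_post - t_pre.
# Pre-before-post (dt > 0) gives potentiation (LTP); post-before-pre (dt < 0)
# gives depression (LTD). Parameters are illustrative assumptions.
def stdp(dt, A_plus=0.01, A_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    A_plus * np.exp(-dt / tau_plus),      # potentiation
                    -A_minus * np.exp(dt / tau_minus))    # depression

print(stdp([-40.0, -10.0, 10.0, 40.0]))  # dw for four timing differences (ms)
```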