
Introduction to Artificial Neural Networks (ANNs), Keith L. Downing (PowerPoint presentation)



  1. Introduction to Artificial Neural Networks (ANNs). Keith L. Downing, The Norwegian University of Science and Technology (NTNU), Trondheim, Norway. keithd@idi.ntnu.no. January 19, 2015.

  2. NETtalk (Sejnowski and Rosenberg, 1986). [Diagram: a sliding window of letters, with silent context letters flanking the target letter, feeds a hidden "Concepts" layer, which maps to phonemes.] DEC's DECtalk: several person-years of engineering → a reading machine. NETtalk: 10 hours of backprop training on a 1000-word text, T1000. 95% accuracy on T1000; 78% accuracy on novel text. Its improvement during training sounds like a child learning to read. The concept layer is key: 79 different (overlapping) clouds of neurons gradually form, each mapping to one of the 79 phonemes.

  3. Sample ANN Applications: Forecasting. (1) Train the ANN (typically using backprop) on historical data to learn [X(t−k), X(t−k+1), ..., X(t₀)] ↦ [X(t₁), ..., X(t_{m−1}), X(t_m)]. (2) Use it to predict future value(s) based on the past k values. Sample applications (Ungar, in Handbook of Brain Theory and NNs, 2003): car sales; airline passengers; currency exchange rates; electrical loads on regional power systems; flour prices; stock prices (warning: often tried, but few good, documented results).
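The windowing scheme on this slide can be sketched in a few lines. This is a minimal illustration, not from the slides themselves; the helper name `make_windows` and the sales figures are made up for the example.

```python
# Build forecasting training cases: the past k values of a series
# map to the next m values, i.e. [X(t-k),...,X(t-1)] -> [X(t),...,X(t+m-1)].

def make_windows(series, k, m):
    """Slide a window over the series, yielding (past, future) pairs."""
    cases = []
    for t in range(k, len(series) - m + 1):
        past = series[t - k:t]      # the k most recent values
        future = series[t:t + m]    # the m values to predict
        cases.append((past, future))
    return cases

# Example: a short monthly sales series, predicting 1 step ahead from the past 3.
sales = [10, 12, 13, 15, 16, 18, 21]
cases = make_windows(sales, k=3, m=1)
# First case: ([10, 12, 13], [15])
```

Each resulting pair is one supervised training case for the network; backprop then fits the mapping from window to target.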

  4. Brain-Computer Interfaces (BCI). [Diagram: scalp EEG and neural-ensemble recordings → neural action in context.] (1) Ask the subject to think about an activity (e.g. moving a joystick left). (2) Register brain activity, via EEG waves (non-invasive) or neural ensembles (invasive); an ANN training case = (brain readings, joystick motion). (3) Sample applications (Millan, in Handbook of Brain Theory and NNs, 2003): keyboards (3 keystrokes per minute); artificial (prosthetic) hands; wheelchairs; computer games.

  5. Brains as Bio-Inspiration. [Diagram: distributed "clouds" of neurons encoding concepts such as Texas, "watermelon", movie and song quotes, and grandmother.] Distributed memory: a key to the brain's success, and a major difference between it and computers. Brain operations are slower than a computer's, but massively parallel. How can the brain inspire AI advances? What is the proper level of abstraction?

  6. Signal Transmission in the Brain. [Diagram: dendrites carry synaptic potentials (SPs) to the nucleus; the axon carries the action potential (AP).] Action potential (AP): a wave of voltage change along an axon. The nucleus (soma) generates an AP if the sum of its incoming synaptic potentials (SPs: similar, but weaker, voltage changes along the dendrites) is strong enough. Unlike neuroscientists, AI people rarely distinguish between APs and SPs; both are just signals.

  7. Ion Channels. [Diagram: Na+ and Ca++ channels drive depolarization; K+ channels drive repolarization.]

  8. Depolarization and Repolarization. [Plot of membrane potential over time: resting potential at −65 mV; Na+ gates open and Na+ influx depolarizes the membrane, overshooting to +40 mV; then Na+ gates close, K+ gates open, and K+ efflux repolarizes the membrane, with a brief undershoot before returning to rest.]

  9. Transferring APs across a Synapse. [Diagram: an action potential (AP) reaches the presynaptic terminal; vesicles release neurotransmitter (NT) into the synapse, where it binds NT-gated ion channels on the postsynaptic terminal.] Neurotransmitters: excitatory, e.g. glutamate (acting on AMPA receptors), which opens Na+ and Ca++ channels; inhibitory, e.g. GABA, which opens K+ channels.

  10. Location, Location, Location... of Synapses. [Diagram: proximal and distal synapses along the dendritic tree, between the axons and the soma.] Distal and proximal synapses: synapses closer to the soma normally have a stronger effect.

  11. Donald Hebb (1949): Fire Together, Wire Together. "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells, such that A's efficiency, as one of the cells firing B, is increased." Hebb rule: Δw_ij = λ o_i o_j. Instrumental in the binding of: pieces of an image; words of a song; multisensory input (e.g. words and images); sensory inputs and proper motor outputs; simple movements of a complex action sequence.
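The Hebb rule on this slide, Δw_ij = λ o_i o_j, is simple enough to state directly in code. A minimal sketch (function and variable names are illustrative, not from the slides):

```python
# Hebb rule for one synapse: the weight grows in proportion to the
# product of pre-synaptic output o_j and post-synaptic output o_i.

def hebb_update(w, o_i, o_j, lam=0.1):
    """Return the updated weight w + lambda * o_i * o_j."""
    return w + lam * o_i * o_j

# Two cells that "fire together" (both outputs 1.0) wire together:
w = 0.0
for _ in range(5):
    w = hebb_update(w, o_i=1.0, o_j=1.0)   # correlated firing: weight grows

# If the post-synaptic cell is silent, the weight is unchanged:
w_silent = hebb_update(w, o_i=0.0, o_j=1.0)
```

Note that this basic rule only strengthens weights; it never weakens them, which is one motivation for the normalized and covariance-based variants used in practice.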

  12. Coincidence Detection and Synaptic Change. Two key synaptic changes: (1) the propensity to release neurotransmitter (and the amount released) at the pre-synaptic terminal; (2) the ease with which the post-synaptic terminal depolarizes in the presence of neurotransmitters. Coincidences: (1) pre-synaptic: adenyl cyclase (AC) detects the simultaneous presence of Ca++ and serotonin; (2) post-synaptic: NMDA receptors detect the co-occurrence of glutamate (a neurotransmitter) and depolarization.

  13. Pre-synaptic Modification. [Diagram: depolarization admits Ca++ into the pre-synaptic terminal, while a salient event releases serotonin (5HT), which binds adenyl cyclase (AC); AC converts ATP to cAMP, activating PKA; glutamate released into the synapse reaches the Mg++-blocked NMDA receptor on the post-synaptic terminal.]

  14. Post-synaptic Modification. [Diagram: in the polarized (relaxed) postsynaptic state, net negative charge holds the Mg++ plug in the NMDA receptor, blocking Ca++ entry even when glutamate binds; in the depolarized (firing) state, net positive charge expels the Mg++ plug and Ca++ flows in.]

  15. Neurochemical Basis of Hebbian Learning. Fire together: when the pre- and post-synaptic terminals of a synapse depolarize at about the same time, the NMDA channels on the post-synaptic side notice the coincidence and open, allowing Ca++ to flow into the post-synaptic terminal. Wire together: Ca++ (via CaMKII and protein kinase C) promotes post- and pre-synaptic changes that enhance the efficiency of future AP transmission.

  16. Hebbian Basis of Classical Conditioning. [Diagram: "hear bell" (CS) and "see food" (US) synapse (S2, S1) onto the neuron driving "salivate" (R).] Unconditioned stimulus (US): sensory input normally associated with a response (R), e.g. the sight of food stimulates salivation. Conditioned stimulus (CS): sensory input having no previous correlation with a response, but which becomes associated with it, e.g. Pavlov's bell.
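The Hebbian account of conditioning can be sketched numerically: the US already drives the response, and pairing the CS with the US lets the Hebb rule (Δw = λ · pre · post) grow the CS→R weight. All names, weights, and the learning rate below are illustrative assumptions, not values from the slides.

```python
# Hebbian sketch of classical conditioning: pairing bell (CS) with
# food (US) strengthens the bell -> salivate connection.

w_us = 1.0   # see-food -> salivate: already strong (unconditioned)
w_cs = 0.0   # hear-bell -> salivate: initially no association
lam = 0.2    # learning rate

for trial in range(10):                 # paired presentations: bell + food
    cs, us = 1.0, 1.0
    response = us * w_us + cs * w_cs    # salivation, driven mainly by the US
    w_cs += lam * cs * response         # Hebb: pre (CS) x post (R)

bell_alone = 1.0 * w_cs                 # after training, the bell alone
                                        # now evokes a salivation response
```

Because the US reliably fires R whenever the CS is present, the CS terminal repeatedly "takes part in firing" R, which is exactly Hebb's condition for strengthening.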

  17. Long-Term Potentiation (LTP). Early phase: chemical changes to the pre- and post-synaptic terminals, due to AC and NMDA activity respectively, increase the probability (and efficiency) of AP transmission for minutes to hours after training. Late phase: structural changes occur to the link between the upstream and downstream neuron. This often involves increases in the numbers of axons and dendrites linking the two, and seems to be driven by chemical processes triggered by high concentrations of Ca++ in the post-synaptic soma.

  18. Abstraction. Human brains: ~10^11 neurons; ~10^14 connections between them (a.k.a. synapses), many modifiable; complex physical and chemical activity to transmit ONE action potential (AP) (a.k.a. signal) along ONE connection. Artificial neural networks: N = 10^1 to 10^4 nodes; at most N^2 connections; all physics and chemistry represented by a few parameters associated with nodes and arcs.

  19. Structural Abstraction. [Diagram: biological somas, axonal and dendritic compartments, and synapses map onto ANN nodes connected by weighted (w) arcs.]

  20. Diverse ANN Topologies. [Diagram: six example network topologies, labeled A through F.]

  21. Functional Abstraction. [Diagram: an ANN node (with integrate, activate, reset, and learn functions and weights w12, w13 to nodes N2, N3) abstracts the electrical-circuit model of the membrane, where the lipid bilayer acts as a capacitor (CM) and each ion channel as a resistor (RK, RNa) in series with its battery (EK, ENa).]

  22. Main Functional Components. [Diagram: nodes N1, N2, N3 with weights w12, w13 and the functions integrate, activate, reset, and learn.] Integrate: net_i = Σ_{j=1..n} x_j w_ij; V_i ← V_i + net_i. Activate: x_i = 1 / (1 + e^{−V_i}). Reset: V_i ← 0. Learn: Δw_ij = λ x_i x_j.
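The four components on this slide can be collected into one small node class. This is a minimal sketch under the slide's equations; the class and method names are illustrative.

```python
import math

# One ANN node i with a stateful membrane potential V_i, implementing the
# four functions from the slide: integrate, activate, reset, learn.

class Node:
    def __init__(self, n_inputs):
        self.w = [0.0] * n_inputs   # incoming weights w_ij
        self.V = 0.0                # membrane potential V_i
        self.x = 0.0                # activation level x_i

    def integrate(self, inputs):
        """V_i <- V_i + net_i, where net_i = sum_j x_j * w_ij."""
        self.V += sum(xj * wij for xj, wij in zip(inputs, self.w))

    def activate(self):
        """Logistic activation: x_i = 1 / (1 + e^{-V_i})."""
        self.x = 1.0 / (1.0 + math.exp(-self.V))

    def reset(self):
        """V_i <- 0."""
        self.V = 0.0

    def learn(self, inputs, lam=0.1):
        """Hebbian update: w_ij <- w_ij + lambda * x_i * x_j."""
        self.w = [wij + lam * self.x * xj
                  for wij, xj in zip(self.w, inputs)]

# Usage: balanced excitation and inhibition leave V_i at 0, so x_i = 0.5.
node = Node(2)
node.w = [0.5, -0.5]
node.integrate([1.0, 1.0])
node.activate()
```

Running the node is then a matter of calling integrate, activate, learn, and (optionally) reset once per time step, in that order.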

  23. Functional Options. [Diagram: integrate (V_i ← V_i + net_i) feeds activate (x_i = f(V_i)), with an optional reset of V_i.] Reset options: never reset V_i; spiking neuron model: reset V_i only when it exceeds a threshold; neurons without state: always reset V_i.

  24. Activation Functions: x_i = f(V_i). [Plots of five functions of V_i: step (jumps from 0 to 1 at threshold T), identity, ramp (linear above T, saturating at 1), logistic (range 0 to 1), and hyperbolic tangent (tanh, range −1 to 1).]
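The five activation functions plotted on this slide are straightforward to write out. A sketch, with the threshold T and the ramp's width as assumed parameters (the slide does not fix them):

```python
import math

# The five activation functions x_i = f(V_i) from the slide.

def step(V, T=0.0):
    """0 below threshold T, 1 at or above it."""
    return 1.0 if V >= T else 0.0

def identity(V):
    """Pass the potential through unchanged."""
    return V

def ramp(V, T=0.0, width=1.0):
    """Linear between T and T + width, clipped to [0, 1]."""
    return min(1.0, max(0.0, (V - T) / width))

def logistic(V):
    """Smooth squashing into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-V))

def tanh_act(V):
    """Smooth squashing into (-1, 1)."""
    return math.tanh(V)
```

The logistic and tanh variants are the differentiable ones, which is why they (rather than step or ramp) appear in backprop-trained networks.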

  25. Diverse Model Semantics. What does x_i represent? (1) The occurrence of a spike in the action potential; (2) the instantaneous membrane potential of a neuron; (3) the firing rate of a neuron (APs/sec); (4) the average firing rate of a neuron over a time window; (5) the difference between a neuron's current firing rate and its average firing rate.
