

  1. Fundamentals of Computational Neuroscience 2e, January 1, 2010. Chapter 10: The cognitive brain

  2. Hierarchical maps and attentive vision
  [Figure A: the ventral visual pathway, from LGN (thalamus) through V1, V2, and V4 (occipital cortex) to posterior and inferior temporal cortex, plotted as receptive field size (deg) against eccentricity (deg), both on scales from 1.3 to 50 deg. Figure B: layered cortical maps, layers 1-4.]

  3. Attention in visual search and object recognition
  Visual search (WHERE). Given: particular features (target object). Function: scanning (the attentional window scans the entire scene).
  Object recognition (WHAT). Given: particular spatial location (target position). Function: binding (the attentional window binds features for identification).
  Gustavo Deco

  4. Model
  [Figure: hierarchical model of attentive vision. The visual field feeds through LGN into V1, where Gabor jets perform feature extraction; V1 projects to V4 and on to an IT pool for object recognition (the "what" pathway), while PP codes spatial location and the preferred attentional locus (the "where" pathway). Each area has an inhibitory pool; top-down biases are object-specific (to IT) and location-specific (to PP).]
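The "Gabor jets" front end can be made concrete with a small filter bank. The sketch below is a generic NumPy Gabor implementation with illustrative parameters (size, wavelength, sigma, the name gabor_kernel), not the model's actual feature extractor:

  import numpy as np

  def gabor_kernel(size=11, wavelength=4.0, theta=0.0, sigma=2.5):
      ax = np.arange(size) - size // 2
      x, y = np.meshgrid(ax, ax)
      xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
      yr = -x * np.sin(theta) + y * np.cos(theta)
      envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))  # Gaussian envelope
      carrier = np.cos(2 * np.pi * xr / wavelength)         # sinusoidal carrier
      return envelope * carrier

  # A 'jet' = responses of filters at several orientations at one location.
  jet = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]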

  5. Example results
  [Figure: PP activity as a function of time for A, 'parallel search' and B, 'serial search', with the number of items in the display varied from 1 to 3.]

  6. The interconnecting workspace hypothesis
  [Figure: a global workspace connecting an evaluative system (VALUE), an attentional system (FOCUSING), long-term memory (PAST), the perceptual system (PRESENT), and the motor system (FUTURE).]
  Stanislas Dehaene, Michel Kerszberg, and J.-P. Changeux, PNAS 1998.

  7. Stroop task modelling
  [Figure: workspace neurons, driven by reward (error signal) and vigilance, exert attentional amplification and suppression over the specialized word and colour processors; inputs and outputs cover the word, the colour, and the naming response. Example: the word "grey" printed in black ink, with correct colour-naming response "black".]

  8. The anticipating brain
  1. The brain can develop a model of the world, which can be used to anticipate or predict the environment.
  2. The inverse of the model can be used to recognize causes by evoking internal concepts.
  3. Hierarchical representations are essential to capture the richness of the world.
  4. Internal concepts are learned through matching the brain's hypotheses with input from the world.
  5. An agent can learn actively by testing hypotheses through actions.
  6. The temporal domain is an important degree of freedom.

  9. Outline
  [Figure: agent-environment loop. The environment carries the external states; the agent comprises PNS sensation and action and CNS internal states, coupled by conditional distributions such as p(s|c,a), p(s|s,c), p(a|c), p(c|s,c), p(c|c,c), p(a|a,s), and p(a|s,c).]

  10. Recurrent networks with hidden nodes
  The Boltzmann machine: a recurrent network of visible and hidden nodes.
  Energy: $H^{nm} = -\frac{1}{2} \sum_{ij} w_{ij} s_i^n s_j^m$
  Probabilistic update: $p(s_i^{n+1} = +1) = \frac{1}{1 + \exp(-\beta \sum_j w_{ij} s_j^n)}$
  Boltzmann-Gibbs distribution: $p(s^v; w) = \frac{1}{Z} \sum_{m \in h} \exp(-\beta H^{vm})$
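As a concrete illustration, here is a minimal NumPy sketch of these dynamics, assuming +/-1 units, symmetric weights with zero self-connections, and inverse temperature beta; the names (gibbs_step, energy) are illustrative, not from the book's code:

  import numpy as np

  rng = np.random.default_rng(0)

  def energy(w, s):
      # H = -1/2 * sum_ij w_ij s_i s_j
      return -0.5 * s @ w @ s

  def gibbs_step(w, s, beta=1.0):
      # Probabilistic update: p(s_i = +1) = 1 / (1 + exp(-beta * sum_j w_ij s_j))
      for i in range(len(s)):
          p_plus = 1.0 / (1.0 + np.exp(-beta * w[i] @ s))
          s[i] = 1 if rng.random() < p_plus else -1
      return s

  # Tiny demo: 5 units with random symmetric weights, 100 Gibbs sweeps.
  n = 5
  w = rng.normal(size=(n, n))
  w = (w + w.T) / 2
  np.fill_diagonal(w, 0.0)
  s = rng.choice([-1, 1], size=n)
  for _ in range(100):
      s = gibbs_step(w, s)
  print("sample:", s, "energy:", energy(w, s))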

  11. Training the Boltzmann machine
  Kullback-Leibler divergence:
  $KL(p(s^v), p(s^v; w)) = \sum_{s^v} p(s^v) \log \frac{p(s^v)}{p(s^v; w)} = \sum_{s^v} p(s^v) \log p(s^v) - \sum_{s^v} p(s^v) \log p(s^v; w)$
  Minimizing KL is equivalent to maximizing the average log-likelihood function
  $l(w) = \sum_{s^v} p(s^v) \log p(s^v; w) = \langle \log p(s^v; w) \rangle .$
  Gradient descent → Boltzmann learning:
  $\Delta w_{ij} = \eta \frac{\partial l}{\partial w_{ij}} = \frac{\eta \beta}{2} \left( \langle s_i s_j \rangle_{\mathrm{clamped}} - \langle s_i s_j \rangle_{\mathrm{free}} \right).$
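Continuing the sketch above (reusing np, rng, and gibbs_step), here is a hedged sketch of the Boltzmann learning rule for the special case of a fully visible machine, so the clamped statistics come directly from the data; with hidden units one would also sample the hidden states while the visibles are clamped:

  def boltzmann_learning(w, data, eta=0.05, beta=1.0, n_samples=200):
      # <s_i s_j>_clamped: correlations under the data distribution.
      clamped = np.mean([np.outer(s, s) for s in data], axis=0)
      # <s_i s_j>_free: correlations under the model, estimated by Gibbs sampling.
      s = rng.choice([-1, 1], size=w.shape[0])
      free = np.zeros_like(w)
      for _ in range(n_samples):
          s = gibbs_step(w, s, beta)
          free += np.outer(s, s)
      free /= n_samples
      # Delta w_ij = (eta * beta / 2) * (<s_i s_j>_clamped - <s_i s_j>_free)
      w += 0.5 * eta * beta * (clamped - free)
      np.fill_diagonal(w, 0.0)   # keep self-connections at zero
      return w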

  12. The restricted Boltzmann machine
  A bipartite network of hidden and visible nodes, with no connections within a layer.
  Contrastive Hebbian learning: alternating Gibbs sampling, t = 1, 2, 3, ..., ∞.
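A common practical realization of this scheme is contrastive divergence (CD-1), which truncates the alternating Gibbs chain after one reconstruction step. Below is a minimal self-contained sketch, assuming binary 0/1 units and omitting bias terms for brevity; the names (cd1_update, W, v0) are illustrative:

  import numpy as np

  rng = np.random.default_rng(1)

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  def cd1_update(W, v0, eta=0.1):
      # Up pass: sample hidden units given the data (t = 1).
      h0 = (rng.random(W.shape[1]) < sigmoid(v0 @ W)).astype(float)
      # Down-up pass: reconstruct visibles, then recompute hiddens (t = 2).
      v1 = (rng.random(W.shape[0]) < sigmoid(W @ h0)).astype(float)
      h1 = sigmoid(v1 @ W)   # probabilities suffice for the negative statistics
      # Contrastive update: data correlations minus reconstruction correlations.
      W += eta * (np.outer(v0, h0) - np.outer(v1, h1))
      return W

  # Example: one update on a random 6-pixel binary pattern with 3 hidden units.
  W = rng.normal(scale=0.1, size=(6, 3))
  v = (rng.random(6) < 0.5).astype(float)
  W = cd1_update(W, v)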

  13. Deep generative models
  [Figure: a stack of RBM (or RBM/Helmholtz) layers between an image input at the model retina and a concept input at the top, with recognition readout and stimulation.]
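Such stacks are typically built greedily, each RBM being trained on the hidden activities of the layer below. Continuing the RBM sketch above (reusing np, rng, sigmoid, and cd1_update), a hedged sketch under those assumptions:

  def train_stack(data, layer_sizes, eta=0.1, epochs=10):
      # Greedy layer-wise training: fit one RBM, then propagate the data up.
      weights, x = [], data
      for n_hidden in layer_sizes:
          W = rng.normal(scale=0.01, size=(x.shape[1], n_hidden))
          for _ in range(epochs):
              for v in x:
                  W = cd1_update(W, v, eta)
          weights.append(W)
          x = sigmoid(x @ W)   # hidden activities become the next layer's data
      return weights

  # Example: stack two layers on random binary data (placeholder for images).
  data = (rng.random((20, 6)) < 0.5).astype(float)
  weights = train_stack(data, [4, 3])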

  14. Adaptive Resonance Theory (ART)
  [Figure: ART architecture with an attentional subsystem (layers F1 and F2 linked by bottom-up weights w^b and top-down weights w^t, plus gain control) and an orienting subsystem that compares the input against the vigilance parameter ρ and sends a reset signal to F2 on mismatch.]
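A minimal ART-1-style sketch of the search-reset-resonance cycle, assuming binary inputs, fast learning, and vigilance rho; this simplifies the full Carpenter-Grossberg architecture (gain control and the F1 dynamics are omitted, and category choice is reduced to overlap ranking):

  import numpy as np

  def art1_present(inp, templates, rho=0.7):
      # Search F2 categories, best bottom-up match first.
      order = sorted(range(len(templates)),
                     key=lambda j: -(templates[j] & inp).sum())
      for j in order:
          match = (templates[j] & inp).sum() / max(inp.sum(), 1)
          if match >= rho:          # resonance: vigilance test passed
              templates[j] &= inp   # fast learning: intersect template with input
              return j
          # else: reset from the orienting subsystem; try the next category
      templates.append(inp.copy())  # no resonance: recruit a new category
      return len(templates) - 1

  # Demo: the second pattern resonates with category 0 and prunes its template.
  templates = []
  a = np.array([1, 1, 1, 0])
  b = np.array([1, 1, 0, 0])
  print(art1_present(a, templates))             # -> 0 (new category)
  print(art1_present(b, templates, rho=0.6))    # -> 0 (template becomes [1,1,0,0])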

  15. Further Readings
  Edmund T. Rolls and Gustavo Deco (2001), Computational Neuroscience of Vision, Oxford University Press.
  Karl Friston (2005), A theory of cortical responses, Philosophical Transactions of the Royal Society B 360: 815-836.
  Jeff Hawkins with Sandra Blakeslee (2004), On Intelligence, Henry Holt and Company.
  Robert Rosen (1985), Anticipatory Systems: Philosophical, Mathematical and Methodological Foundations, Pergamon Press.
  Geoffrey E. Hinton (2007), Learning multiple layers of representation, Trends in Cognitive Sciences 11: 428-434.
  Stephen Grossberg (1976), Adaptive pattern classification and universal recoding: Feedback, expectation, olfaction, and illusions, Biological Cybernetics 23: 187-202.
  Gail Carpenter and Stephen Grossberg (1987), A massively parallel architecture for a self-organizing neural pattern recognition machine, Computer Vision, Graphics and Image Processing 37: 54-115.
  Daniel S. Levine (2000), Introduction to Neural and Cognitive Modeling, 2nd edition, Lawrence Erlbaum.
  James A. Freeman (1994), Simulating Neural Networks with Mathematica, Addison-Wesley.
