
Laterally Connected Lobe Component Analysis: Precision and Topography



  1. Laterally Connected Lobe Component Analysis: Precision and Topography. Matt Luciw and Juyang Weng, Embodied Intelligence Laboratory, Department of Computer Science and Engineering, Michigan State University, East Lansing, MI

  2. Enabling Emergent Internal Representation • Internal representation for non-symbolic agents • Emergent: develops from experience • Efficient: utilizes limited resources • Effective: leads to good performance • How must learning mechanisms adapt throughout the time course of development?

  3. Cortex-Inspired Multilayer Two-Way Networks • Level of modeling (figure adapted from Kandel, Schwartz and Jessell, 2000) • Firing-rate model • Explicitly weighted, connected neurons • Hebbian learning (LCA) • Pathway development • Hierarchical, layered networks from sensors to motors • Sensors: pixels from a camera • Motors: control actions and behavior (figure: visuomotor pathways in cortex) • Three types of connections: 1. bottom-up, 2. top-down, 3. lateral

  4. Context of This Work • Optimal synaptic weight learning: LCA (Weng and Zhang, WCCI 2006; Weng and Luciw, TAMD) • Top-down connections: class-based grouping (Luciw and Weng, WCCI 2008) • Top-down connections and time: almost-perfect recognition of centered objects (Luciw and Weng, ICDL 2008) • This work: extend LCA and MILN with adaptive lateral excitatory connections

  5. Related Work: Lateral Connections • SOM (Kohonen): isotropic updating; scope and learning rates manually tuned • LISSOM (Miikkulainen et al.): explicit lateral excitatory and inhibitory connections; learning rates manually tuned • MILN (Weng et al.): "growing cortex" (scheduled growth times); optimal LCA with automatic tuning of learning rates • LC-LCA within MILN (this work): explicit lateral excitatory connections, using the optimal LCA framework and including top-down connections

  6. Motor and Somatosensory Organization • Primary cortical areas are organized topographically.

  7. FFA and PPA Areas (Tootell et al., "fMRI mapping of a morphed continuum of 3D shapes within inferior temporal cortex," 2007)

  8. V1 Connectivity (Buzas et al., "Model-based analysis of excitatory lateral connections in visual cortex," 2006)

  9. Lateral Excitation: Conflicting Criteria • Early stages: the brain must organize more globally, which is critical for generalization with limited connections; mechanism: isotropic updating • Later stages: the brain must fine-tune its representation, which is critical for superior performance • Can one lateral-excitation mechanism satisfy both? Isotropic updating is organized but not precise; neurons that do not excite (interfere with) one another are precise but not organized • Solution: adaptive lateral connections (figure from Weng, Luwang, Lu and Xue, 2007)

  10. LC-LCA Algorithm

  11. Network Computation 1. Neurons compute: each neuron's pre-response combines its bottom-up, top-down, and lateral inputs (see the sketch below).
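
A minimal sketch of the pre-response computation, assuming each neuron mixes normalized inner products of its bottom-up, top-down, and lateral weight vectors with the corresponding inputs; the function name and the alpha/beta/gamma coefficients are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def pre_response(x_bottom, x_top, x_lat, Wb, Wt, Wl,
                 alpha=0.6, beta=0.3, gamma=0.1):
    """Pre-response of every neuron in a layer: a fixed mix of the
    cosine similarities between each connection type's weight matrix
    (one row per neuron) and its input vector (bottom-up stimulus,
    top-down motor firing, lateral firing of the same layer)."""
    def cosine(W, v):
        # normalized inner product of each weight row with input v
        vn = v / (np.linalg.norm(v) + 1e-12)
        Wn = W / (np.linalg.norm(W, axis=1, keepdims=True) + 1e-12)
        return Wn @ vn
    return (alpha * cosine(Wb, x_bottom)
            + beta * cosine(Wt, x_top)
            + gamma * cosine(Wl, x_lat))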

  12. Lateral Inhibition 2. Neurons compete: 1. Rank neuron pre-responses 2. The top-k are scaled and will update 3. Others do not fire • An efficient approximation that replaces iterative lateral dynamics • Simplifies the dynamics, though it is not biologically plausible • Sparse coding emerges (see the sketch below)
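
A minimal sketch of the top-k competition, assuming the usual LCA-style linear scaling of winner responses between the strongest and the (k+1)-th pre-response; the exact scaling used in the paper may differ.

import numpy as np

def top_k_competition(z, k):
    """Efficient approximation of lateral inhibition: rank the
    pre-responses z (a numpy array), let only the top-k neurons fire,
    and scale their responses linearly into (0, 1]; all others stay
    at zero."""
    order = np.argsort(z)[::-1]                      # ranked by pre-response
    y = np.zeros_like(z, dtype=float)
    z_best = z[order[0]]                             # strongest pre-response
    z_cut = z[order[k]] if k < z.size else z.min()   # (k+1)-th value: cutoff
    denom = z_best - z_cut
    if denom <= 0.0:
        denom = 1.0                                  # guard against ties
    winners = order[:k]
    y[winners] = (z[winners] - z_cut) / denom        # winners scaled
    return y

The result is sparse: at most k nonzero responses per layer per stimulus, which is the sparse coding the slide refers to.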

  13. LCA Updating 3. Optimal Hebbian Learning • Optimal adaptation of the winners: Hebbian adaptation given the stimulus (pre-synaptic activity) and the firing (post-synaptic activity) • Learning rates are automatically tuned • Each neuron converges to the principal component of its observations • Minimizes representational error in the mean-square sense (see the sketch below)
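
A sketch of the optimal (CCI) LCA update with the amnesic mean, following Weng and Zhang's published formulation; the breakpoint parameters t1, t2, c, r are illustrative defaults rather than the values used in these experiments.

def amnesic(n, t1=20.0, t2=200.0, c=2.0, r=2000.0):
    """Amnesic-mean function mu(n): zero early on (plain averaging),
    then growing with firing age n so recent inputs keep a nonzero
    learning rate. Parameter values here are assumptions."""
    if n <= t1:
        return 0.0
    if n <= t2:
        return c * (n - t1) / (t2 - t1)
    return c + (n - t2) / r

def lca_update(v, n, x, y):
    """Update a winner's weight vector v toward its response-weighted
    input y * x. n is the neuron's firing age; the retention and
    learning rates are derived from n rather than hand-tuned, so v
    converges to the amnesic average of the y * x observations."""
    mu = amnesic(n)
    w_retain = (n - 1.0 - mu) / n
    w_learn = (1.0 + mu) / n
    return w_retain * v + w_learn * y * x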

  14. LCA Weight Development • Neuron weights are roughly the expectation of their response-weighted input • For lateral weights: the weight from neuron j to neuron i is updated when neuron i fires (see the sketch below)
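
Under that reading, a sketch of lateral-weight development: a firing neuron's lateral weight vector becomes the amnesic average of the layer's response vectors, weighted by its own firing. This reuses lca_update from the sketch above; taking the layer's own firing vector as the lateral input is an assumption consistent with the slide.

import numpy as np

def lateral_update(Wl, ages, y):
    """When neuron i fires (y[i] > 0), pull row Wl[i] toward the
    layer's response vector y, weighted by i's own firing y[i];
    non-firing neurons keep their lateral weights unchanged."""
    for i in np.flatnonzero(y > 0):
        ages[i] += 1                            # firing age of neuron i
        Wl[i] = lca_update(Wl[i], ages[i], y, y[i])
    return Wl, ages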

  15. Previous Method: 3x3 Updating • Some neurons converge to specialize in low-probability parts of the input space
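
For contrast, a minimal sketch of the earlier isotropic scheme on a 2D neuron grid (the flat-index layout is an assumption): the winner and all of its 3x3 neighbors are pulled toward the current input regardless of their own responses, which is what lets some neurons drift toward low-probability regions.

def neighborhood_3x3(winner, rows, cols):
    """Flat indices of the winning neuron and its 3x3 grid neighbors;
    under isotropic updating every one of them is updated toward the
    input, whether or not it responded."""
    r, c = divmod(winner, cols)
    return [rr * cols + cc
            for rr in range(max(0, r - 1), min(rows, r + 2))
            for cc in range(max(0, c - 1), min(cols, c + 2))]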

  16. Effect of Adaptive Lateral Connections: Intuitive (?) Example (figure panels: no lateral excitation, unbalanced; isotropic updating, balanced but wasteful; adaptive)

  17. Relative Levels of Modeling (here + marks excitatory connections, - inhibitory) • Bottom-up connections (+): level l • Top-down connections (+): level l • Lateral connections (-): higher than level l • Before: lateral connections (+) were modeled higher than level l • Now: lateral connections (+) are modeled at level l

  18. Experiments • MSU 25-Objects Dataset: 25 classes • 200 images per class, with 3D rotation • 4/5 training data, 1/5 testing data • Grayscale

  19. Network Setup • (Left) Training (learning), with imposed motor output • (Below) Testing, with communication of top firing

  20. Experiment Setup • 5 trials for each test • Each trial trained on 25,000 image/label pairs • Error measured on disjoint testing samples • Developmental scheduling: no adaptation of lateral weights for the first 500 time steps; the number of winners (K) is scheduled (see the sketch below) • Compared: 3x3 with top-down; 3x3 without top-down; LC-LCA with top-down; LC-LCA without top-down
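
The slide states that the number of winners K is scheduled over development but does not give the schedule, so every breakpoint in this sketch is an assumption: broad activation early, annealed toward sparse, precise firing.

def scheduled_k(t, k0=8, k_final=1, t0=500, t1=10000):
    """Illustrative schedule for K, the number of winners, as a
    function of training time t: hold k0 during early organization,
    then decay linearly to k_final for precise, sparse firing."""
    if t < t0:
        return k0
    if t >= t1:
        return k_final
    frac = (t - t0) / (t1 - t0)
    return max(k_final, round(k0 - frac * (k0 - k_final)))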

  21. Results: Recognition Rate

  22. Results: Neuronal Entropy • If a neuron's entropy is zero, it fires only for samples from a single class (see the sketch below).
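
A minimal sketch of the entropy measure, assuming it is the Shannon entropy of each neuron's class-conditional firing distribution, which matches the slide's reading that zero entropy means the neuron fires for one class only.

import numpy as np

def neuronal_entropy(firing_counts):
    """Shannon entropy (bits) of one neuron's firing distribution over
    classes; firing_counts[c] counts how often the neuron fired for
    samples of class c. Zero entropy means a single firing class."""
    p = np.asarray(firing_counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                     # convention: 0 * log(0) = 0
    return float(-(p * np.log2(p)).sum())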

  23. Topographic Organization Comparison

  24. Comparison of Updating Methods: LCA vs. SOM vs. LISSOM (update equations below)
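
A reconstruction from the standard published forms of the three update rules (not necessarily the slide's exact notation):

\begin{align*}
\text{SOM:}    \quad & w_i(t+1) = w_i(t) + \alpha(t)\, h_{c,i}(t)\,\bigl[x(t) - w_i(t)\bigr] \\
\text{LISSOM:} \quad & w_{ij}' = \frac{w_{ij} + \alpha\, x_j\, \eta_i}{\sum_k \left( w_{ik} + \alpha\, x_k\, \eta_i \right)} \\
\text{LCA:}    \quad & v_j(t) = \frac{n - 1 - \mu(n)}{n}\, v_j(t-1) + \frac{1 + \mu(n)}{n}\, y_j(t)\, x(t)
\end{align*}

SOM and LISSOM require a hand-tuned learning rate alpha(t), while LCA's retention and learning rates follow from the firing age n via the amnesic mean mu(n), which is the slide's point of comparison.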

  25. Results: Comparison of Updating Methods

  26. Conclusion • Laterally Connected Lobe Component Analysis • Smoothness and precision are conflicting criteria • Mitigated through adaptive lateral connections and developmental scheduling • Integrated networks with bottom-up, lateral, and top-down connections • LCA's optimal update leads to more stable performance • Future directions: locally connected, laterally connected networks; adaptive lateral connections in what/where networks

