
Limits of Numerical Approaches (FACETS) - PowerPoint PPT Presentation



  1. Neuroscientific Modeling with Large-Scale and Highly Accelerated Neuromorphic Hardware Devices: The FACETS Project. Mihai A. Petrovici, University of Heidelberg, Electronic Vision(s) Group

  2. Part I. An Introduction to the FACETS Neuromorphic Hardware

  3. Limits of numerical approaches

  4. FACETS neuromorphic hardware. Spikey (2006): 384 neurons, 10^5 synapses

  5. FACETS neuromorphic hardware. Spikey (2006): 384 neurons, 10^5 synapses. HICANN (2010): 512 neurons, 1.3 × 10^5 synapses

  6. FACETS neuromorphic hardware. Spikey (2006): 384 neurons, 10^5 synapses. HICANN (2010): 512 neurons, 1.3 × 10^5 synapses. Wafer (2011): 16 × 10^4 neurons, 4 × 10^7 synapses

  7. FACETS neuromorphic hardware. Spikey (2006): 384 neurons, 10^5 synapses. HICANN (2010): 512 neurons, 1.3 × 10^5 synapses. Wafer (2011): 16 × 10^4 neurons, 4 × 10^7 synapses. Rack (20??): 16 × 10^5 neurons, 4 × 10^8 synapses

  8. Hardware vs. biology: up to 10^5 speedup.
     Size: 10^11 neurons, 10^15 synapses (biological neural computation) vs. 10^5 neurons, 10^7 synapses (FACETS wafer-scale hardware)
     Connectivity: 10,000 synapses per neuron vs. arbitrarily configurable
     Diversity: vast range of neuron categories and parameters, multi-compartment vs. Adaptive Exponential Integrate-and-Fire neurons
     Plasticity: long term and short term, local and global vs. Short Term Plasticity and Spike Timing Dependent Plasticity
     Timing: various time constants and delays vs. adjustable time constants, but no on-wafer delays
     Scalability: modular, high bandwidth, low power, fault tolerant (hardware)

  9. Neuron model of choice. R. Naud et al.: Firing patterns in the adaptive exponential integrate-and-fire model, Biol Cybern (2008) 99:335-347. Firing patterns: tonic spiking, adaptation, initial burst, regular bursting, delayed accelerating, delayed regular bursting, transient spiking, irregular spiking
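The AdEx model named on this slide can be integrated with a few lines of forward Euler. A minimal sketch; the parameter values and the 500 pA test current are illustrative choices, not chip calibration data:

```python
import math

def adex(I_ext, T=0.5, dt=1e-5,
         C=200e-12, g_L=10e-9, E_L=-70e-3, V_T=-50e-3, D_T=2e-3,
         a=2e-9, tau_w=30e-3, b=60e-12, V_r=-58e-3, V_cut=0.0):
    """Forward-Euler sketch of the AdEx neuron:
       C dV/dt     = -g_L (V - E_L) + g_L D_T exp((V - V_T)/D_T) - w + I_ext
       tau_w dw/dt = a (V - E_L) - w
       spike: when V crosses V_cut, reset V -> V_r and jump w -> w + b."""
    V, w, spikes = E_L, 0.0, []
    for step in range(int(T / dt)):
        dV = (-g_L*(V - E_L) + g_L*D_T*math.exp((V - V_T)/D_T) - w + I_ext) / C
        dw = (a*(V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_cut:               # spike detected: reset with adaptation jump
            spikes.append(step * dt)
            V, w = V_r, w + b
    return spikes

# Adapting tonic spiking under a constant 500 pA step current:
spike_times = adex(500e-12)
```

Since b > 0 here, each spike increments the adaptation current w, so the inter-spike intervals lengthen over the run (spike-frequency adaptation).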

  10. CMOS implementation of the AdEx neuron

  11. Wafer-scale integration: a massive configuration space requires dedicated mapping tools, versatile control software, distortion analysis and compensation, and a complex emulation workflow

  12. Part II (A). Workflow: Biology-to-Hardware Mapping

  13. Modeling language

  14. Modeling language

  15. Software and hardware layers

  16. Software and hardware layers

  17. Biology-to-hardware mapping: graph model (TUD)

  18. Biology-to-hardware mapping: graph model (TUD)

  19. Hardware graph

  20. Biology-to-hardware mapping: graph model (TUD)

  21. Nforce cluster algorithm

  22. Placing optimization

  23. Mapping algorithm performance

  24. Part II (B). Workflow: Distortion Evaluation and Compensation

  25. Attractor memory schematic

  26. Spiking patterns

  27. Trajectories in voltage space

  28. Network dynamics

  29. Network dynamics.
      Motivation: hardware imperfections; nonisomorphic simulation/emulation environments (e.g. neuron model, digitized weights, ...); mapping/routing losses; robustness is an essential characteristic of biological neural networks, enabling hardware-independent research.
      Relevant parameters. Model-specific: STP, adaptation, delays, synaptic weights. Model-independent: number of MC per HC, number of HC, total number of MC (i.e. network size), neuron loss, synapse loss.

  30. The importance of STP: with STP (Poisson input: 4 kHz); without STP (Poisson input: 1 kHz)

  31. The importance of adaptation and delays. Mean firing rate in the ON state: +adaptation/+delays: 30 Hz; +adaptation/-delays: 28 Hz; -adaptation/-delays: 116 Hz

  32. Dwell times and neuron loss

  33. Synapse loss: 0% loss, 10% loss, 25% loss, 40% loss

  34. Dwell times and synapse loss: 0%, 5%, 10%, 20%

  35. Firing rates and synapse loss

  36. Network scaling. Relevant parameters. Model-specific: STP, adaptation, delays, synaptic weights. Model-independent: number of MC per HC, number of HC, total number of MC (i.e. network size), neuron loss, synapse loss. Scaling may influence network behavior!

  37. Network scaling. Relevant parameters as on slide 36; scaling may influence network behavior! Scaling through modification of connection probabilities (hypercolumn/minicolumn schematic).
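One simple reading of "scaling through modification of connection probabilities" is an in-degree-preserving rule: when the number of presynaptic partners changes, rescale the probability so the expected number of inputs per neuron stays fixed. This is my illustrative interpretation, not the FACETS mapping code:

```python
def scaled_connection_probability(p_ref, n_ref, n_new, p_max=1.0):
    """Scale a connection probability so that the expected in-degree
    (probability times number of presynaptic partners) is preserved
    when the source population is resized; clip at p_max."""
    p = p_ref * n_ref / n_new
    return min(p, p_max)

# Halving the number of source minicolumns doubles the probability:
p_small = scaled_connection_probability(0.2, 100, 50)
# ...until the clip at p_max = 1.0 kicks in for aggressive downscaling:
p_clipped = scaled_connection_probability(0.9, 100, 50)
```

The clip at p_max is one reason downscaled networks can deviate from the original: once the probability saturates, the expected in-degree can no longer be preserved.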

  38. Scaling and robustness: 3 HC, 3 MC; 0% synapse loss vs. 20% synapse loss

  39. Scaling and robustness: 9 HC, 9 MC; 0% synapse loss vs. 20% synapse loss

  40. Pattern completion: stored images

  41. Spontaneous pattern generation

  42. Pattern completion: small distortion (input image)

  43. Pattern completion: large distortion (input image)

  44. Pattern completion: two patterns (input image)

  45. Pattern completion: a more biological approach

  46. Synfire chain schematic: exc, 100 regular spiking neurons; inh, 25 fast spiking neurons (same parameters in our model)

  47. Synfire chain simulations

  48. Synapse loss

  49. The problem of limited input: only 64 external inputs with max. 100 Hz per channel must supply 192 neurons, each of which requires 4000 Hz of independent Poisson input
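The slide's numbers already force heavy input sharing; a back-of-the-envelope check (the inclusion-exclusion bound is my addition, not on the slide):

```python
import math

n_channels, rate_per_channel = 64, 100.0   # hardware limit: 64 inputs, 100 Hz each
target_rate = 4000.0                       # Poisson rate the model wants per neuron

# Each neuron must tap at least ceil(4000 / 100) = 40 of the 64 channels,
n_min = math.ceil(target_rate / rate_per_channel)
# so by inclusion-exclusion any two neurons share at least 2*40 - 64 channels.
k_min = max(0, 2 * n_min - n_channels)
```

With at least 16 of 40 channels shared between any neuron pair, input correlations are unavoidable, which is exactly what slides 50-53 set out to quantify.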

  50. The problem of limited input. Problem I: how to quantify and predict the correlations which arise from shared inputs? Problem II: given a limited set of input channels and a minimum requirement for inputs per neuron, can we find a corresponding mapping?

  51. Single neuron behavior: The Load Function
      \mathcal{L}(t) = \sum_{\mathrm{spikes}\ i} w_i \exp\!\left(-\frac{t - t_i}{\tau}\right), \quad \tau = \max(\tau_{\mathrm{syn}}, \tau_{\mathrm{mem}})
      The neuron fires if \mathcal{L}(t) \geq \mathcal{L}_{\mathrm{thresh}}.
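The load function on this slide transcribes directly into code (the function signature and the example spike train are mine):

```python
import math

def load(t, spike_times, weights, tau_syn, tau_mem):
    """L(t) = sum over past spikes i of w_i * exp(-(t - t_i)/tau),
    with tau = max(tau_syn, tau_mem); the neuron fires once L >= L_thresh."""
    tau = max(tau_syn, tau_mem)
    return sum(w * math.exp(-(t - t_i) / tau)
               for t_i, w in zip(spike_times, weights) if t_i <= t)

# Two unit-weight spikes with tau = 5 ms: at t = 10 ms the first contribution
# (t_i = 0) has decayed to exp(-2) while the second is at full strength.
L = load(0.010, [0.0, 0.010], [1.0, 1.0], tau_syn=0.002, tau_mem=0.005)
```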

  52. Statistical treatment of neural activity
      Gaussian distribution: \mathcal{L} \sim \mathcal{N}(\mu, \sigma^2); for example, two channels, shared and private: \mathcal{L} = \mathcal{L}_s + \mathcal{L}_p
      Single neuron: P_A(a) = \int P_s(x) \, P_p(a - x) \, dx
      Two neurons sharing inputs: P_{AB}(a, b) = \int P_s(x) \, P_p(a - x) \, P_p(b - x) \, dx  (multivariate normal distributions)
      Numerical integration: P(A, B) = \int_{\mathcal{L}_{\mathrm{thresh}}}^{\infty} \int_{\mathcal{L}_{\mathrm{thresh}}}^{\infty} P_{AB}(a, b) \, da \, db
      Conditional probability: P(A \mid B) = P(A, B) / P(B)
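The integrals on this slide can be checked numerically. A sketch under the slide's Gaussian shared/private decomposition; the means, variances, threshold, and midpoint-rule grid are illustrative assumptions:

```python
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5*((x - mu)/sigma)**2) / (sigma*math.sqrt(2*math.pi))

def gauss_tail(x, mu, sigma):
    """P(X > x) for X ~ N(mu, sigma^2)."""
    return 0.5*(1.0 - math.erf((x - mu)/(sigma*math.sqrt(2.0))))

def p_joint(mu_s, sig_s, mu_p, sig_p, L_th, n=4000, span=8.0):
    """P(A, B): both loads L = L_s + L_p exceed L_th, where L_s is shared by
    the pair and the two private parts are independent.  Conditioning on the
    shared load L_s = x:
        P(A, B) = integral of P_s(x) * P(L_p > L_th - x)^2 dx  (midpoint rule)
    """
    lo, hi = mu_s - span*sig_s, mu_s + span*sig_s
    dx = (hi - lo) / n
    return sum(gauss_pdf(lo + (i + 0.5)*dx, mu_s, sig_s)
               * gauss_tail(L_th - (lo + (i + 0.5)*dx), mu_p, sig_p)**2
               for i in range(n)) * dx

# Shared input induces positive correlation: P(A,B) > P(A)P(B).
p_b = gauss_tail(1.0, 0.0, math.sqrt(2.0))   # marginal: L ~ N(0, 2), threshold 1
p_ab = p_joint(0.0, 1.0, 0.0, 1.0, 1.0)
p_a_given_b = p_ab / p_b                     # conditional probability P(A | B)
```

The key sanity check is that P(A | B) exceeds the marginal P(A) whenever the shared variance is nonzero.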

  53. Symmetric Uncertainty
      SU(X, Y) = \frac{2 \, I(X; Y)}{H(X) + H(Y)}, \quad I(A; B) = \sum_{A \in \{0,1\}} \sum_{B \in \{0,1\}} p(A, B) \log \frac{p(A, B)}{p(A) \, p(B)}
      Features: symmetric in X and Y; normalized to SU \in [0, 1], which allows comparison over a wide range of spike train parameters; no free parameters; pure information theory, hence highly general; measures more than just synchrony.
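The symmetric uncertainty is straightforward to compute from the 2x2 joint distribution of the binary firing events A, B in {0, 1}; a small self-contained sketch (function names are mine):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def symmetric_uncertainty(joint):
    """SU(X, Y) = 2 I(X; Y) / (H(X) + H(Y)) for a joint distribution given
    as a nested list joint[x][y] over the binary events {0, 1}."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    h_x, h_y = entropy(px), entropy(py)
    h_xy = entropy([p for row in joint for p in row])
    mi = h_x + h_y - h_xy                             # mutual information
    return 2.0 * mi / (h_x + h_y)

# Perfectly coupled firing gives SU = 1; independent firing gives SU = 0.
su_coupled = symmetric_uncertainty([[0.5, 0.0], [0.0, 0.5]])
su_indep = symmetric_uncertainty([[0.25, 0.25], [0.25, 0.25]])
```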

  54. Partial derivatives. Parameters: V_thresh = -55 mV, V_rest = -59 mV, tau_mem = 5 ms, w_exc = 0.5 nS, simtime = 20 s

  55. The mapping problem
      Definitions: N (= 64/2) total inputs; M (= 192) total outputs; n inputs per neuron; k common inputs per neuron pair; k_max maximum common inputs per neuron pair.
      Task: for given N and n, minimize k while keeping M >= 192; or, for given N, n (large) and k_max (small), can we find enough subsets (M)?
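The subset question on this slide can be prototyped with a greedy brute force: enumerate candidate input sets and keep those whose overlap with every already-chosen set stays below the bound. This is an illustrative sketch, feasible only for toy sizes, not the actual mapping tool:

```python
from itertools import combinations

def greedy_input_assignment(N, n, k_max, M_target):
    """Greedily collect n-element subsets of range(N) whose pairwise overlap
    is at most k_max, stopping after M_target subsets (or when the candidate
    pool is exhausted).  Brute force: only viable for small N and n."""
    chosen = []
    for cand in combinations(range(N), n):
        s = set(cand)
        if all(len(s & c) <= k_max for c in chosen):
            chosen.append(s)
            if len(chosen) == M_target:
                break
    return chosen

# Toy instance: 8 channels, 3 inputs per neuron, at most 1 shared input.
subsets = greedy_input_assignment(8, 3, 1, 5)
```

At realistic scale the candidate pool explodes combinatorially, which is why the slide frames this as an open optimization problem rather than an enumeration.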
