

  1. Deep Learning and AdS/CFT. Koji Hashimoto (Osaka U.), arXiv:1802.08313, with S. Sugishita (Osaka), A. Tanaka (RIKEN AIP), A. Tomiya (CCNU). Talks: KIAS, 26 March 2018; CQUeST, Sogang U., 29 March 2018; MIT CTP, 4 April 2018; MPI AEI, 13 April 2018; HET group, Osaka, 30 May 2018; DLAP2018 workshop, 1 June 2018.

  2. Brane (superstring theory) / Brain (neuroscience)

  3. Deep Learning: “cat”. AdS/CFT [Maldacena ’97]: CFT / AdS / black hole.

  4. 1. Formulation of AdS/DL correspondence. 2. Implementation of AdS/DL and emerging space.

  5. 1. Formulation of AdS/DL correspondence. 1-1 Solving inverse problem (AdS/CFT: quantum response from geometry [review]; deep learning: optimized sequential map [review]). 1-2 From AdS to DL. 1-3 Dictionary of AdS/DL correspondence.

  6. 1-1 Solving inverse problem. Conventional holographic modeling (no proof, no derivation): Model: classical gravity with metric g_μν in (d+1)-dim. spacetime → prediction. Experiment: quantum field theory in d-dim. spacetime (strong-coupling limit) → data. The prediction is compared with the experimental data.

  7. 1-1 Solving inverse problem. Conventional holographic modeling: model → metric g_μν → prediction → comparison with experiment data. Our deep-learning holographic modeling: experiment data → metric g_μν → model → predictions.

  8. AdS/CFT: quantum response from geometry (review) [Klebanov, Witten]. Classical scalar field theory in a (d+1)-dim. geometry: S = ∫ d^{d+1}x √(−det g) [ (∂_η φ)² − V(φ) ], with metric ds² = −f(η) dt² + dη² + g(η)(dx₁² + ··· + dx_{d−1}²). AdS boundary (η ∼ ∞): f ∼ g ∼ exp[2η/L]. Black-hole horizon (η ∼ 0): f ∼ η², g ∼ const. Solve the EoM and read off the response ⟨O⟩ to the source J, with boundary conditions: AdS boundary (η ∼ ∞): φ = J e^{−Δ₋ η} + (1/(Δ₊ − Δ₋)) ⟨O⟩ e^{−Δ₊ η}. Black-hole horizon (η ∼ 0): ∂_η φ |_{η=0} = 0.
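The exponents Δ± above follow from the standard near-boundary analysis, which the slide does not spell out: with f ∼ g ∼ e^{2η/L} the metric function h(η) ≡ ∂_η log √(f g^{d−1}) tends to d/L, so for a quadratic potential V(φ) ≃ ½ m²φ² the linearized equation of motion near η = ∞ reads

```latex
\partial_\eta^2 \phi + \frac{d}{L}\,\partial_\eta \phi - m^2 \phi \simeq 0
\quad\Rightarrow\quad \phi \propto e^{-\Delta \eta / L},\qquad
\Delta^2 - d\,\Delta - m^2 L^2 = 0,\qquad
\Delta_\pm = \frac{d}{2} \pm \sqrt{\frac{d^2}{4} + m^2 L^2}.
```

The two roots give the falloffs of the source term J and the response term ⟨O⟩ quoted above.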

  9. Deep learning: optimized sequential map (review). Layers 1, 2, …, N: x^{(a+1)}_i = ϕ( W^{(a)}_{ij} x^{(a)}_j ), output F = f_i x^{(N)}_i. ϕ(x): “activation function” (fixed nonlinear function). W^{(a)}_{ij}: “weights” (variable linear map). 1) Prepare many sets {x^{(1)}_i, F}: input + output. 2) Train the network (adjust W_{ij}) by lowering the “loss function” E ≡ Σ_data | f_i ϕ( W^{(N−1)} ϕ( ··· ϕ( W^{(1)}_{lm} x^{(1)}_m ) ) ) − F |.
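The sequential map and loss function above can be sketched in a few lines; the width, depth, and tanh activation here are illustrative choices, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def act(x):
    # "Activation function" phi(x): fixed nonlinear function (tanh here)
    return np.tanh(x)

# "Weights" W^(a)_ij: variable linear maps; 3 layers of width 4 (illustrative)
Ws = [rng.normal(size=(4, 4)) for _ in range(3)]
f = rng.normal(size=4)  # final linear readout f_i

def network(x1):
    # x^(a+1)_i = act(W^(a)_ij x^(a)_j), then F = f_i x^(N)_i
    x = x1
    for W in Ws:
        x = act(W @ x)
    return f @ x

def loss(data):
    # E = sum over data of | network(x^(1)) - F |
    return sum(abs(network(x1) - F) for x1, F in data)
```

Training then means adjusting `Ws` to lower `loss` over the prepared input/output pairs.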

  10. 1-2 From AdS to DL. Bulk EoM: ∂²_η φ + h(η) ∂_η φ − δV[φ]/δφ = 0, with metric function h(η) ≡ ∂_η log √( f(η) g(η)^{d−1} ). Discretization, Hamilton form: φ(η + Δη) = φ(η) + Δη π(η); π(η + Δη) = π(η) − Δη [ h(η) π(η) − δV(φ(η))/δφ ]. Neural-network representation: (φ, π) propagate through the layers from η = ∞ to η = 0, ending on π(η = 0).

  11. 1-2 From AdS to DL. The same network, now with the horizon condition imposed at the final layer: π|_{η=0} = 0. (φ, π) propagate through the layers from η = ∞ to η = 0.
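The discretized Hamilton-form updates above can be written as a loop over layers, one step per layer; step size and direction are left to the caller (this is a minimal sketch, not the paper's implementation):

```python
def propagate(phi, pi, hs, dV, d_eta):
    """Propagate (phi, pi) through the layers of the bulk network.

    One step per layer a, with metric value h_a = h(eta_a):
      phi(eta + d_eta) = phi(eta) + d_eta * pi(eta)
      pi(eta + d_eta)  = pi(eta) - d_eta * (h_a * pi(eta) - dV(phi(eta)))
    """
    for h_a in hs:
        phi, pi = phi + d_eta * pi, pi - d_eta * (h_a * pi - dV(phi))
    return phi, pi
```

With all `h_a = 0` and `dV = 0` this reduces to free propagation, which is a quick sanity check on the update rule.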

  12. 1-3 Dictionary of AdS/DL correspondence. AdS/CFT ↔ deep learning: emergent space ∞ > η ≥ 0 ↔ depth of layers i = 1, 2, ···, N; bulk gravity metric h(η) ↔ network weights W^{(a)}_{ij}; nonlinear response ⟨O⟩, J ↔ input data x^{(1)}_i; horizon condition ∂_η φ|_{η=0} = 0 ↔ output data F; interaction V(φ) ↔ activation function ϕ(x).

  13. 1. Formulation of AdS/DL correspondence. 1-1 Solving inverse problem (deep learning: optimized sequential map [review]; AdS/CFT: quantum response from geometry [review]). 1-2 From AdS to DL. 1-3 Dictionary of AdS/DL correspondence.

  14. 1. Formulation of AdS/DL correspondence. 2. Implementation of AdS/DL and emerging space.

  15. 2. Implementation of AdS/DL and emerging space. 2-1 Emergent geometry in deep learning. 2-2 Can AdS Schwarzschild be learned? 2-3 Emergent space from real material? 2-4 Numerical experiment summary. 2-5 Machines learn…, what do we learn?

  16. 2-1 Emergent geometry in deep learning. Experiment 1: “Can AdS Schwarzschild be learned?” 1) Use AdS Schwarzschild and generate input data. 2) Prepare a network with an unspecified metric. 3) Let the network learn the metric from the data. 4) Check whether AdS Schwarzschild is reproduced. Experiment 2: “Emergent space from real material?” 1) Use experimental data for a material, e.g. the magnetization curve of a strongly correlated material. 2), 3) (Same as above.) 4) Watch how space emerges!

  17. 2-2 Exp. 1: Can AdS Schwarzschild be learned? 1) Use AdS Schwarzschild and generate input data. 2) Prepare a network with an unspecified metric. 3) Let the network learn the metric from the data. 4) Check whether AdS Schwarzschild is reproduced. Bulk EoM: ∂²_η φ + h(η) ∂_η φ − δV[φ]/δφ = 0, with V[φ] = −φ² + (1/4) φ⁴ and h(η) = 3 coth(3η), the AdS-Schwarzschild metric in units of the AdS radius L = 1.

  18. 2-2 Exp. 1: Can AdS Schwarzschild be learned? Steps 1)–4) as above. Discretized network: φ(η + Δη) = φ(η) + Δη π(η); π(η + Δη) = π(η) − Δη [ h(η) π(η) − δV(φ(η))/δφ ]; horizon condition π|_{η=0} = 0; (φ, π) propagate through the layers from η = ∞ to η = 0.

  19. 2-2 Exp. 1: Can AdS Schwarzschild be learned? Steps 1)–4) as above. Horizon condition: [Figure: data points in the (φ input, π input) plane, labeled “true” where the horizon condition is satisfied and “false” otherwise.]
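Using the metric and potential from slide 17, boundary data can be labeled true/false by propagating down to the horizon and testing the horizon condition. The step size, layer count, and tolerance below are illustrative assumptions, not values from the slides:

```python
import numpy as np

d_eta, N = 0.1, 10                  # step size and layer count (illustrative)
etas = d_eta * np.arange(1, N + 1)  # eta grid, avoiding eta = 0

def h(eta):
    # AdS-Schwarzschild metric function h(eta) = 3 coth(3 eta), AdS radius L = 1
    return 3.0 / np.tanh(3.0 * eta)

def dV(phi):
    # V[phi] = -phi^2 + phi^4/4  =>  dV/dphi = -2*phi + phi^3
    return -2.0 * phi + phi ** 3

def to_horizon(phi, pi):
    # Step from the boundary side down toward the horizon (decreasing eta)
    for eta in etas[::-1]:
        phi, pi = phi - d_eta * pi, pi + d_eta * (h(eta) * pi - dV(phi))
    return phi, pi

def label(phi_in, pi_in, tol=0.1):
    # True if the horizon condition pi|_{eta=0} ~ 0 holds within tolerance
    return abs(to_horizon(phi_in, pi_in)[1]) < tol
```

The `label` output plays the role of the true/false classification shown in the slide's figure.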

  20. 2-2 Exp. 1: Can AdS Schwarzschild be learned? Steps 1)–4) as above. Inputs φ, π at the boundary; output condition π|_{η=0} = 0. Unspecified metric (10 layers, to be trained); data generated from AdS Schwarzschild (10,000 data points).
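A toy version of this training setup: generate boundary data from the true h(η), then fit an unspecified per-layer metric by lowering the horizon-condition loss. Finite-difference gradients are a crude stand-in for the backpropagation used in practice, and the data size here is far below the slide's 10,000 points; everything in this sketch is an illustrative assumption:

```python
import numpy as np

d_eta, N = 0.1, 10                  # 10 layers, as on the slide
etas = d_eta * np.arange(1, N + 1)

def dV(phi):
    # dV/dphi for V[phi] = -phi^2 + phi^4/4
    return -2.0 * phi + phi ** 3

def pi_at_horizon(phi, pi, hs):
    # Propagate boundary data down to the horizon; hs[a] = h(eta_a)
    for h_a in hs[::-1]:
        phi, pi = phi - d_eta * pi, pi + d_eta * (h_a * pi - dV(phi))
    return pi

def loss(hs, data):
    # Positive data should satisfy the horizon condition pi|_{eta=0} = 0
    return sum(abs(pi_at_horizon(phi, pi, hs)) for phi, pi in data)

def train(hs, data, lr=0.01, eps=1e-4, steps=50):
    # Finite-difference gradient descent on the per-layer metric values
    for _ in range(steps):
        base = loss(hs, data)
        grad = np.zeros_like(hs)
        for a in range(len(hs)):
            bumped = hs.copy()
            bumped[a] += eps
            grad[a] = (loss(bumped, data) - base) / eps
        hs = hs - lr * grad
    return hs

def make_data(hs_true, phi_horizon_values):
    # Generate positive data: start at the horizon with pi = 0,
    # propagate outward to the boundary, record (phi, pi) there
    data = []
    for phi in phi_horizon_values:
        pi = 0.0
        for h_a in hs_true:
            phi, pi = phi + d_eta * pi, pi - d_eta * (h_a * pi - dV(phi))
        data.append((phi, pi))
    return data
```

Step 4) of the experiment then amounts to comparing the trained `hs` against the true AdS-Schwarzschild values 3 coth(3η).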

  21. 2-2 Exp. 1: Can AdS Schwarzschild be learned? Steps 1)–4) as above.

  22. 2-2 Exp. 1: Can AdS Schwarzschild be learned? Steps 1)–4) as above. With a regularization.

  23. 2-1 Emergent geometry in deep learning. Experiment 1: “Can AdS Schwarzschild be learned?” (steps as above). Experiment 2: “Emergent space from real material?” 1) Use experimental data for a material, e.g. the magnetization curve of a strongly correlated material. 2), 3) (Same as above.) 4) Watch how space emerges!

  24. 2-3 Exp. 2: Emergent space from real material? 1) Use experimental data for a material, e.g. the magnetization curve of a strongly correlated material. 2), 3) (Same as above.) 4) Watch how space emerges!

  25. 2-4 Numerical experiment summary. Experiment 1: AdS Schwarzschild is successfully learned. Experiment 2: experimental data is explained by emergent space.

  26. 2-5 Machines learn…, what do we learn? Conventional holographic modeling: model → metric g_μν → prediction → comparison with experiment data. Our deep-learning holographic modeling: experiment data → metric g_μν → model → predictions.

  27. 1. Formulation of AdS/DL correspondence. 2. Implementation of AdS/DL and emerging space.
