Machine learning for bounce calculation
Ryusuke Jinno (IBS-CTPU), based on 1805.12153

  1. Machine learning for bounce calculation Ryusuke Jinno (IBS-CTPU) Based on 1805.12153 2018/12/10 @ ICTP workshop on machine learning landscape 01 / 29

  2. SELF INTRODUCTION Ryusuke ( 隆介 ) Jinno ( 神野 )
     - 2016/3 : Ph.D. @ Univ. of Tokyo
     - 2016/4-8 : KEK, Japan
     - 2016/9- : IBS-CTPU, Korea
     - 2019/4- : DESY, Germany (planned)
     Research interests & recent works
     - Machine learning : application of machine learning to the QFT tunneling problem
     - Gravitational waves : analytic approach to GW production in phase transitions
     - (P)reheating : preheating in Higgs inflation (discovery of the "spike preheating" phenomenon)
     - Inflation : hillclimbing inflation (an inflationary attractor); hillclimbing Higgs inflation (a new realization of Higgs inflation with the hillclimbing scheme)
     Ryusuke Jinno 02 / 29 1805.12153

  4. TALK PLAN 1. Machine learning : lightning introduction 2. Machine learning meets tunneling problem in QFT 3. Data taking / Machine setup / Training process / Results 4. Summary Ryusuke Jinno / 29 1805.12153

  5. MACHINE LEARNING: LIGHTNING INTRODUCTION Ryusuke Jinno [ https://www.slideshare.net/awahid/big-data-and-machine-learning-for-businesses ] 03 / 29 1805.12153

  6. MACHINE LEARNING: LIGHTNING INTRODUCTION Can I apply this technique to problems in high-energy physics? Ryusuke Jinno [ https://www.slideshare.net/awahid/big-data-and-machine-learning-for-businesses ] 03 / 29 1805.12153

  7. MACHINE LEARNING: LIGHTNING INTRODUCTION Terminology?
     - Artificial intelligence (AI) : machines that can perform tasks that are characteristic of human intelligence [ J. McCarthy ]
     - Machine learning (ML) : a way of achieving AI, learning without being explicitly programmed
     - Neural network (NN) : machine learning with artificial neurons ( → later)
     - Deep learning / deep neural network : neural network with deep (= many) layers of neurons
     Note : the speaker's major is not machine learning !!
     e.g. [ https://en.wikipedia.org/wiki/Artificial_intelligence ]
     [ https://medium.com/iotforall/the-difference-between-artificial-intelligence-machine-learning-and-deep-learning-3aa67bff5991 ]
     Ryusuke Jinno 04 / 29 1805.12153

  8. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Linear problem
     Your data : x_1 = # of "discount" in the email, x_2 = # of "luxury" in the email, each point labeled spam or not spam
     Question : find a, b such that x_2 = a x_1 + b is the boundary between spam & not spam
     Ryusuke Jinno 05 / 29 1805.12153

  9. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Linear problem
     Your data : x_1 = # of "discount" in the email, x_2 = # of "luxury" in the email, each point labeled spam or not spam
     Question : find a, b such that x_2 = a x_1 + b is the boundary between spam & not spam
     Answer : "Linear problem" → good solution found easily
     Ryusuke Jinno 05 / 29 1805.12153

  10. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Nonlinear problem
     Your data : x_1 = # of "discount" in the email, x_2 = # of "luxury" in the email, each point labeled spam or not spam
     Question : find a, b such that x_2 = a x_1 + b is the boundary between spam & not spam
     Ryusuke Jinno 06 / 29 1805.12153

  13. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Nonlinear problem
     Your data : x_1 = # of "discount" in the email, x_2 = # of "luxury" in the email, each point labeled spam or not spam
     Question : find a, b such that x_2 = a x_1 + b is the boundary between spam & not spam
     Answer : "Nonlinear problem" → no good linear solution
     Ryusuke Jinno 06 / 29 1805.12153

  14. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Nonlinear problem
     With some effort, you may find r = sqrt( x_1^2 + x_2^2 ) and θ = arctan( x_2 / x_1 ) useful.
     Ryusuke Jinno 07 / 29 1805.12153

  15. LINEAR VS. NONLINEAR: SPAM EMAIL EXAMPLE Nonlinear problem
     With some effort, you may find r = sqrt( x_1^2 + x_2^2 ) and θ = arctan( x_2 / x_1 ) useful.
     Good solution found, but... "Feature engineering"
     - In this specific case you are successful
     - But you cannot always find such good quantities
     - Is there any good strategy for this kind of problem? (i.e. to capture nonlinearity)
     Ryusuke Jinno 07 / 29 1805.12153
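
To make the "feature engineering" step concrete, here is a minimal sketch (not from the slides) with a made-up radially separated dataset: in the hand-crafted (r, θ) coordinates the nonlinear boundary becomes a straight line.

```python
import numpy as np

# Hypothetical toy data mimicking the slide's radially separated points:
# emails with x_1^2 + x_2^2 > 1 are "spam", the rest are "not spam".
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, size=(1000, 2))      # (x_1, x_2) word counts
is_spam = x[:, 0]**2 + x[:, 1]**2 > 1.0        # true (nonlinear) boundary

# Hand-crafted features from the slide
r = np.sqrt(x[:, 0]**2 + x[:, 1]**2)
theta = np.arctan2(x[:, 1], x[:, 0])

# In (r, theta) space the boundary is simply the line r = 1,
# so a linear rule on the new features classifies every point correctly.
print(np.mean((r > 1.0) == is_spam))           # -> 1.0
```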

  16. NEURAL NETWORK ? Biological neuron [ https://medium.com/autonomous-agents/mathematical-foundation-for-activation-functions-in-artificial-neural-networks-a51c9dd7c089 ]
     [Figure: biological neuron with synapses and axon]
     1. Each neuron collects electric signals through synapses
     2. When the total signal exceeds a threshold, an electric signal is sent to the next neuron through the axon
     Ryusuke Jinno 08 / 29 1805.12153

  17. NEURAL NETWORK ? Artificial neuron mimics biological neuron
     [Diagram: inputs x_1 ... x_n with weights w_1 ... w_n, weighted sum, nonlinear function f, output z]
     Equation : z = f( Σ_i x_i w_i + b )
     w_i : weights, b : bias, f : nonlinear function, e.g. ReLU (rectified linear unit)
     Ryusuke Jinno 09 / 29 1805.12153
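
As a quick illustration of the equation on this slide, here is a minimal NumPy sketch of a single artificial neuron with a ReLU activation; the input values, weights, and bias are made up.

```python
import numpy as np

def relu(y):
    """Rectified linear unit: f(y) = max(0, y)."""
    return np.maximum(0.0, y)

def neuron(x, w, b):
    """Single artificial neuron: z = f( sum_i x_i w_i + b )."""
    return relu(np.dot(x, w) + b)

# Made-up inputs, weights and bias
x = np.array([0.5, -1.0, 2.0])   # inputs x_i
w = np.array([0.3, 0.8, -0.1])   # weights w_i
b = 0.05                         # bias
print(neuron(x, w, b))           # weighted sum is -0.8, ReLU clips it to 0.0
```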

  18. NEURAL NETWORK ? Neural network = network of artificial neurons
     [Diagram: individual neurons connected to form a neural network]
     Ryusuke Jinno 10 / 29 1805.12153

  19. NEURAL NETWORK ? Neural network = network of artificial neurons
     x_1 = f( W_1 x_in + b_1 )
     x_n = f( W_n x_{n-1} + b_n )   (2 ≤ n ≤ N)
     x_out = W_out x_N + b_out
     W_n = matrix, x_n and b_n = vectors
     Note 1 : f acts componentwise, f(y) = ( f(y_1), f(y_2), ..., f(y_n) )
     Note 2 : W_n x_{n-1} + b_n means (W_n)_ij (x_{n-1})_j + (b_n)_i
     Ryusuke Jinno 11 / 29 1805.12153
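
The layer-by-layer equations above translate directly into a forward pass. Below is a minimal NumPy sketch; the layer sizes and random weights are illustrative, not taken from the talk.

```python
import numpy as np

def relu(y):
    return np.maximum(0.0, y)

def forward(x_in, weights, biases):
    """x_n = f(W_n x_{n-1} + b_n) for the hidden layers, linear output layer."""
    x = x_in
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(W @ x + b)                    # hidden layers
    return weights[-1] @ x + biases[-1]        # x_out = W_out x_N + b_out

# Illustrative network: 1 input -> 20 -> 20 -> 1 output
rng = np.random.default_rng(0)
sizes = [1, 20, 20, 1]
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.normal(size=m) for m in sizes[1:]]
print(forward(np.array([0.5]), weights, biases))
```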

  20. NEURAL NETWORK: SUPERVISED LEARNING How to train the neural network with "supervised learning"
     - Suppose we have many data of ( x_in , x_out^(true) )
     - Then we can define "how poorly the machine predicts", e.g. the error function
       E = Σ_data Σ_i | ( x_out )_i − ( x_out^(true) )_i |
     - Training of the neural network = update of the weights and biases using E :
       W → W − α ∂E/∂W ,  b → b − α ∂E/∂b   ( α : constant )
     Note : there are more sophisticated algorithms, e.g. AdaGrad, Adam, ...
     Ryusuke Jinno 12 / 29 1805.12153
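
A minimal sketch of the update rule W → W − α ∂E/∂W, b → b − α ∂E/∂b, shown here for a single linear neuron. The slide's error function uses absolute values; a squared error is used below only to keep the gradient simple, and the data are made up.

```python
import numpy as np

# Made-up data from a known linear rule, so the fit can be checked by eye
rng = np.random.default_rng(0)
x = rng.uniform(size=(100, 2))
w_true, b_true = np.array([1.5, -0.7]), 0.2
y = x @ w_true + b_true

W, b, alpha = np.zeros(2), 0.0, 0.1            # alpha : learning rate
for step in range(2000):
    err = x @ W + b - y                        # prediction error
    dE_dW = 2 * x.T @ err / len(x)             # ∂E/∂W for the squared error
    dE_db = 2 * err.mean()                     # ∂E/∂b
    W -= alpha * dE_dW                         # W → W − α ∂E/∂W
    b -= alpha * dE_db                         # b → b − α ∂E/∂b

print(W, b)                                    # approaches w_true, b_true
```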

  21. NEURAL NETWORK: HOW POWERFUL? Ability of neural network to capture nonlinearity
     - My data : 100 points sampled from the nonlinear function ( x_in , x_out^(true) ) = ( x_in , x_in ( x_in − 0.3 )( x_in − 0.6 )( x_in − 0.9 ) )
     - My neural network : 2 layers, 20 neurons each, trained for 10 sec
     [Plots: network prediction vs. true function x_out^(true) over 0 ≤ x_in ≤ 1]
     Ryusuke Jinno 13 / 29 1805.12153

  22. NEURAL NETWORK: HOW POWERFUL? Ability of neural network to capture nonlinearity
     - My data : 100 points sampled from the nonlinear function ( x_in , x_out^(true) ) = ( x_in , x_in ( x_in − 0.3 )( x_in − 0.6 )( x_in − 0.9 ) )
     - My neural network : 2 layers, 20 neurons each, trained for 10 sec
     Neural network is extremely useful in capturing nonlinearity
     [Plots: network prediction vs. true function x_out^(true) over 0 ≤ x_in ≤ 1]
     Ryusuke Jinno 13 / 29 1805.12153
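
The slides do not say which framework was used; the sketch below reproduces the setup (100 points from x(x − 0.3)(x − 0.6)(x − 0.9), two hidden layers of 20 neurons) in PyTorch as an illustration.

```python
import torch
from torch import nn

torch.manual_seed(0)
x = torch.rand(100, 1)                          # 100 points in [0, 1)
y = x * (x - 0.3) * (x - 0.6) * (x - 0.9)       # the slide's quartic target

model = nn.Sequential(
    nn.Linear(1, 20), nn.ReLU(),
    nn.Linear(20, 20), nn.ReLU(),
    nn.Linear(20, 1),                           # linear output layer
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(5000):                        # a few seconds on a laptop CPU
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(loss.item())                              # final mean-squared error
```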

  23. NEURAL NETWORK: IMAGE RECOGNITION Image classifier : nonlinear relation between input (= image) and output (= label)
     [Diagram: cat image → machine M → label 0 for cat; dog image → machine M → label 1 for dog]
     Ryusuke Jinno 14 / 29 1805.12153

  24. NEURAL NETWORK: IMAGE RECOGNITION Image classifier : nonlinear relation between input (= image) and output (= label)
     [Diagram: pixel values fed into the input layer, network output compared to the labels 0 / 1]
     Note : precisely, the output layer is the log-odds log [ P(cat) / P(dog) ]
     Note : actual image recognition is not this simple, e.g. CNN
     Ryusuke Jinno 14 / 29 1805.12153
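
To make the log-odds note concrete: if the single output z is read as log[P(cat)/P(dog)], the probability is recovered with a sigmoid. A minimal sketch:

```python
import numpy as np

def prob_cat(z):
    """If z = log[P(cat)/P(dog)], then P(cat) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

print(prob_cat(0.0))   # 0.5  : the network is undecided
print(prob_cat(2.0))   # ~0.88: confidently "cat"
```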

  25. TALK PLAN 1. Machine learning : lightning introduction ✔ 2. Machine learning meets tunneling problem in QFT 3. Data taking / Machine setup / Training process / Results 4. Summary Ryusuke Jinno / 29 1805.12153

  26. TUNNELING PROBLEM IN QFT
     [Figure: the potential, plotted as −V vs. φ]
     Ryusuke Jinno / 29 1805.12153
