
Solving Differential Equations through Means of Deep Learning - PowerPoint PPT Presentation



  1. Solving Differential Equations through Means of Deep Learning. Juliane Braunsmann, February 8, 2019

  2. Table of Contents: 1 Neural Networks; 2 Introduction to machine learning applied to PDEs; 3 Learning with a residual loss... (subsections: ... and hard assignment of constraints; ... and soft assignment of constraints); 4 Summary

  4. Machine Learning. "Machine learning discovers rules to execute a data-processing task, given examples of what's expected." (François Chollet, Deep Learning with Python [Cho17]) We require three things: ▶ input data points, e.g. images for image tagging, speech for speech recognition; ▶ examples of the expected output, e.g. tagged images, transcribed audio files; ▶ a way to measure whether the algorithm is doing a good job, i.e. whether its output is close to the expected output; this is what enables learning.

  5. Formalization. Training data, consisting of pairs of input data and expected output: $\{(x_1, y_1), \dots, (x_N, y_N)\} \subseteq \mathcal{X} \times \mathcal{Y}$, assumed to be i.i.d. samples (observations) from an unknown probability distribution $P$. Goal: given a new input $x \in \mathcal{X}$, predict the output $\hat{y} \in \mathcal{Y}$, i.e. find a prediction function $f \colon x \mapsto \hat{y}$. Assumption: $(x, y)$ is another independent observation of $P$. To measure whether the prediction function is doing a good job, we have to define a loss function $L \colon \mathcal{Y} \times \mathcal{Y} \to \mathbb{R}$, where $L(y, \hat{y})$ measures how close the expected output $y$ is to the predicted output $\hat{y}$.
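To make this setup concrete, here is a minimal Python sketch; the toy distribution, the prediction function f, and all variable names are illustrative choices of mine, not from the talk:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative stand-in for the unknown distribution P: x uniform on [0, pi], y = sin(x) + noise.
    def sample_P():
        x = rng.uniform(0.0, np.pi)
        return x, np.sin(x) + rng.normal(scale=0.05)

    # Training data: N i.i.d. observations (x_i, y_i) of P.
    training_data = [sample_P() for _ in range(50)]

    def f(x):
        # A candidate prediction function f: x -> y_hat (a crude hand-made guess).
        return 0.6 * x

    def L(y, y_hat):
        # Loss function: here the squared loss, measuring how close y and y_hat are.
        return (y - y_hat) ** 2

    x_new, y_new = sample_P()   # another independent observation of P
    print(L(y_new, f(x_new)))   # how well f does on the unseen input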

  6. Typical losses. Some typical loss functions are: the squared loss $L(y, \hat{y}) = |y - \hat{y}|^2$ for $\mathcal{Y} = \mathbb{R}$; the zero-one loss $L(y, \hat{y}) = \mathbb{1}_{\{\hat{y} \neq y\}}$ (1 if the prediction is wrong, 0 otherwise) for arbitrary $\mathcal{Y}$; the cross-entropy loss $L(y, \hat{y}) = -\left(y \log(\hat{y}) + (1 - y) \log(1 - \hat{y})\right)$ for $\mathcal{Y} = [0, 1]$.
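These three losses translate directly into code; a small NumPy sketch (the function names and the small epsilon guard in the cross-entropy are my own additions):

    import numpy as np

    def squared_loss(y, y_hat):
        # L(y, y_hat) = |y - y_hat|^2, for real-valued outputs.
        return (y - y_hat) ** 2

    def zero_one_loss(y, y_hat):
        # 1 if the prediction differs from the target, 0 otherwise.
        return float(y != y_hat)

    def cross_entropy_loss(y, y_hat, eps=1e-12):
        # Binary cross-entropy for y_hat in (0, 1); eps guards against log(0).
        y_hat = np.clip(y_hat, eps, 1.0 - eps)
        return -(y * np.log(y_hat) + (1.0 - y) * np.log(1.0 - y_hat))

    print(squared_loss(1.0, 0.8))        # ~0.04
    print(zero_one_loss("cat", "dog"))   # 1.0
    print(cross_entropy_loss(1.0, 0.9))  # ~0.105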

  7. Average loss. The quality of a prediction function is given by $E(f) = \mathbb{E}_P[L(y, f(x))] = \int L(y, f(x)) \, \mathrm{d}P(x, y) \approx \frac{1}{N} \sum_{i=1}^{N} L(y_i, f(x_i)) =: \sum_{i=1}^{N} E_i(f)$. Then find $f^* = \arg\min_f E(f)$, where the type of $f$ is determined by the applied prediction algorithm.
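A short sketch of this Monte-Carlo approximation of E(f) over a training set; the toy data and the two candidate prediction functions below are purely illustrative:

    import numpy as np

    def empirical_risk(f, xs, ys, loss):
        # Monte-Carlo estimate of E(f): the average loss over the N training pairs.
        return np.mean([loss(y, f(x)) for x, y in zip(xs, ys)])

    # Toy data: y is roughly 3x, plus noise.
    rng = np.random.default_rng(0)
    xs = rng.uniform(-1.0, 1.0, size=100)
    ys = 3.0 * xs + rng.normal(scale=0.1, size=100)

    squared = lambda y, y_hat: (y - y_hat) ** 2

    # Two candidate prediction functions; minimizing E over these candidates picks the second one.
    print(empirical_risk(lambda x: 2.0 * x, xs, ys, squared))   # larger average loss
    print(empirical_risk(lambda x: 3.0 * x, xs, ys, squared))   # smaller average loss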

  8. Machine learning techniques. There are many different types of prediction functions in machine learning: ▶ decision trees ▶ support vector machines ▶ naive Bayes classifiers ▶ k-nearest neighbor algorithm ▶ neural networks, specifically deep learning

  10. Deep Learning. Deep learning is a specific subfield of machine learning: "a new take on learning representations from data that puts an emphasis on learning successive layers of increasingly meaningful representations." (François Chollet, Deep Learning with Python [Cho17])

  11. Feedforward Neural Network. [Diagram of a single artificial neuron: the inputs $x_1, \dots, x_n$ are weighted by $w_1, \dots, w_n$, a bias $b$ is added, and the output is $\sigma\left(\sum_{i=1}^{n} w_i x_i + b\right)$. Figure adapted from https://github.com/PetarV-/TikZ/tree/master/Multilayer%20perceptron]

  12. Feedforward Neural Network. [The same single-neuron diagram computing $\sigma\left(\sum_{i=1}^{n} w_i x_i + b\right)$, shown next to a multilayer perceptron with an input layer ($I_1$, $I_2$, $I_3$), a hidden layer, and an output layer ($O_1$, $O_2$). Figure adapted from https://github.com/PetarV-/TikZ/tree/master/Multilayer%20perceptron]

  13. Typical activation functions. Some typical activation functions are: the sigmoid function $\sigma(z) = \frac{1}{1 + \exp(-z)}$, the tanh function $\sigma(z) = \frac{\exp(z) - \exp(-z)}{\exp(z) + \exp(-z)}$, and the ReLU function $\sigma(z) = \max(z, 0)$.
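For reference, a direct NumPy transcription of these three activation functions (function names are illustrative):

    import numpy as np

    def sigmoid(z):
        # sigma(z) = 1 / (1 + exp(-z)), maps R to (0, 1).
        return 1.0 / (1.0 + np.exp(-z))

    def tanh(z):
        # sigma(z) = (exp(z) - exp(-z)) / (exp(z) + exp(-z)), maps R to (-1, 1).
        return np.tanh(z)

    def relu(z):
        # sigma(z) = max(z, 0).
        return np.maximum(z, 0.0)

    z = np.array([-2.0, 0.0, 2.0])
    print(sigmoid(z), tanh(z), relu(z))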

  14. Typical activation functions. [Plot comparing the ReLU (scaled), tanh, and sigmoid activation functions.]

  15. Formalization of a (feed-forward) neural network. Given an input vector $z^l \in \mathbb{R}^n$, the output of the fully connected layer $l + 1$ is $\left( \sigma\left( \sum_{i=1}^{n} w^l_{i,j} z^l_i + b^l_j \right) \right)_{j = 1, \dots, m} \in \mathbb{R}^m$. Such layers can be concatenated, yielding a deep neural network parametrized by weight matrices $W^l$ and bias vectors $b^l$ for each layer. We denote these parameters by $\theta$ and the corresponding neural network by $f_\theta$, and write $E(\theta) = \frac{1}{N} \sum_{i=1}^{N} L(y_i, f_\theta(x_i)) =: \sum_{i=1}^{N} E_i(\theta)$.
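A NumPy sketch of this layer definition and its concatenation into a network; the layer sizes, the choice of ReLU in every layer, and all names are illustrative assumptions, not taken from the talk:

    import numpy as np

    def dense_layer(z, W, b, sigma):
        # Fully connected layer: component j of the output is sigma(sum_i W[i, j] * z[i] + b[j]).
        return sigma(z @ W + b)

    def f_theta(x, theta, sigma):
        # Concatenation of fully connected layers: the output of layer l is the input of layer l + 1.
        # (For simplicity the same activation is applied in every layer, including the last one.)
        z = x
        for W, b in theta:
            z = dense_layer(z, W, b, sigma)
        return z

    relu = lambda z: np.maximum(z, 0.0)
    rng = np.random.default_rng(0)

    # Parameters theta: one weight matrix W^l and bias vector b^l per layer; sizes 3 -> 4 -> 2 here.
    theta = [(rng.normal(size=(3, 4)), np.zeros(4)),
             (rng.normal(size=(4, 2)), np.zeros(2))]

    x = np.array([0.5, -1.0, 2.0])
    print(f_theta(x, theta, relu))   # the network output, a vector in R^2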

  16. Training loop. [Figure illustrating the training loop. Source: [Cho17]]
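The loop in that figure (repeatedly: forward pass, loss evaluation, gradient computation, weight update over the training data) can be made concrete on a deliberately tiny model. This is a hand-rolled illustrative sketch with a manually derived gradient, not the Keras workflow shown in [Cho17]:

    import numpy as np

    # Toy model f_theta(x) = theta * x with squared loss; real frameworks obtain the
    # gradient via backpropagation, here it is computed by hand for transparency.
    rng = np.random.default_rng(0)
    xs = rng.uniform(-1.0, 1.0, size=100)
    ys = 3.0 * xs + rng.normal(scale=0.1, size=100)   # data generated with slope 3

    theta = 0.0   # initial parameter
    lr = 0.1      # learning rate
    for step in range(200):
        y_hat = theta * xs                          # forward pass: predictions f_theta(x_i)
        E = np.mean((ys - y_hat) ** 2)              # average loss E(theta)
        grad = np.mean(2.0 * (y_hat - ys) * xs)     # dE/dtheta
        theta -= lr * grad                          # update the parameter against the gradient

    print(theta)   # close to 3, the slope used to generate the data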
