  1. Basic Calculus Review
     CBMM Summer Course, Day 2 - Machine Learning

  2. Vector Spaces
     Functionals and Operators (Matrices)

  3. Vector Space
     ◮ A vector space is a set V with binary operations + : V × V → V and · : R × V → V such that for all a, b ∈ R and v, w, x ∈ V:
       1. v + w = w + v
       2. (v + w) + x = v + (w + x)
       3. There exists 0 ∈ V such that v + 0 = v for all v ∈ V
       4. For every v ∈ V there exists −v ∈ V such that v + (−v) = 0
       5. a(bv) = (ab)v
       6. 1v = v
       7. (a + b)v = av + bv
       8. a(v + w) = av + aw
     ◮ Examples: R^n, the space of polynomials, spaces of functions.
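As a quick illustration, a minimal NumPy sketch spot-checking a few of these axioms in R^3 (the vectors, scalars, and seed are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    v, w = rng.normal(size=3), rng.normal(size=3)
    a, b = 2.0, -0.5

    # Axiom 1: commutativity of addition
    assert np.allclose(v + w, w + v)
    # Axiom 7: (a + b)v = av + bv
    assert np.allclose((a + b) * v, a * v + b * v)
    # Axiom 8: a(v + w) = av + aw
    assert np.allclose(a * (v + w), a * v + a * w)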

  4. Inner Product
     ◮ An inner product is a function ⟨·, ·⟩ : V × V → R such that for all a, b ∈ R and v, w, x ∈ V:
       1. ⟨v, w⟩ = ⟨w, v⟩
       2. ⟨av + bw, x⟩ = a⟨v, x⟩ + b⟨w, x⟩
       3. ⟨v, v⟩ ≥ 0 and ⟨v, v⟩ = 0 if and only if v = 0.
     ◮ v, w ∈ V are orthogonal if ⟨v, w⟩ = 0.
     ◮ Given a subspace W ⊆ V, we have V = W ⊕ W^⊥, where W^⊥ = { v ∈ V | ⟨v, w⟩ = 0 for all w ∈ W }.
     ◮ Cauchy-Schwarz inequality: ⟨v, w⟩ ≤ ⟨v, v⟩^{1/2} ⟨w, w⟩^{1/2}.
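A minimal NumPy sketch, assuming the standard dot product as the inner product on R^4, of the Cauchy-Schwarz inequality and of splitting v into a component in W = span{w} and one in W^⊥ (random vectors for illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    v, w = rng.normal(size=4), rng.normal(size=4)

    def ip(x, y):
        # standard inner product on R^4
        return float(np.dot(x, y))

    # Cauchy-Schwarz: <v, w> <= <v, v>^(1/2) <w, w>^(1/2)
    assert ip(v, w) <= ip(v, v) ** 0.5 * ip(w, w) ** 0.5

    # Split v into a component in W = span{w} and one in W-perp
    v_par = ip(v, w) / ip(w, w) * w       # projection onto span{w}
    v_perp = v - v_par                    # orthogonal to every element of W
    assert np.isclose(ip(v_perp, w), 0.0)
    assert np.allclose(v_par + v_perp, v)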

  5. Norm
     ◮ A norm is a function ‖·‖ : V → R such that for all a ∈ R and v, w ∈ V:
       1. ‖v‖ ≥ 0, and ‖v‖ = 0 if and only if v = 0
       2. ‖av‖ = |a| ‖v‖
       3. ‖v + w‖ ≤ ‖v‖ + ‖w‖
     ◮ Can define norm from inner product: ‖v‖ = ⟨v, v⟩^{1/2}.
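A small numerical check, assuming the standard inner product on R^2, that the induced norm matches NumPy's np.linalg.norm and is absolutely homogeneous:

    import numpy as np

    v = np.array([3.0, 4.0])
    # Norm induced by the inner product: ||v|| = <v, v>^(1/2)
    assert np.isclose(np.dot(v, v) ** 0.5, np.linalg.norm(v))   # both equal 5.0

    # Absolute homogeneity: ||a v|| = |a| ||v||
    a = -2.0
    assert np.isclose(np.linalg.norm(a * v), abs(a) * np.linalg.norm(v))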

  6. Metric
     ◮ A metric is a function d : V × V → R such that for all v, w, x ∈ V:
       1. d(v, w) ≥ 0, and d(v, w) = 0 if and only if v = w
       2. d(v, w) = d(w, v)
       3. d(v, w) ≤ d(v, x) + d(x, w)
     ◮ Can define metric from norm: d(v, w) = ‖v − w‖.
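A small numerical check, assuming the Euclidean norm, of the symmetry and triangle-inequality properties of the induced metric (random points for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    v, w, x = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)

    def d(p, q):
        # metric induced by the norm: d(p, q) = ||p - q||
        return np.linalg.norm(p - q)

    assert np.isclose(d(v, w), d(w, v))          # symmetry
    assert d(v, w) <= d(v, x) + d(x, w)          # triangle inequality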

  7. Basis
     ◮ B = { v_1, . . . , v_n } is a basis of V if every v ∈ V can be uniquely decomposed as v = a_1 v_1 + · · · + a_n v_n for some a_1, . . . , a_n ∈ R.
     ◮ An orthonormal basis is a basis that is orthogonal (⟨v_i, v_j⟩ = 0 for i ≠ j) and normalized (‖v_i‖ = 1).
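A minimal NumPy sketch: the QR factorization of a random matrix gives an orthonormal basis of R^4, and the coefficients a_i = ⟨v, q_i⟩ recover v uniquely (illustrative values only):

    import numpy as np

    rng = np.random.default_rng(3)
    B = rng.normal(size=(4, 4))            # columns: a generic basis of R^4
    Q, _ = np.linalg.qr(B)                 # columns of Q: an orthonormal basis

    v = rng.normal(size=4)
    a = Q.T @ v                            # coefficients a_i = <v, q_i>
    assert np.allclose(Q @ a, v)           # decomposition v = sum_i a_i q_i
    assert np.allclose(Q.T @ Q, np.eye(4)) # orthonormality: <q_i, q_j> = delta_ij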

  8. Vector Spaces
     Functionals and Operators (Matrices)

  9. Maps
     Next we are going to review basic properties of maps on a Hilbert space H:
     ◮ functionals Ψ : H → R
     ◮ linear operators A : H → H, such that A(af + bg) = aAf + bAg, with a, b ∈ R and f, g ∈ H.
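A minimal sketch, taking H = R^3 and a random matrix as the linear operator, checking the linearity property above:

    import numpy as np

    rng = np.random.default_rng(9)
    A = rng.normal(size=(3, 3))            # a linear operator on H = R^3
    f, g = rng.normal(size=3), rng.normal(size=3)
    a, b = 1.5, -2.0

    # Linearity: A(af + bg) = aAf + bAg
    assert np.allclose(A @ (a * f + b * g), a * (A @ f) + b * (A @ g))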

  10. Representation of Continuous Functionals
      Let H be a Hilbert space and g ∈ H. Then Ψ_g(f) = ⟨f, g⟩, f ∈ H, is a continuous linear functional.
      Riesz representation theorem: every continuous linear functional Ψ on H can be written uniquely in the form Ψ(f) = ⟨f, g⟩ for some appropriate element g ∈ H.
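A minimal sketch of the finite-dimensional case H = R^3: the representer g of a linear functional Ψ can be read off from its values on the standard basis (psi below is an arbitrary example functional, not from the slides):

    import numpy as np

    def psi(f):
        # an arbitrary linear functional on R^3 (for illustration)
        return 2.0 * f[0] - f[1] + 0.5 * f[2]

    # Riesz representer: g_i = psi(e_i), so that psi(f) = <f, g>
    g = np.array([psi(e) for e in np.eye(3)])

    f = np.array([1.0, -2.0, 4.0])
    assert np.isclose(psi(f), np.dot(f, g))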

  11. Matrix
      ◮ Every linear operator L : R^n → R^m can be represented by an m × n matrix A.
      ◮ If A ∈ R^{m×n}, the transpose of A is A^⊤ ∈ R^{n×m}, satisfying ⟨Ax, y⟩_{R^m} = (Ax)^⊤ y = x^⊤ A^⊤ y = ⟨x, A^⊤ y⟩_{R^n} for every x ∈ R^n and y ∈ R^m.
      ◮ A is symmetric if A^⊤ = A.
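A quick numerical check of the adjoint identity ⟨Ax, y⟩ = ⟨x, A^⊤ y⟩, with an arbitrary 3 × 5 matrix:

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.normal(size=(3, 5))            # A in R^{m x n}, m = 3, n = 5
    x = rng.normal(size=5)                 # x in R^n
    y = rng.normal(size=3)                 # y in R^m

    # <Ax, y>_{R^m} = <x, A^T y>_{R^n}
    assert np.isclose(np.dot(A @ x, y), np.dot(x, A.T @ y))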

  12. Eigenvalues and Eigenvectors
      ◮ Let A ∈ R^{n×n}. A nonzero vector v ∈ R^n is an eigenvector of A with corresponding eigenvalue λ ∈ R if Av = λv.
      ◮ Symmetric matrices have real eigenvalues.
      ◮ Spectral Theorem: Let A be a symmetric n × n matrix. Then there is an orthonormal basis of R^n consisting of the eigenvectors of A.
      ◮ Eigendecomposition: A = VΛV^⊤, or equivalently, A = ∑_{i=1}^n λ_i v_i v_i^⊤, where the columns of V are the orthonormal eigenvectors and Λ = diag(λ_1, . . . , λ_n).
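A minimal NumPy sketch of the spectral theorem and the eigendecomposition for a random symmetric matrix (np.linalg.eigh returns real eigenvalues and orthonormal eigenvectors):

    import numpy as np

    rng = np.random.default_rng(5)
    M = rng.normal(size=(4, 4))
    A = (M + M.T) / 2                      # symmetric matrix

    lam, V = np.linalg.eigh(A)             # real eigenvalues, orthonormal eigenvectors
    assert np.allclose(V @ np.diag(lam) @ V.T, A)     # A = V Lambda V^T
    # equivalently, A = sum_i lambda_i v_i v_i^T
    assert np.allclose(sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(4)), A)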

  13. Singular Value Decomposition
      ◮ Every A ∈ R^{m×n} can be written as A = UΣV^⊤, where U ∈ R^{m×m} is orthogonal, Σ ∈ R^{m×n} is diagonal, and V ∈ R^{n×n} is orthogonal.
      ◮ Singular system:
        Av_i = σ_i u_i        AA^⊤ u_i = σ_i^2 u_i
        A^⊤ u_i = σ_i v_i     A^⊤ A v_i = σ_i^2 v_i
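A minimal NumPy sketch of the SVD and the singular-system relations for a random 3 × 5 matrix:

    import numpy as np

    rng = np.random.default_rng(6)
    A = rng.normal(size=(3, 5))

    U, s, Vt = np.linalg.svd(A)            # A = U Sigma V^T
    Sigma = np.zeros_like(A)
    Sigma[:len(s), :len(s)] = np.diag(s)
    assert np.allclose(U @ Sigma @ Vt, A)

    # Singular system for the leading singular triple (u_1, sigma_1, v_1)
    u1, s1, v1 = U[:, 0], s[0], Vt[0]
    assert np.allclose(A @ v1, s1 * u1)
    assert np.allclose(A.T @ u1, s1 * v1)
    assert np.allclose(A @ A.T @ u1, s1**2 * u1)
    assert np.allclose(A.T @ A @ v1, s1**2 * v1)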

  14. Matrix Norm
      ◮ The spectral norm of A ∈ R^{m×n} is ‖A‖_spec = σ_max(A) = √(λ_max(AA^⊤)) = √(λ_max(A^⊤A)).
      ◮ The Frobenius norm of A ∈ R^{m×n} is ‖A‖_F = √(∑_{i=1}^m ∑_{j=1}^n a_{ij}^2) = √(∑_{i=1}^{min{m,n}} σ_i^2).
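A quick numerical check, for a random matrix, that the spectral and Frobenius norms match the singular-value formulas above:

    import numpy as np

    rng = np.random.default_rng(7)
    A = rng.normal(size=(3, 5))
    s = np.linalg.svd(A, compute_uv=False)       # singular values

    # Spectral norm = largest singular value = sqrt(lambda_max(A A^T))
    assert np.isclose(np.linalg.norm(A, 2), s.max())
    assert np.isclose(s.max(), np.sqrt(np.linalg.eigvalsh(A @ A.T).max()))

    # Frobenius norm = sqrt(sum of squared entries) = sqrt(sum of squared singular values)
    assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt((A**2).sum()))
    assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt((s**2).sum()))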

  15. Positive Definite Matrix
      A real symmetric matrix A ∈ R^{m×m} is positive definite if x^⊤ A x > 0 for all nonzero x ∈ R^m.
      A positive definite matrix has positive eigenvalues.
      Note: for positive semi-definite matrices, > is replaced by ≥.
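A minimal sketch: a symmetric positive definite matrix built as M M^⊤ + 0.1 I (an arbitrary construction for illustration) has strictly positive eigenvalues and satisfies x^⊤ A x > 0 for nonzero x:

    import numpy as np

    rng = np.random.default_rng(8)
    M = rng.normal(size=(4, 4))
    A = M @ M.T + 0.1 * np.eye(4)          # symmetric positive definite by construction

    # All eigenvalues are strictly positive ...
    assert np.all(np.linalg.eigvalsh(A) > 0)
    # ... so x^T A x > 0 for any nonzero x
    x = rng.normal(size=4)
    assert x @ A @ x > 0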
