Linear algebra and differential equations (Math 54): Lecture 18
Vivek Shende, April 3, 2019

Last time we discussed the least squares problem.


The spectral theorem

Now we know there is a real unit eigenvector v with real eigenvalue λ. Consider the orthogonal complement v^⊥. This is preserved by M: if w^T v = 0, then

\[ (Mw)^T v = w^T M v = \lambda\, w^T v = 0. \]

Pick an orthonormal basis w_2, ..., w_n of v^⊥. In the basis v, w_2, ..., w_n, the matrix M takes the shape

\[ M = \begin{pmatrix} v^T M v & 0 & \cdots & 0 \\ 0 & w_2^T M w_2 & \cdots & w_2^T M w_n \\ \vdots & \vdots & \ddots & \vdots \\ 0 & w_n^T M w_2 & \cdots & w_n^T M w_n \end{pmatrix} \]

The lower-right block is a smaller symmetric matrix; we are done by induction.

Theorem. A symmetric matrix M has all real eigenvalues and an orthonormal basis of eigenvectors.
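As a quick numerical check of the theorem (a sketch using numpy; the matrix M below is an arbitrary symmetric example, not one from the lecture):

```python
import numpy as np

# An arbitrary symmetric matrix (hypothetical example).
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric matrices: it returns real
# eigenvalues and an orthonormal basis of eigenvectors.
eigenvalues, O = np.linalg.eigh(M)

# The eigenvector matrix is orthogonal: O^T O = I.
assert np.allclose(O.T @ O, np.eye(3))

# Each column of O is an eigenvector: M v = lambda v.
for i in range(3):
    assert np.allclose(M @ O[:, i], eigenvalues[i] * O[:, i])
```

`eigh` exploits symmetry, so unlike the general `eig` it guarantees real eigenvalues and orthonormal eigenvectors.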

The spectral theorem: reformulation

Recall that a square matrix O is said to be orthogonal if O^T = O^{-1}; equivalently, if the columns are orthonormal; equivalently, if the rows are orthonormal.

Theorem. If M is symmetric, there is an orthogonal matrix O and a real diagonal matrix D with

\[ M = O D O^{-1} = O D O^T. \]

Proof: the columns of O are orthonormal eigenvectors of M.

The spectral theorem: reformulation

Theorem. If M is symmetric, there is an orthonormal basis v_i and real numbers λ_i such that

\[ M = \lambda_1 v_1 v_1^T + \cdots + \lambda_n v_n v_n^T. \]

Proof: the v_i are the eigenvectors of M and the λ_i are their eigenvalues.

Note that by orthonormality, (v_i v_i^T)^2 = v_i v_i^T, and (v_i v_i^T)(v_j v_j^T) = 0 when i ≠ j.
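The rank-one decomposition above can be verified numerically (a sketch with numpy; the matrix is an arbitrary symmetric example):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(M)

# Rebuild M as the sum of rank-one terms lambda_i v_i v_i^T.
M_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(2))
assert np.allclose(M, M_rebuilt)

# The rank-one factors are idempotent and mutually annihilating,
# as the orthonormality note claims.
P0 = np.outer(V[:, 0], V[:, 0])
P1 = np.outer(V[:, 1], V[:, 1])
assert np.allclose(P0 @ P0, P0)
assert np.allclose(P0 @ P1, np.zeros((2, 2)))
```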

What does positive mean?

Suppose M is a symmetric matrix. When is it true that

\[ w^T M w \geq 0, \text{ with equality only when } w = 0? \]

Claim: if and only if M has only positive eigenvalues.

Take an orthonormal basis v_i of eigenvectors for M, with eigenvalues λ_i. Note v_i^T M v_i = λ_i v_i^T v_i = λ_i. So if any of the λ_i is ≤ 0, then certainly M is not positive.

On the other hand, expand any w = Σ w_i v_i. Then

\[ w^T M w = \Big(\sum w_i v_i\Big)^T M \Big(\sum w_i v_i\Big) = \sum w_i^2 \lambda_i. \]

This is certainly positive if the λ_i are positive and w ≠ 0.
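The claim gives a practical test for positivity (a sketch in numpy; the matrices and the helper name `is_positive_definite` are illustrative, not from the lecture):

```python
import numpy as np

def is_positive_definite(M):
    # For symmetric M, w^T M w > 0 for all w != 0 is equivalent
    # to all eigenvalues being positive.
    return bool(np.all(np.linalg.eigvalsh(M) > 0))

M_pos = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
M_not = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues -1 and 3

assert is_positive_definite(M_pos)
assert not is_positive_definite(M_not)

# A witness vector with w^T M w < 0 for the non-positive matrix.
w = np.array([1.0, -1.0])
assert w @ M_not @ w < 0
```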

Square roots of positive matrices

Positive numbers have square roots. So do positive symmetric matrices: if A = O D O^{-1} with D having all non-negative entries, then we can write √D for the diagonal matrix whose entries are the square roots of D's entries, and define

\[ \sqrt{A} := O \sqrt{D}\, O^{-1}. \]

Note that √A is again symmetric and positive.
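This construction is direct to carry out numerically (a sketch in numpy; the matrix A is an arbitrary positive symmetric example):

```python
import numpy as np

# A positive symmetric matrix (eigenvalues 1 and 6).
A = np.array([[5.0, 2.0],
              [2.0, 2.0]])

lam, O = np.linalg.eigh(A)
assert np.all(lam >= 0)

# sqrt(A) := O sqrt(D) O^{-1}, where O^{-1} = O^T since O is orthogonal.
sqrt_A = O @ np.diag(np.sqrt(lam)) @ O.T

# The square root is symmetric, and it squares back to A.
assert np.allclose(sqrt_A, sqrt_A.T)
assert np.allclose(sqrt_A @ sqrt_A, A)
```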

Stretching and shrinking

How big or small can |Aw| / |w| be? If A is symmetric, then we take a basis of orthonormal eigenvectors v_i with eigenvalues λ_i and write

\[ w = \sum w_i v_i, \qquad Aw = \sum w_i \lambda_i v_i. \]

From this it is not hard to see

\[ \min_i(|\lambda_i|) \;\leq\; \frac{|Aw|}{|w|} = \sqrt{\frac{\sum w_i^2 \lambda_i^2}{\sum w_i^2}} \;\leq\; \max_i(|\lambda_i|). \]
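The bounds can be spot-checked on random vectors (a sketch in numpy; the matrix and the sampling loop are illustrative):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])          # symmetric; eigenvalues 2 and 4
lam = np.linalg.eigvalsh(A)
lo, hi = np.min(np.abs(lam)), np.max(np.abs(lam))

# |Aw| / |w| always lies between min |lambda_i| and max |lambda_i|.
rng = np.random.default_rng(0)
for _ in range(100):
    w = rng.standard_normal(2)
    ratio = np.linalg.norm(A @ w) / np.linalg.norm(w)
    assert lo - 1e-12 <= ratio <= hi + 1e-12
```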

Stretching and shrinking

What about for matrices A in general? Maybe not even square?

\[ |Av|^2 = (Av)^T (Av) = v^T A^T A\, v \]

Note the matrix A^T A is (square and) symmetric. It's also non-negative: v^T A^T A v = |Av|^2 ≥ 0. So it has a symmetric non-negative square root B = √(A^T A). So we have reduced the problem to the symmetric case, since

\[ |Av|^2 = v^T A^T A\, v = v^T B^2 v = v^T B^T B\, v = |Bv|^2. \]

Singular values

Let v_i be an orthonormal basis of eigenvectors for A^T A, with eigenvalues λ_i. We order them so that λ_1 ≥ λ_2 ≥ ···. As we have seen, all λ_i ≥ 0. We write σ_i = √λ_i. The σ_i are called the singular values of A.

Note:

\[ (Av_i) \cdot (Av_j) = v_j^T A^T A\, v_i = 0 \quad (i \neq j) \]
\[ (Av_i) \cdot (Av_i) = v_i^T A^T A\, v_i = \sigma_i^2\, v_i^T v_i = \sigma_i^2 \]
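This definition agrees with what numerical libraries compute (a sketch in numpy; the non-square matrix A is an arbitrary example):

```python
import numpy as np

# A non-square matrix (hypothetical example).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Singular values as defined here: square roots of the eigenvalues
# of A^T A, sorted in decreasing order.
lam = np.linalg.eigvalsh(A.T @ A)[::-1]
sigma = np.sqrt(np.maximum(lam, 0))   # clip tiny negatives from roundoff

# They agree with numpy's built-in SVD routine.
assert np.allclose(sigma, np.linalg.svd(A, compute_uv=False))
```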

Singular value decomposition

So the v_i are an orthonormal basis whose images are also orthogonal. We rescale the images to the orthonormal

\[ u_i := \frac{1}{\sigma_i} A v_i. \]

We extend the u_i to an orthonormal basis. The matrices U, V whose columns are the basis vectors u_i, v_i are orthogonal.
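The construction above can be carried out step by step (a sketch in numpy; the matrix A is an arbitrary full-rank example, so every σ_i is nonzero and no extension of the u_i is needed, and the result is the reduced form of the decomposition):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
m, n = A.shape

# Eigenvectors of A^T A, ordered by decreasing eigenvalue, give V.
lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
sigma = np.sqrt(np.maximum(lam, 0))

# u_i = (1/sigma_i) A v_i; valid here since both sigma_i are nonzero.
U = np.column_stack([(A @ V[:, i]) / sigma[i] for i in range(n)])

# The u_i are orthonormal, and A = U Sigma V^T (reduced form).
assert np.allclose(U.T @ U, np.eye(n))
assert np.allclose(U @ np.diag(sigma) @ V.T, A)
```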
