The spectral theorem

Now we know there is a real unit eigenvector v with real eigenvalue λ. Consider the orthogonal complement v⊥. This is preserved by M: if w^T v = 0, then

(Mw)^T v = w^T M v = λ w^T v = 0.

Pick an orthonormal basis w_2, ..., w_n of v⊥. In the basis v, w_2, w_3, ..., w_n, the matrix M takes the shape

M = \begin{pmatrix}
v^T M v & 0 & \cdots & 0 \\
0 & w_2^T M w_2 & \cdots & w_2^T M w_n \\
\vdots & \vdots & \ddots & \vdots \\
0 & w_n^T M w_2 & \cdots & w_n^T M w_n
\end{pmatrix}

The lower-right block is a smaller symmetric matrix; we are done by induction.
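A quick numerical sanity check of this change-of-basis step (not from the original slides; the symmetric matrix below is a randomly generated example): extend a unit eigenvector v to an orthonormal basis via QR and conjugate M by that basis; the result should have zeros in the first row and column apart from the eigenvalue in the corner.

import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
M = (M + M.T) / 2                      # a random symmetric matrix

lam, vecs = np.linalg.eigh(M)
v = vecs[:, 0]                         # one real unit eigenvector

# Extend v to an orthonormal basis v, w_2, ..., w_n using QR;
# the first column of Q is v up to sign.
B = np.column_stack([v, rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(B)

# In this basis M has zeros in the first row and column,
# with the eigenvalue in the (0, 0) entry.
print(np.round(Q.T @ M @ Q, 6))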
The spectral theorem

Theorem. A symmetric matrix M has all real eigenvalues and an orthonormal basis of eigenvectors.
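To see the theorem in action numerically (an illustrative sketch, not part of the lecture; the matrix is random), np.linalg.eigh is NumPy's eigensolver for symmetric matrices: it returns real eigenvalues and orthonormal eigenvectors.

import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
M = (M + M.T) / 2                           # symmetric

eigenvalues, V = np.linalg.eigh(M)          # eigh assumes symmetric input
print(eigenvalues)                          # real numbers, in increasing order
print(np.allclose(V.T @ V, np.eye(5)))      # columns are orthonormal
print(np.allclose(M @ V, V * eigenvalues))  # M v_i = λ_i v_i, column by column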
The spectral theorem: reformulation

Recall that a square matrix O is said to be orthogonal if O^T = O^{-1}; equivalently, if the columns are orthonormal; equivalently, if the rows are orthonormal.

Theorem. If M is symmetric, there is an orthogonal matrix O and a real diagonal matrix D with

M = O D O^{-1} = O D O^T.

Proof: the columns of O are orthonormal eigenvectors of M, and the diagonal entries of D are the corresponding eigenvalues.
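The same statement checked numerically (again a random example, not from the slides): the eigenvector matrix returned by np.linalg.eigh is orthogonal, and O D O^T rebuilds M.

import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
M = (M + M.T) / 2

d, O = np.linalg.eigh(M)                   # columns of O: orthonormal eigenvectors
D = np.diag(d)                             # real diagonal matrix of eigenvalues

print(np.allclose(O.T, np.linalg.inv(O)))  # O^T = O^{-1}, i.e. O is orthogonal
print(np.allclose(M, O @ D @ O.T))         # M = O D O^T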
The spectral theorem: reformulation

Theorem. If M is symmetric, there is an orthonormal basis v_i and real numbers λ_i such that

M = \lambda_1 v_1 v_1^T + \cdots + \lambda_n v_n v_n^T.

Proof: the v_i are the eigenvectors of M and the λ_i are their eigenvalues.

Note that by orthonormality, (v_i v_i^T)^2 = v_i v_i^T, and (v_i v_i^T)(v_j v_j^T) = 0 when i ≠ j.
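A short NumPy sketch of this rank-one form (illustrative, using a random symmetric matrix): the outer products v_i v_i^T behave as orthogonal projectors, and their λ_i-weighted sum recovers M.

import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
M = (M + M.T) / 2
lam, V = np.linalg.eigh(M)

# M = λ_1 v_1 v_1^T + ... + λ_n v_n v_n^T
M_sum = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(4))
print(np.allclose(M, M_sum))

# (v_i v_i^T)^2 = v_i v_i^T and (v_i v_i^T)(v_j v_j^T) = 0 for i ≠ j
P0 = np.outer(V[:, 0], V[:, 0])
P1 = np.outer(V[:, 1], V[:, 1])
print(np.allclose(P0 @ P0, P0), np.allclose(P0 @ P1, np.zeros((4, 4))))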
What does positive mean?

Suppose M is a symmetric matrix. When is it true that

w^T M w ≥ 0, with equality only when w = 0?

Claim: if and only if M has only positive eigenvalues.

Take an orthonormal basis v_i of eigenvectors for M, with eigenvalues λ_i. Note v_i^T M v_i = λ_i v_i^T v_i = λ_i. So if any of the λ_i is ≤ 0, then certainly M is not positive.

On the other hand, expand any w = \sum_i w_i v_i. Then

w^T M w = \Big(\sum_i w_i v_i\Big)^T M \Big(\sum_i w_i v_i\Big) = \sum_i w_i^2 \lambda_i.

This is certainly positive if the λ_i are positive and w ≠ 0.
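As a numerical illustration (a sketch with a randomly built positive matrix, not part of the slides): forming M = A^T A + 0.1 I guarantees strictly positive eigenvalues, and w^T M w then comes out positive for every sampled nonzero w.

import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
M = A.T @ A + 0.1 * np.eye(4)             # symmetric with strictly positive eigenvalues

print(np.linalg.eigvalsh(M))              # all eigenvalues > 0

W = rng.standard_normal((1000, 4))        # a batch of random nonzero w
quad = np.einsum('ij,jk,ik->i', W, M, W)  # w^T M w for each row w
print(quad.min() > 0)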
Square-roots of positive matrices

Positive numbers have square-roots. So do positive symmetric matrices: if A = O D O^{-1} with D having all non-negative entries, then we can write √D for the diagonal matrix whose entries are the square-roots of D's entries, and define

√A := O √D O^{-1}.

Note that √A is again symmetric and positive.
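A minimal NumPy sketch of this construction (the matrix is a random positive semidefinite example; np.clip guards against tiny negative round-off in the computed eigenvalues):

import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
M = A.T @ A                                     # symmetric, non-negative

d, O = np.linalg.eigh(M)
sqrt_D = np.diag(np.sqrt(np.clip(d, 0, None)))  # √D: square-roots of D's entries
sqrt_M = O @ sqrt_D @ O.T                       # √M := O √D O^{-1} = O √D O^T

print(np.allclose(sqrt_M @ sqrt_M, M))          # (√M)^2 = M
print(np.allclose(sqrt_M, sqrt_M.T))            # √M is symmetric
print((np.linalg.eigvalsh(sqrt_M) >= -1e-10).all())  # and non-negative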
Stretching and shrinking

How big or small can |Aw| / |w| be?

If A is symmetric, then we take a basis of orthonormal eigenvectors v_i with eigenvalues λ_i and write:

w = \sum_i w_i v_i
A w = \sum_i w_i \lambda_i v_i

From this it is not hard to see

\min_i |\lambda_i| \;\le\; \frac{|Aw|}{|w|} = \sqrt{\frac{\sum_i w_i^2 \lambda_i^2}{\sum_i w_i^2}} \;\le\; \max_i |\lambda_i|
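These bounds are easy to test numerically (an illustrative sketch with a random symmetric matrix and random test vectors):

import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 4))
A = (A + A.T) / 2                              # symmetric
lam = np.linalg.eigvalsh(A)

W = rng.standard_normal((1000, 4))             # random nonzero test vectors w
ratios = np.linalg.norm(W @ A, axis=1) / np.linalg.norm(W, axis=1)
# Since A is symmetric, each row of W @ A is (A w)^T for the corresponding w.

print(np.abs(lam).min() <= ratios.min() + 1e-12)   # lower bound min |λ_i|
print(ratios.max() <= np.abs(lam).max() + 1e-12)   # upper bound max |λ_i|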
Stretching and shrinking

What about for matrices A in general? Maybe not even square?

|Av|^2 = (Av)^T (Av) = v^T A^T A v

Note the matrix A^T A is (square and) symmetric. It's also non-negative: v^T A^T A v = |Av|^2 ≥ 0. So it has a symmetric non-negative square-root B = √(A^T A). So we reduced the problem to the symmetric case, since

|Av|^2 = v^T A^T A v = v^T B^2 v = v^T B^T B v = |Bv|^2
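A short check of this reduction (illustrative; A is a random rectangular matrix): build B = √(A^T A) from the eigendecomposition of A^T A and compare |Av| with |Bv|.

import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 3))                # rectangular is fine

d, O = np.linalg.eigh(A.T @ A)                 # A^T A is symmetric, non-negative
B = O @ np.diag(np.sqrt(np.clip(d, 0, None))) @ O.T   # B = √(A^T A)

v = rng.standard_normal(3)
print(np.isclose(np.linalg.norm(A @ v), np.linalg.norm(B @ v)))   # |Av| = |Bv|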
Singular values

Let v_i be an orthonormal basis of eigenvectors for A^T A, with eigenvalues λ_i. We order them so that λ_1 ≥ λ_2 ≥ ···. As we have seen, all λ_i ≥ 0. We write σ_i = √λ_i. The σ_i are called the singular values of A.

Note:

(A v_i) · (A v_j) = v_j^T A^T A v_i = 0 for i ≠ j
(A v_i) · (A v_i) = v_i^T A^T A v_i = σ_i^2 v_i^T v_i = σ_i^2
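Numerically (a sketch with a random rectangular A): the σ_i computed this way agree with np.linalg.svd, and the images A v_i are pairwise orthogonal with squared lengths σ_i^2.

import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((5, 3))

lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]                  # sort so that λ_1 ≥ λ_2 ≥ ...
lam, V = lam[order], V[:, order]
sigma = np.sqrt(np.clip(lam, 0, None))

print(np.allclose(sigma, np.linalg.svd(A, compute_uv=False)))  # the singular values
AV = A @ V
print(np.round(AV.T @ AV, 6))                  # diagonal: (Av_i)·(Av_j) = σ_i^2 δ_ij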
Singular value decomposition

So the v_i are an orthonormal basis whose images are also orthogonal. We rescale the images to the orthonormal

u_i := (1/σ_i) A v_i   (for the indices with σ_i > 0)

We extend the u_i to an orthonormal basis. The matrices U, V whose columns are the basis vectors u_i, v_i are orthogonal:
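A minimal NumPy sketch of the construction so far (illustrative; A is random and assumed to have full column rank, so every σ_i > 0 and no extension of the u_i is needed here):

import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((5, 3))                # full column rank with probability 1

lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
sigma = np.sqrt(lam)

U = (A @ V) / sigma                            # u_i := (1/σ_i) A v_i, column by column
print(np.allclose(U.T @ U, np.eye(3)))         # the u_i are orthonormal
print(np.allclose(V.T @ V, np.eye(3)))         # the v_i are orthonormal
print(np.allclose(A, U @ np.diag(sigma) @ V.T))   # and A = U Σ V^T (the thin SVD)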