  1. Lesson 20: Singular Value Decomposition

  2-3. • Given a matrix $A \in \mathbb{C}^{m \times n}$, $m \ge n$, the singular value decomposition (SVD) is the factorization
       $$A = U \Sigma V^* = \begin{bmatrix} u_1 \,\big|\, \cdots \,\big|\, u_m \end{bmatrix} \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \\ & 0 & \end{bmatrix} \begin{bmatrix} v_1^* \\ \vdots \\ v_n^* \end{bmatrix}$$
       where $\sigma_k \ge 0$ and $U$, $V$ are unitary
       • The SVD is fundamental in applied mathematics
       • Some applications (2 of thousands): image compression; principal component analysis in statistics
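
A minimal NumPy check of this factorization (added for illustration; it is not part of the slides and assumes `numpy` is available). `numpy.linalg.svd` returns the three factors, and we can verify the reconstruction, the ordered non-negative singular values, and the unitarity of $U$ and $V$:

```python
import numpy as np

# A random complex matrix with m >= n, matching the setting on the slide.
rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Full SVD: U is m x m, Vh = V^* is n x n, s holds the n singular values.
U, s, Vh = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n Sigma (diagonal block on top, zero rows below)
# and check A = U Sigma V^*.
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)
print(np.allclose(A, U @ Sigma @ Vh))                 # True
print(np.all(s >= 0), np.all(s[:-1] >= s[1:]))        # non-negative, ordered
print(np.allclose(U.conj().T @ U, np.eye(m)))         # U unitary
print(np.allclose(Vh @ Vh.conj().T, np.eye(n)))       # V unitary
```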

  4. Connection with the Eigenvalue Decomposition of a Symmetric Matrix

  5-9. • Suppose $A \in \mathbb{R}^{n \times n}$ is symmetric, $A^\top = A$, with eigenvectors and eigenvalues $A v_k = \lambda_k v_k$
       • Recall that the eigenvectors $\{v_1, \dots, v_n\}$ are orthogonal (assume the $\lambda_k$ are distinct):
         $$v_k^\top v_j = \tfrac{1}{\lambda_k} (A v_k)^\top v_j = \tfrac{1}{\lambda_k} v_k^\top A v_j = \tfrac{\lambda_j}{\lambda_k} v_k^\top v_j,$$
         so $v_k^\top v_j = 0$ whenever $\lambda_k \ne \lambda_j$
       • We can normalize $q_k := v_k / \|v_k\|$ so that $Q = \begin{bmatrix} q_1 \,|\, \cdots \,|\, q_n \end{bmatrix}$ is orthogonal and satisfies $AQ = Q\Lambda$, i.e., $A = Q \Lambda Q^\top$
       • We aren't quite done: we have to make sure $\Lambda$ is positive:
         $$A = Q \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} Q^\top = Q \begin{bmatrix} |\lambda_1| & & \\ & \ddots & \\ & & |\lambda_n| \end{bmatrix} \begin{bmatrix} \operatorname{sgn}\lambda_1 & & \\ & \ddots & \\ & & \operatorname{sgn}\lambda_n \end{bmatrix} Q^\top = Q \Sigma V^\top,$$
         with $V = Q \operatorname{diag}(\operatorname{sgn}\lambda_1, \dots, \operatorname{sgn}\lambda_n)$; i.e., the singular values are the absolute values of the eigenvalues
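
A small NumPy illustration of this connection (not from the slides): for a real symmetric matrix the singular values are the absolute eigenvalues, and an SVD can be read off from `numpy.linalg.eigh` exactly as above, with $\Sigma = |\Lambda|$ and $V = Q\,\operatorname{diag}(\operatorname{sgn}\lambda_k)$:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B + B.T                              # a real symmetric matrix

lam, Q = np.linalg.eigh(A)               # A = Q diag(lam) Q^T, Q orthogonal
s = np.linalg.svd(A, compute_uv=False)   # singular values, descending

# The singular values are the absolute eigenvalues.
print(np.allclose(np.sort(np.abs(lam))[::-1], s))     # True

# SVD read off from the eigendecomposition: Sigma = |Lambda|, V = Q sgn(Lambda).
Sigma = np.diag(np.abs(lam))
V = Q @ np.diag(np.sign(lam))
print(np.allclose(A, Q @ Sigma @ V.T))                # True
```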

  10. Existence and Uniqueness

  11-14. Theorem: Every matrix $A \in \mathbb{C}^{m \times n}$ has an SVD.
         • Set $\sigma_1 = \|A\|_2$. By compactness, we know there exist $u_1, v_1$ satisfying $\|u_1\| = \|v_1\| = 1$ with $A v_1 = \sigma_1 u_1$.
         • Assume without loss of generality that $e_1^* v_1 \ne 0$, and use the QR decomposition to factor
           $$\begin{bmatrix} v_1 \,|\, e_2 \,|\, \cdots \,|\, e_n \end{bmatrix} = \begin{bmatrix} v_1 \,|\, \cdots \,|\, v_n \end{bmatrix} R_1 = V_1 R_1$$
           so that $V_1$ is unitary and its columns span $\mathbb{C}^n$, and similarly define $U_1$
         • Then we have
           $$U_1^* A V_1 = U_1^* \begin{bmatrix} \sigma_1 u_1 \,|\, A v_2 \,|\, \cdots \,|\, A v_n \end{bmatrix} = \begin{bmatrix} \sigma_1 & w^* \\ 0 & B \end{bmatrix}, \qquad B \in \mathbb{C}^{(m-1) \times (n-1)}$$
         • We have
           $$\left\| \begin{bmatrix} \sigma_1 & w^* \\ 0 & B \end{bmatrix} \begin{bmatrix} \sigma_1 \\ w \end{bmatrix} \right\| \ge \sigma_1^2 + w^* w = \sqrt{\sigma_1^2 + w^* w}\; \left\| \begin{bmatrix} \sigma_1 \\ w \end{bmatrix} \right\|,$$
           implying that $w = 0$, as otherwise $\|A\|_2 \ge \sqrt{\sigma_1^2 + w^* w} > \sigma_1$
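
The block structure $U_1^* A V_1 = \begin{bmatrix} \sigma_1 & 0 \\ 0 & B \end{bmatrix}$ can be reproduced numerically. The sketch below (not from the slides) takes $u_1, v_1$ from `numpy.linalg.svd` instead of the compactness argument, and extends them to orthogonal matrices with a QR factorization as in the proof; the helper name `extend_to_unitary` is introduced here only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3
A = rng.standard_normal((m, n))

# sigma_1 = ||A||_2 with a maximizing pair (u_1, v_1); here they are simply
# read off from numpy's SVD rather than obtained by the compactness argument.
U, s, Vh = np.linalg.svd(A)
sigma1, u1, v1 = s[0], U[:, 0], Vh[0, :]
print(np.allclose(A @ v1, sigma1 * u1))               # A v_1 = sigma_1 u_1

def extend_to_unitary(x):
    """QR-extend a unit vector x to a square orthogonal matrix with first column x."""
    M = np.column_stack([x, np.eye(x.size)[:, 1:]])   # [x | e_2 | ... | e_k]
    Q, _ = np.linalg.qr(M)
    if np.dot(Q[:, 0], x) < 0:                        # fix QR's sign ambiguity
        Q[:, 0] *= -1
    return Q

V1 = extend_to_unitary(v1)                            # first column is v_1
U1 = extend_to_unitary(u1)                            # first column is u_1

T = U1.T @ A @ V1
print(np.allclose(T[0, 0], sigma1))                   # top-left entry is sigma_1
print(np.allclose(T[1:, 0], 0))                       # zeros below sigma_1
print(np.allclose(T[0, 1:], 0))                       # w = 0, as the proof shows
```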

  15-16. • Thus we have
           $$U_1^* A V_1 = \begin{bmatrix} \sigma_1 & 0 \\ 0 & B \end{bmatrix}$$
         • Assume (by induction) that we have $B = U_2 \Sigma_2 V_2^*$
         • Thus
           $$A = U_1 \begin{bmatrix} 1 & 0 \\ 0 & U_2 \end{bmatrix} \begin{bmatrix} \sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & V_2 \end{bmatrix}^* V_1^* = U \Sigma V^*$$
         • Induction, together with the fact that the $1 \times 1$ case is trivial, completes the construction
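
The inductive construction can be turned into a short recursive routine. This is a sketch of the proof's recursion, not a production algorithm: it assumes a real matrix with $m \ge n$ and $\sigma_1 > 0$ at every step (e.g. full column rank), and it finds the leading singular pair from the eigendecomposition of $A^\top A$ rather than from the compactness argument; the names `svd_by_induction` and `house_extend` are made up for this example:

```python
import numpy as np

def house_extend(x):
    """Orthogonal matrix whose first column is the unit vector x (via QR)."""
    M = np.column_stack([x, np.eye(x.size)[:, 1:]])
    Q, _ = np.linalg.qr(M)
    if np.dot(Q[:, 0], x) < 0:
        Q[:, 0] *= -1
    return Q

def svd_by_induction(A):
    """Recursive SVD of a real m x n matrix, m >= n, following the slides:
    peel off sigma_1 with U_1, V_1, then recurse on the (m-1) x (n-1) block B.
    Assumes sigma_1 > 0 at every step (e.g. A of full column rank)."""
    m, n = A.shape
    if n == 0:                                   # trivial base case
        return np.eye(m), A.copy(), np.eye(0)
    # Leading singular pair from the eigendecomposition of A^T A
    # (instead of the compactness argument used in the proof).
    w2, W = np.linalg.eigh(A.T @ A)
    sigma1 = np.sqrt(w2[-1])
    v1 = W[:, -1]
    u1 = A @ v1 / sigma1
    U1, V1 = house_extend(u1), house_extend(v1)
    B = (U1.T @ A @ V1)[1:, 1:]                  # deflated block
    U2, S2, V2 = svd_by_induction(B)
    # Assemble A = U1 [[1,0],[0,U2]] [[sigma1,0],[0,S2]] ([[1,0],[0,V2]])^T V1^T.
    U = U1 @ np.block([[np.ones((1, 1)), np.zeros((1, m - 1))],
                       [np.zeros((m - 1, 1)), U2]])
    V = V1 @ np.block([[np.ones((1, 1)), np.zeros((1, n - 1))],
                       [np.zeros((n - 1, 1)), V2]])
    Sigma = np.zeros((m, n))
    Sigma[0, 0] = sigma1
    Sigma[1:, 1:] = S2
    return U, Sigma, V

A = np.random.default_rng(3).standard_normal((5, 3))
U, Sigma, V = svd_by_induction(A)
print(np.allclose(A, U @ Sigma @ V.T))           # True
```

In practice one would of course just call `np.linalg.svd`; the point of the sketch is only to mirror the inductive step on the slide.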

  17. Theorem: The $\sigma_j$ are uniquely determined, and if they are distinct, then $U$ and $V$ are uniquely determined up to sign.
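
The sign ambiguity is easy to see numerically (a small illustration added here, not from the slides): flipping the sign of a matching pair of left and right singular vectors leaves the product unchanged.

```python
import numpy as np

A = np.random.default_rng(4).standard_normal((4, 3))
U, s, Vh = np.linalg.svd(A, full_matrices=False)

# Flip the sign of the k-th left and right singular vectors together:
# the product U Sigma V^* is unchanged, so U and V are only fixed up to sign.
k = 1
U2, Vh2 = U.copy(), Vh.copy()
U2[:, k] *= -1
Vh2[k, :] *= -1
print(np.allclose(A, U2 @ np.diag(s) @ Vh2))   # still a valid SVD of A
```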

  18. Some Properties

  19-22. • Theorem: $\|A\|_2 = \sigma_1$:
           $$\|A\|_2 = \max_{\|u\|=1} \|A u\| = \max_{\|u\|=1} \|U \Sigma V^* u\| = \max_{\|v\|=1} \|\Sigma v\| = \max_j \sigma_j = \sigma_1$$
         • Theorem: The rank of $A$ is $r$, the number of non-zero singular values: $\operatorname{rank} A = \operatorname{rank} \Sigma = r$
         • Theorem: $\operatorname{range}(A) = \operatorname{span}\{u_1, \dots, u_r\}$ and $\operatorname{null}(A) = \operatorname{span}\{v_{r+1}, \dots, v_n\}$
         • Theorem: If $A$ is square, $|\det A| = \prod_{k=1}^n \sigma_k$, since $\det A = \det U \, \det \Sigma \, \det V^* = \pm \det \Sigma$
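
A NumPy spot-check of these properties on a rank-deficient example (added for illustration, not part of the slides; the rank cutoff `tol` is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
m, n, r = 6, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # rank-r matrix

U, s, Vh = np.linalg.svd(A)
tol = 1e-12                                    # arbitrary cutoff for "zero"

print(np.isclose(np.linalg.norm(A, 2), s[0]))                   # ||A||_2 = sigma_1
print(int(np.sum(s > tol)) == np.linalg.matrix_rank(A) == r)    # rank = #nonzero sigma

x = rng.standard_normal(n)                     # range(A) = span{u_1, ..., u_r}:
proj = U[:, :r] @ (U[:, :r].T @ (A @ x))       # projecting A x onto that span
print(np.allclose(proj, A @ x))                # changes nothing

print(np.allclose(A @ Vh[r:, :].T, 0))         # null(A) = span{v_{r+1}, ..., v_n}

B = rng.standard_normal((4, 4))                # square case: |det B| = prod sigma_k
sB = np.linalg.svd(B, compute_uv=False)
print(np.isclose(abs(np.linalg.det(B)), np.prod(sB)))           # True
```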

  23. Theorem: The non-zero singular values of $A$ are the square roots of the non-zero eigenvalues of $A^* A$ or $A A^*$:
      $$A^* A = (U \Sigma V^*)^* (U \Sigma V^*) = V \Sigma^* U^* U \Sigma V^* = V (\Sigma^* \Sigma) V^*$$
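
The corresponding numerical check (illustration only, assuming NumPy): the eigenvalues of $A^\top A$ are the squared singular values.

```python
import numpy as np

A = np.random.default_rng(6).standard_normal((5, 3))
s = np.linalg.svd(A, compute_uv=False)          # singular values, descending

# Eigenvalues of A^T A (ascending from eigh) are the squared singular values.
evals = np.linalg.eigh(A.T @ A)[0]
print(np.allclose(np.sqrt(np.maximum(evals[::-1], 0)), s))   # True
```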
