  1. Section 7.1 Diagonalization of symmetric matrices

  2. Motivation: Diagonalization

     How did we recognize diagonalizable matrices?
     - They are already diagonal.
     - They have $n$ distinct eigenvalues. Quick to check only if the matrix is triangular.
     - The algebraic and geometric multiplicities are equal for all eigenvalues, and they sum up to $n$.

     New criterion: verify whether the matrix is symmetric!
     - Symmetric, e.g. $\begin{pmatrix} 3 & 1 \\ 1 & 2 \end{pmatrix}$
     - Not symmetric, e.g. $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, $\begin{pmatrix} 1 & -4 & 0 \\ 6 & 1 & -4 \\ -1 & 6 & 1 \end{pmatrix}$
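The new criterion is easy to automate. A minimal sketch in NumPy (the helper `is_symmetric` and the asymmetric example matrix are illustrative, not from the slides):

```python
import numpy as np

def is_symmetric(A, tol=1e-12):
    """Return True when A is square and equals its transpose (up to tol)."""
    A = np.asarray(A)
    return A.ndim == 2 and A.shape[0] == A.shape[1] and np.allclose(A, A.T, atol=tol)

A_sym = np.array([[3.0, 1.0],
                  [1.0, 2.0]])   # the symmetric 2x2 example from the slide
A_not = np.array([[0.0, 1.0],
                  [0.0, 0.0]])   # an asymmetric example (hypothetical)
```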

  3. Warm up: $u^T u$ vs $uu^T$

     If $u$ is a vector in $\mathbb{R}^n$ with entries $u^T = (u_1, u_2, \ldots, u_n)$, then:
     - $u^T u = u_1^2 + u_2^2 + \cdots + u_n^2$ is a scalar.
     - $uu^T$ is an $n \times n$ matrix:
       $$uu^T = \begin{pmatrix} u_1 u_1 & u_1 u_2 & \cdots & u_1 u_n \\ u_2 u_1 & u_2 u_2 & \cdots & u_2 u_n \\ \vdots & \vdots & \ddots & \vdots \\ u_n u_1 & u_n u_2 & \cdots & u_n u_n \end{pmatrix}$$

     A projection matrix! In fact, for a unit vector $u$, $uu^T$ is the standard matrix of the transformation $T : \mathbb{R}^n \to \mathbb{R}^n$ that projects onto the line spanned by $u$.
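The scalar-vs-matrix distinction, and the projection property of $uu^T$ for a unit vector, can be checked numerically (a sketch; the vector `u` is an arbitrary example, not from the slides):

```python
import numpy as np

u = np.array([3.0, 4.0]) / 5.0   # a unit vector in R^2

inner = u @ u                    # u^T u : a scalar (here 1, since u is a unit vector)
outer = np.outer(u, u)           # u u^T : a 2x2 matrix

# For a unit vector, u u^T is the projection onto span{u}:
# it is idempotent and fixes u itself.
proj_twice = outer @ outer
```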

  4. Warm up: Inverse of an orthonormal matrix

     For orthogonal matrices $Q$ with column vectors $u_1, u_2, \ldots, u_n$ we already know that
     $$Q^T Q = \begin{pmatrix} u_1 \cdot u_1 & 0 & \cdots & 0 \\ 0 & u_2 \cdot u_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & u_n \cdot u_n \end{pmatrix}$$
     so for orthonormal matrices $Q$
     $$Q^T Q = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}$$
     What is the inverse of $Q$?
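A quick numerical illustration (a sketch; the orthonormal `Q` comes from the QR factorization of a random matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # Q has orthonormal columns

QtQ = Q.T @ Q                # should be the 4x4 identity
Qinv = np.linalg.inv(Q)      # should coincide with Q^T
```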

  5. Orthogonally diagonalizable

     Definition. An $n \times n$ matrix $A$ is orthogonally diagonalizable if $A = PDP^{-1}$ with $D$ a diagonal matrix and $P$ an orthonormal matrix. To stress the orthogonality of $P$ we write $A = PDP^T$.

     Avoiding errors: computations using orthogonal matrices usually prevent numerical errors from accumulating.

  6. Collection of eigenvalues = 'Spectral'

     Spectral decomposition. If $D$ has diagonal entries $\lambda_1, \ldots, \lambda_n$ and $P$ has columns $u_1, \ldots, u_n$, then
     $$A = \lambda_1 u_1 u_1^T + \lambda_2 u_2 u_2^T + \cdots + \lambda_n u_n u_n^T$$
     - A fancy way of expressing the change of variables, and
     - the fact that the principal axes are only stretched/contracted.
     - Each $u_i u_i^T$ is a projection matrix!

     Why? We have to name each entry of the vectors $u_1, \ldots, u_n$.
     1. Say $u_k^T = (u_{k1}, u_{k2}, \ldots, u_{kn})$.
     2. Start with a simple case: $\lambda_1 = \lambda_2 = \cdots = \lambda_n = 1$.
        - Compare the $(i,j)$-th entry of $u_k u_k^T$, which is $u_{ki} u_{kj}$,
        - with the $(i,j)$-th entry of $PP^T$, which is $\sum_{k=1}^n u_{ki} u_{kj}$.
     3. Challenge: if the $\lambda$'s are different, how do the entries of $PDP^T$ change?
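The spectral sum can be verified numerically; a sketch with NumPy's `eigh` (which returns real eigenvalues and orthonormal eigenvectors for a symmetric matrix), using the symmetric matrix from the later example:

```python
import numpy as np

A = np.array([[ 3.0, -2.0, 4.0],
              [-2.0,  6.0, 2.0],
              [ 4.0,  2.0, 3.0]])

lam, P = np.linalg.eigh(A)   # columns of P are orthonormal eigenvectors

# Rebuild A as the weighted sum of rank-one projections u_i u_i^T.
A_rebuilt = sum(lam[i] * np.outer(P[:, i], P[:, i]) for i in range(len(lam)))
```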

  7. Poll

     Paper-based poll. On a piece of paper with your name, hand to the instructor:
     If
     $$P = \begin{pmatrix} u_{11} & u_{21} & u_{31} & u_{41} \\ u_{12} & u_{22} & u_{32} & u_{42} \\ u_{13} & u_{23} & u_{33} & u_{43} \\ u_{14} & u_{24} & u_{34} & u_{44} \end{pmatrix}$$
     write down $P^T$ and compute
     - the $(2,3)$-th entry of $PP^T$,
     - the $(2,3)$-th entry of $\begin{pmatrix} u_{11} \\ u_{12} \\ u_{13} \\ u_{14} \end{pmatrix} \begin{pmatrix} u_{11} & u_{12} & u_{13} & u_{14} \end{pmatrix}$,
     - the $(2,3)$-th entry of $\begin{pmatrix} u_{21} \\ u_{22} \\ u_{23} \\ u_{24} \end{pmatrix} \begin{pmatrix} u_{21} & u_{22} & u_{23} & u_{24} \end{pmatrix}$.

  8. Example: Orthogonally diagonalizable

     Example. Orthogonally diagonalize the matrix
     $$A = \begin{pmatrix} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{pmatrix}$$
     Its characteristic equation is $-(\lambda - 7)^2(\lambda + 2) = 0$. Find a basis for each $\lambda$-eigenspace:
     - For $\lambda = 7$: $\begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} -1/2 \\ 1 \\ 0 \end{pmatrix}$
     - For $\lambda = -2$: $\begin{pmatrix} -1 \\ -1/2 \\ 1 \end{pmatrix}$

     A suitable $P$: is the set of eigenvectors above already orthogonal? Orthonormal?
     $$P^{-1} A P = \begin{pmatrix} 7 & 0 & 0 \\ 0 & 7 & 0 \\ 0 & 0 & -2 \end{pmatrix}$$
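These eigenpairs can be double-checked numerically (a sketch; `eigvalsh` returns the eigenvalues of a symmetric matrix in ascending order):

```python
import numpy as np

A = np.array([[ 3.0, -2.0, 4.0],
              [-2.0,  6.0, 2.0],
              [ 4.0,  2.0, 3.0]])

v1 = np.array([ 1.0,  0.0, 1.0])   # eigenvector for lambda = 7
v2 = np.array([-0.5,  1.0, 0.0])   # eigenvector for lambda = 7
v3 = np.array([-1.0, -0.5, 1.0])   # eigenvector for lambda = -2

eigenvalues = np.linalg.eigvalsh(A)  # ascending: [-2, 7, 7]
```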

  9. Example: Orthogonally diagonalizable, continued

     Verify:
     - $v_3 = \begin{pmatrix} -1 \\ -1/2 \\ 1 \end{pmatrix}$ is already orthogonal to $v_1 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}$ and $v_2 = \begin{pmatrix} -1/2 \\ 1 \\ 0 \end{pmatrix}$,
     - but $v_1 \cdot v_2 \neq 0$.

     Tackle this: use Gram-Schmidt.
     $$u_1 = v_1, \qquad u_2 = v_2 - \frac{v_2 \cdot v_1}{v_1 \cdot v_1} v_1 = \begin{pmatrix} -1/2 \\ 1 \\ 0 \end{pmatrix} + \frac{1}{4} \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} -1/4 \\ 1 \\ 1/4 \end{pmatrix}$$
     and $u_3 = v_3$. Then normalize!
     $$P = \begin{pmatrix} 1/\sqrt{2} & -1/\sqrt{18} & -2/3 \\ 0 & 4/\sqrt{18} & -1/3 \\ 1/\sqrt{2} & 1/\sqrt{18} & 2/3 \end{pmatrix}, \qquad D = \begin{pmatrix} 7 & 0 & 0 \\ 0 & 7 & 0 \\ 0 & 0 & -2 \end{pmatrix}$$
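Putting it together, the claimed $P$ and $D$ can be verified to be an orthogonal diagonalization of $A$ (a numerical sketch):

```python
import numpy as np

s2, s18 = np.sqrt(2.0), np.sqrt(18.0)
P = np.array([[1/s2, -1/s18, -2/3],
              [ 0.0,  4/s18, -1/3],
              [1/s2,  1/s18,  2/3]])
D = np.diag([7.0, 7.0, -2.0])

A = np.array([[ 3.0, -2.0, 4.0],
              [-2.0,  6.0, 2.0],
              [ 4.0,  2.0, 3.0]])
```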

  10. Example: Spectral decomposition

      Example. Construct a spectral decomposition of the matrix $A$ with orthogonal diagonalization
      $$A = \begin{pmatrix} 7 & 2 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 2/\sqrt{5} & -1/\sqrt{5} \\ 1/\sqrt{5} & 2/\sqrt{5} \end{pmatrix} \begin{pmatrix} 8 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 2/\sqrt{5} & 1/\sqrt{5} \\ -1/\sqrt{5} & 2/\sqrt{5} \end{pmatrix}$$
      Solution: then $A = 8 u_1 u_1^T + 3 u_2 u_2^T$, where
      $$u_1 u_1^T = \begin{pmatrix} 4/5 & 2/5 \\ 2/5 & 1/5 \end{pmatrix}, \qquad u_2 u_2^T = \begin{pmatrix} 1/5 & -2/5 \\ -2/5 & 4/5 \end{pmatrix}$$
      Check:
      $$8 u_1 u_1^T + 3 u_2 u_2^T = \begin{pmatrix} 32/5 & 16/5 \\ 16/5 & 8/5 \end{pmatrix} + \begin{pmatrix} 3/5 & -6/5 \\ -6/5 & 12/5 \end{pmatrix} = A$$
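The check on this slide is a one-liner in NumPy (a sketch):

```python
import numpy as np

s5 = np.sqrt(5.0)
u1 = np.array([ 2/s5, 1/s5])   # unit eigenvector for lambda = 8
u2 = np.array([-1/s5, 2/s5])   # unit eigenvector for lambda = 3

# Spectral decomposition: weighted sum of the two rank-one projections.
A = 8 * np.outer(u1, u1) + 3 * np.outer(u2, u2)
```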

  11. Symmetric matrices

      Definition. An $n \times n$ matrix $A$ is symmetric if $A = A^T$.

      Theorem. An $n \times n$ matrix $A$ is orthogonally diagonalizable if and only if $A$ is symmetric.

      The easy observation: let $A = PDP^T$ with $D$ diagonal and $P$ orthonormal. Just check that $A$ is symmetric, that is, $A = A^T$:
      $$(PDP^T)^T = (P^T)^T D^T P^T = PDP^T$$
      The difficult part (omitted here) is: if $A = A^T$, then an orthogonal diagonalization does exist.
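The hard direction of the theorem is what `numpy.linalg.eigh` relies on: any symmetric matrix admits an orthogonal diagonalization. A sketch on a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2            # symmetrize: A = A^T by construction

lam, P = np.linalg.eigh(A)   # real eigenvalues, orthonormal eigenvector columns
```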

  12. Summary

      Spectral Theorem for symmetric matrices. An $n \times n$ symmetric matrix $A$ has the following properties:
      - $A$ has $n$ real eigenvalues, counting multiplicities.
      - For each eigenvalue $\lambda$, the dimension of the $\lambda$-eigenspace equals the algebraic multiplicity.
      - The eigenspaces are mutually orthogonal: eigenvectors corresponding to different eigenvalues are orthogonal.
      - $A$ is orthogonally diagonalizable.

  13. Extra: Eigenspaces are mutually orthogonal

      Symmetric matrices only: eigenspaces are mutually orthogonal.

      What does it mean? If $v_1$ and $v_2$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1$ and $\lambda_2$, then $v_1 \cdot v_2 = 0$.

      Trick to see this: find a way to show that $(\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0$. Why? We assumed $\lambda_1 \neq \lambda_2$, so necessarily $v_1 \cdot v_2 = 0$.

      Hint: compute $v_1^T A v_2$ in two different 'orders'.
      - Symmetry is important: you'll have to substitute $A = A^T$ at some point.
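Following the hint, the two 'orders' can be written out (a sketch of the omitted computation):

```latex
\lambda_1 (\mathbf{v}_1 \cdot \mathbf{v}_2)
  = (A\mathbf{v}_1)^{T}\mathbf{v}_2
  = \mathbf{v}_1^{T} A^{T} \mathbf{v}_2
  = \mathbf{v}_1^{T} A \mathbf{v}_2      % substitute A = A^T here
  = \mathbf{v}_1^{T} (\lambda_2 \mathbf{v}_2)
  = \lambda_2 (\mathbf{v}_1 \cdot \mathbf{v}_2)
```

so $(\lambda_1 - \lambda_2)(v_1 \cdot v_2) = 0$, and since $\lambda_1 \neq \lambda_2$, it follows that $v_1 \cdot v_2 = 0$.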
