The Eigenvector
Two interest-bearing accounts

Suppose Account 1 yields 5% interest and Account 2 yields 3% interest.

Represent the balances in the two accounts by a 2-vector x(t) = [amount in Account 1, amount in Account 2]. Then

x(t+1) = \begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix} x(t)

Let A denote the matrix. It is diagonal.

To find out how, say, x(100) compares to x(0), we can use this equation repeatedly:

x(100) = A x(99) = A (A x(98)) = \cdots = \underbrace{A \cdot A \cdots A}_{100\ \text{times}} x(0) = A^{100} x(0)
Since A is a diagonal matrix, it is easy to compute powers of A:

\begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix} \begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix} = \begin{bmatrix} 1.05^2 & 0 \\ 0 & 1.03^2 \end{bmatrix}

\underbrace{\begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix} \cdots \begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix}}_{100\ \text{times}} = \begin{bmatrix} 1.05^{100} & 0 \\ 0 & 1.03^{100} \end{bmatrix} \approx \begin{bmatrix} 131.5 & 0 \\ 0 & 19.2 \end{bmatrix}

The takeaway:

\begin{bmatrix} \text{Account 1 balance after } t \text{ years} \\ \text{Account 2 balance after } t \text{ years} \end{bmatrix} = \begin{bmatrix} 1.05^t \cdot (\text{initial Account 1 balance}) \\ 1.03^t \cdot (\text{initial Account 2 balance}) \end{bmatrix}
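A quick numerical check of the takeaway (a sketch, not part of the slides; the initial balance of 1 in each account is an assumption):

# Sketch: compound both accounts for 100 years using the diagonal matrix A.
import numpy as np

A = np.diag([1.05, 1.03])            # diagonal growth matrix
x0 = np.array([1.0, 1.0])            # assumed initial balances

x100 = np.linalg.matrix_power(A, 100) @ x0
print(x100)                          # approximately [131.5, 19.2], i.e. 1.05^100 and 1.03^100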
Rabbit reproduction

(Slide figure: the rabbit population at times 0 through 4.)

To avoid getting into trouble, I'll pretend sex doesn't exist.
◮ Each month, each adult rabbit gives birth to one baby.
◮ A rabbit takes one month to become an adult.
◮ Rabbits never die.

Use x(t) = [number of adults after t months, number of juveniles after t months]. Then

\begin{bmatrix} \text{adults at time } t+1 \\ \text{juveniles at time } t+1 \end{bmatrix} = \underbrace{\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}}_{A} \begin{bmatrix} \text{adults at time } t \\ \text{juveniles at time } t \end{bmatrix}

Then x(t+1) = A x(t) where A = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}.

Starting from one adult: [1, 0], [1, 1], [2, 1], [3, 2], [5, 3], [8, 5], ...
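A minimal simulation of the model (a sketch, not from the slides); it reproduces the sequence above starting from one adult and no juveniles:

# Sketch: iterate x(t+1) = A x(t) for the rabbit model.
import numpy as np

A = np.array([[1, 1],
              [1, 0]])               # [adults, juveniles] update matrix
x = np.array([1, 0])                 # time 0: one adult, no juveniles

for t in range(6):
    print(t, x)                      # prints [1 0], [1 1], [2 1], [3 2], [5 3], [8 5]
    x = A @ x                        # move forward one month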
Analyzing rabbit reproduction

x(t+1) = A x(t) where A = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}.

As in the bank-account example, x(t) = A^t x(0). How can this help us calculate how the entries of x(t) grow as a function of t?

In the bank-account example, we were able to understand the behavior because A was a diagonal matrix. This time, A is not diagonal. However, there is a workaround:

Let S = \begin{bmatrix} \frac{1+\sqrt{5}}{2} & \frac{1-\sqrt{5}}{2} \\ 1 & 1 \end{bmatrix}. Then S^{-1} A S is the diagonal matrix \Lambda = \begin{bmatrix} \frac{1+\sqrt{5}}{2} & 0 \\ 0 & \frac{1-\sqrt{5}}{2} \end{bmatrix}.

Since S^{-1} A S = \Lambda, we have A = S \Lambda S^{-1}, so

A^t = \underbrace{A \, A \cdots A}_{t\ \text{times}} = (S \Lambda S^{-1})(S \Lambda S^{-1}) \cdots (S \Lambda S^{-1}) = S \Lambda^t S^{-1}

\Lambda is a diagonal matrix ⇒ easy to compute \Lambda^t:

If \Lambda = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} then \Lambda^t = \begin{bmatrix} \lambda_1^t & 0 \\ 0 & \lambda_2^t \end{bmatrix}. Here \Lambda = \begin{bmatrix} \frac{1+\sqrt{5}}{2} & 0 \\ 0 & \frac{1-\sqrt{5}}{2} \end{bmatrix}.
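A short numerical check of the workaround (a sketch, not part of the slides):

# Sketch: verify that S^{-1} A S is diagonal and that A^t = S Λ^t S^{-1}.
import numpy as np

phi, psi = (1 + 5**0.5) / 2, (1 - 5**0.5) / 2
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
S = np.array([[phi, psi],
              [1.0, 1.0]])

Lam = np.linalg.inv(S) @ A @ S                         # approximately diag(phi, psi)
print(np.round(Lam, 10))

t = 20
At = S @ np.diag([phi**t, psi**t]) @ np.linalg.inv(S)
print(np.allclose(At, np.linalg.matrix_power(A, t)))   # True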
Interpretation using change of basis

Interpretation: To make the analysis easier, we will use a change of basis.

The basis consists of the two columns of the matrix S: v_1 = \begin{bmatrix} \frac{1+\sqrt{5}}{2} \\ 1 \end{bmatrix}, v_2 = \begin{bmatrix} \frac{1-\sqrt{5}}{2} \\ 1 \end{bmatrix}.

Let u(t) = coordinate representation of x(t) in terms of v_1 and v_2.

◮ (rep2vec) To go from the representation u(t) to the vector x(t) itself, we multiply u(t) by S.
◮ (Move forward one month) To go from x(t) to x(t+1), we multiply x(t) by A.
◮ (vec2rep) To go back to the coordinate representation, we multiply by S^{-1}.

Multiplying by the matrix S^{-1} A S carries out the three steps above.

But S^{-1} A S = \Lambda = \begin{bmatrix} \frac{1+\sqrt{5}}{2} & 0 \\ 0 & \frac{1-\sqrt{5}}{2} \end{bmatrix}, so u(t+1) = \begin{bmatrix} \frac{1+\sqrt{5}}{2} & 0 \\ 0 & \frac{1-\sqrt{5}}{2} \end{bmatrix} u(t),

so u(t) = \begin{bmatrix} \left(\frac{1+\sqrt{5}}{2}\right)^t & 0 \\ 0 & \left(\frac{1-\sqrt{5}}{2}\right)^t \end{bmatrix} u(0)
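The three steps translate directly into code. A sketch (the helper names rep2vec and vec2rep follow the slide's labels; the starting population is an assumption):

# Sketch: advance the rabbit population t months in eigenvector coordinates.
import numpy as np

phi, psi = (1 + 5**0.5) / 2, (1 - 5**0.5) / 2
A = np.array([[1.0, 1.0], [1.0, 0.0]])
S = np.array([[phi, psi], [1.0, 1.0]])

def rep2vec(u):                      # representation -> vector: multiply by S
    return S @ u

def vec2rep(x):                      # vector -> representation: multiply by S^{-1}
    return np.linalg.solve(S, x)

x0 = np.array([1.0, 0.0])            # assumed start: one adult, no juveniles
u0 = vec2rep(x0)

t = 10
ut = np.diag([phi**t, psi**t]) @ u0  # u(t) = Λ^t u(0)
print(np.round(rep2vec(ut)))         # [89. 55.], the same as A^10 x0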
Eigenvalues and eigenvectors

For this topic, consider only matrices A such that row-label set = column-label set (endomorphic matrices).

Definition: If λ is a scalar and v is a nonzero vector such that A v = λ v, we say that λ is an eigenvalue of A, and v is a corresponding eigenvector.

Any nonzero vector in the eigenspace is considered an eigenvector. However, it is often convenient to require that the eigenvector have norm one.

Example: \begin{bmatrix} 1.05 & 0 \\ 0 & 1.03 \end{bmatrix} has eigenvalues 1.05 and 1.03, and corresponding eigenvectors [1, 0] and [0, 1].

Example: \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} has eigenvalues \lambda_1 = \frac{1+\sqrt{5}}{2} and \lambda_2 = \frac{1-\sqrt{5}}{2}, and corresponding eigenvectors [\frac{1+\sqrt{5}}{2}, 1] and [\frac{1-\sqrt{5}}{2}, 1].

Example: What does it mean when A has 0 as an eigenvalue? There is a nonzero vector v such that A v = 0 v. That is, A's null space is nontrivial.

The last example suggests a way to find an eigenvector corresponding to eigenvalue 0: find a nonzero vector in the null space. What about other eigenvalues?
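These examples can be checked with numpy's eigenvalue routine (a sketch, not part of the slides; note that numpy scales eigenvectors to norm one, so they may need rescaling to match the slide's form):

# Sketch: confirm the eigenvalues of the two example matrices.
import numpy as np

w, V = np.linalg.eig(np.array([[1.05, 0.0], [0.0, 1.03]]))
print(w)                             # [1.05 1.03]; columns of V are [1, 0] and [0, 1]

w, V = np.linalg.eig(np.array([[1.0, 1.0], [1.0, 0.0]]))
print(w)                             # the two values (1 ± sqrt(5))/2, in some order
i = np.argmax(w)                     # index of the larger eigenvalue
print(w[i])                          # approx 1.618 = (1 + sqrt(5))/2
print(V[:, i] / V[1, i])             # rescaled eigenvector: approx [1.618, 1]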
Eigenvector corresponding to an eigenvalue

Suppose λ is an eigenvalue of A, with corresponding eigenvector v. Then A v = λ v. That is, A v − λ v is the zero vector. The expression A v − λ v can be written as (A − λ 1) v, where 1 denotes the identity matrix, so (A − λ 1) v is the zero vector. That means that v is a nonzero vector in the null space of A − λ 1. That means that A − λ 1 is not invertible.

Conversely, suppose A − λ 1 is not invertible. It is square, so it must have a nontrivial null space. Let v be a nonzero vector in the null space. Then (A − λ 1) v = 0, so A v = λ v.

We have proved the following:

Lemma: Let A be a square matrix.
◮ The number λ is an eigenvalue of A if and only if A − λ 1 is not invertible.
◮ If λ is in fact an eigenvalue of A, then the corresponding eigenspace is the null space of A − λ 1.

Corollary: If λ is an eigenvalue of A, then it is an eigenvalue of A^T (since A − λ 1 is invertible if and only if its transpose A^T − λ 1 is).
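The lemma suggests a computational recipe: given an eigenvalue, look for a nonzero vector in the null space of A − λ 1. A sketch (not from the slides) using the SVD, whose last right singular vector spans the null space when the smallest singular value is zero:

# Sketch: recover an eigenvector of the rabbit matrix from the null space of A - λ·1.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0]])
lam = (1 + 5**0.5) / 2               # a known eigenvalue of A

M = A - lam * np.eye(2)              # A - λ·1, where 1 is the identity matrix
_, _, Vt = np.linalg.svd(M)
v = Vt[-1]                           # direction of the (numerical) null space
print(np.allclose(A @ v, lam * v))   # True: v is an eigenvector for λ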
Similarity

Definition: Two matrices A and B are similar if there is an invertible matrix S such that S^{-1} A S = B.

Proposition: Similar matrices have the same eigenvalues.

Proof: Suppose λ is an eigenvalue of A and v is a corresponding eigenvector. By definition, A v = λ v. Suppose S^{-1} A S = B, and let w = S^{-1} v. Note that w is nonzero, since v is nonzero and S^{-1} is invertible. Then

B w = S^{-1} A S w = S^{-1} A S S^{-1} v = S^{-1} A v = S^{-1} λ v = λ S^{-1} v = λ w

which shows that λ is an eigenvalue of B.
Example of similarity

Example: We will see later that the eigenvalues of the matrix A = \begin{bmatrix} 6 & 3 & -9 \\ 0 & 9 & 15 \\ 0 & 0 & 15 \end{bmatrix} are its diagonal elements (6, 9, and 15) because A is upper triangular. The matrix B = \begin{bmatrix} 92 & -32 & -15 \\ -64 & 34 & 39 \\ 176 & -68 & -99 \end{bmatrix} has the property that B = S^{-1} A S, where S = \begin{bmatrix} -2 & 1 & 4 \\ 1 & -2 & 1 \\ -4 & 3 & 5 \end{bmatrix}. Therefore the eigenvalues of B are also 6, 9, and 15.
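The construction can be reproduced numerically. A sketch (B is recomputed from A and S here rather than typed in from the slide, which keeps the check self-contained):

# Sketch: build a matrix similar to A and confirm it has the same eigenvalues.
import numpy as np

A = np.array([[6, 3, -9], [0, 9, 15], [0, 0, 15]], dtype=float)
S = np.array([[-2, 1, 4], [1, -2, 1], [-4, 3, 5]], dtype=float)   # invertible (det = -3)

B = np.linalg.inv(S) @ A @ S
print(np.sort(np.linalg.eigvals(A)))   # approximately [ 6.  9. 15.]
print(np.sort(np.linalg.eigvals(B)))   # approximately [ 6.  9. 15.] -- the same eigenvalues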
Diagonalizability

Definition: If A is similar to a diagonal matrix, i.e. if there is an invertible matrix S such that S^{-1} A S = Λ where Λ is a diagonal matrix, we say A is diagonalizable.

The equation S^{-1} A S = Λ is equivalent to the equation A = S Λ S^{-1}, which is the form used in the analysis of the rabbit population.

How is diagonalizability related to eigenvalues?

◮ The eigenvalues of a diagonal matrix \Lambda = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} are its diagonal entries.

◮ If a matrix A is similar to Λ, then the eigenvalues of A are the eigenvalues of Λ.

◮ The equation S^{-1} A S = Λ is equivalent to A S = S Λ. Write S in terms of its columns:

A \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}

i.e. \begin{bmatrix} A v_1 & \cdots & A v_n \end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \cdots & \lambda_n v_n \end{bmatrix}, so A v_1 = \lambda_1 v_1, ..., A v_n = \lambda_n v_n.

The columns v_1, ..., v_n of S are eigenvectors. Because S is invertible, the eigenvectors are linearly independent.

◮ The argument goes both ways: if an n × n matrix A has n linearly independent eigenvectors, then A is diagonalizable.
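In code, diagonalizing amounts to stacking eigenvectors as the columns of S and checking that S is invertible. A numpy sketch (not from the slides):

# Sketch: diagonalize the rabbit matrix and check both directions of the argument.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0]])
eigenvalues, S = np.linalg.eig(A)    # columns of S are eigenvectors of A

# n linearly independent eigenvectors <=> S is invertible <=> A is diagonalizable
print(np.linalg.matrix_rank(S) == A.shape[0])    # True for this A
print(np.round(np.linalg.inv(S) @ A @ S, 10))    # diagonal, with the eigenvalues on the diagonal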