  1. A Review of Linear Algebra, Mohammad Emtiyaz Khan, CS, UBC. A Review of Linear Algebra – p.1/13

  2. Basics
Column vector x ∈ R^n, row vector x^T, matrix A ∈ R^{m×n}.
Matrix multiplication: (m×n)(n×k) ⇒ m×k; in general AB ≠ BA.
Transpose A^T, with (AB)^T = B^T A^T; A is symmetric if A = A^T.
Inverse A^{-1} does not always exist; (AB)^{-1} = B^{-1} A^{-1}.
x^T x is a scalar, x x^T is a matrix.
A x = b, three ways of expressing it:
1. sum_{j=1}^n a_{ij} x_j = b_i, for all i (entry by entry);
2. r_i^T x = b_i, for all i, where r_i is the i-th row;
3. x_1 a_1 + x_2 a_2 + ... + x_n a_n = b (a linear combination, l.c., of the columns a_j).
System of equations: non-singular (unique solution) or singular (no solution, or infinitely many solutions).
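The basic identities above are easy to check numerically. A quick NumPy sketch (the particular matrices here are illustrative, not from the slides):

```python
import numpy as np

# AB != BA in general: matrix multiplication is not commutative.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
print(np.allclose(A @ B, B @ A))          # False for this pair

# (AB)^T = B^T A^T always holds.
print(np.allclose((A @ B).T, B.T @ A.T))  # True

# x^T x is a scalar (1x1), x x^T is a rank-1 matrix (n x n).
x = np.array([[1.0], [2.0], [3.0]])       # column vector, shape (3, 1)
print((x.T @ x).shape)                    # (1, 1)
print((x @ x.T).shape)                    # (3, 3)
```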

  3. LU factorization
A x = b:

    [ 2  1  1] [x1]   [ 5]
    [ 4 -6  0] [x2] = [-2]
    [-2  7  2] [x3]   [ 9]

Elimination (E clears the (2,1) entry, F the (3,1) entry, G the (3,2) entry) gives U x = GFE b:

    [ 2  1  1] [x1]   [  5]
    [ 0 -8 -2] [x2] = [-12]
    [ 0  0  1] [x3]   [  2]

with elimination matrices

    E = [ 1 0 0]   F = [1 0 0]   G = [1 0 0]
        [-2 1 0]       [0 1 0]       [0 1 0]
        [ 0 0 1]       [1 0 1]       [0 1 1]

so A = LU with

    L = E^{-1} F^{-1} G^{-1} = [ 1  0  0]
                               [ 2  1  0]
                               [-1 -1  1]
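The elimination on this slide can be replayed step by step in NumPy; the E, F, G below are the elimination matrices reconstructed from the example:

```python
import numpy as np

A = np.array([[ 2.,  1., 1.],
              [ 4., -6., 0.],
              [-2.,  7., 2.]])
b = np.array([5., -2., 9.])

# Elimination matrices: E clears a21, F clears a31, G clears a32.
E = np.array([[1., 0., 0.], [-2., 1., 0.], [0., 0., 1.]])
F = np.array([[1., 0., 0.], [0., 1., 0.], [1., 0., 1.]])
G = np.array([[1., 0., 0.], [0., 1., 0.], [0., 1., 1.]])

U = G @ F @ E @ A              # upper triangular
L = np.linalg.inv(G @ F @ E)   # = E^{-1} F^{-1} G^{-1}, unit lower triangular
assert np.allclose(L @ U, A)

# Solve A x = b in two triangular steps: L c = b, then U x = c.
c = np.linalg.solve(L, b)
x = np.linalg.solve(U, c)
print(x)                       # [1. 1. 2.]
```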

  4. LU factorization
(First non-singular case) If no row exchanges are required, then A = LU (unique). Solve L c = b, then U x = c. Another form: A = LDU.
(Second non-singular case) There exists a permutation matrix P that reorders the rows so that PA = LU.
(Singular case) No such P exists.
(Cholesky decomposition) If A is symmetric, and A = LU can be found without any row exchanges (and with positive pivots), then A = LL^T (also called the square root of a matrix). (proof)
A positive definite matrix always has a Cholesky decomposition.
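Cholesky is available directly in NumPy; a minimal sketch on a small symmetric positive definite matrix (chosen here for illustration):

```python
import numpy as np

# A symmetric positive definite matrix factors as A = L L^T.
A = np.array([[4., 2.],
              [2., 3.]])
L = np.linalg.cholesky(A)           # lower triangular "square root" of A
assert np.allclose(L @ L.T, A)
print(L)
```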

  5. Vector Space, Subspace and Matrix
(Real vector space) A set of "vectors" with rules for vector addition and multiplication by real numbers. E.g. R^1, R^2, ..., R^∞, Hilbert space. (8 conditions) Includes a zero vector and the identity 1·x = x, closed under addition and scalar multiplication, etc.
(Subspace) A subset of a vector space, closed under addition and scalar multiplication (so it must contain zero).
Subspace "spanned" by a matrix (outline the concept):

    x1 [1]   x2 [0]   [b1]
       [5] +    [4] = [b2]
       [2]      [4]   [b3]

  6. Linear Independence, Basis, Dimension
(Linear independence, l.i.) If x_1 a_1 + x_2 a_2 + ... + x_n a_n = 0 only happens when x_1 = x_2 = ... = x_n = 0, the {a_k} are called linearly independent. A set of n vectors in R^m cannot be l.i. if n > m (proof).
(Span) If every vector v in V can be expressed as a l.c. of {a_k}, then the {a_k} are said to span V.
(Basis) {a_k} is called a basis of V if the vectors are l.i. and span V (not too many, not too few; every vector has a unique representation).
(Dimension) The number of vectors in any basis is called the dimension (and it is the same for every basis).
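Independence can be tested numerically via the rank of the matrix whose columns are the vectors; a small sketch with illustrative vectors:

```python
import numpy as np

# n vectors in R^m are linearly independent iff the m x n matrix they
# form has rank n. Three vectors in R^2 can never be independent (n > m).
a1, a2, a3 = np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])
M = np.column_stack([a1, a2, a3])        # 2 x 3
print(np.linalg.matrix_rank(M))          # 2 < 3, so not l.i.

# Two of them are l.i. and span R^2, hence form a basis; the dimension
# is the number of basis vectors.
B = np.column_stack([a1, a2])
print(np.linalg.matrix_rank(B))          # 2
```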

  7. Four Fundamental Spaces
Fundamental Theorem of Linear Algebra, Part I:
1. R(A) = column space of A; l.c. of the columns; dim r.
2. N(A) = nullspace of A; all x with A x = 0; dim n − r.
3. R(A^T) = row space of A; l.c. of the rows; dim r.
4. N(A^T) = left nullspace of A; all y with A^T y = 0; dim m − r.
(Rank) r is called the rank of the matrix. The inverse exists iff the rank is as large as possible.
Question: what is the rank of u v^T?
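The slide's question can be explored numerically (for nonzero u and v, every column of u v^T is a multiple of u, so the rank is 1); the vectors below are illustrative:

```python
import numpy as np

u = np.array([[1.], [2.], [3.]])
v = np.array([[4.], [5.]])
A = u @ v.T                       # 3 x 2 outer product
print(np.linalg.matrix_rank(A))   # 1
```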

  8. Orthogonality
(Norm) ||x||^2 = x^T x = x_1^2 + ... + x_n^2
(Inner product) x^T y = x_1 y_1 + ... + x_n y_n
(Orthogonal) x^T y = 0. Orthogonal (nonzero) vectors are l.i. (proof).
(Orthonormal basis) Orthogonal vectors with norm 1.
(Orthogonal subspaces) V ⊥ W if v ⊥ w for all v ∈ V, w ∈ W.
(Orthogonal complement) The space of all vectors orthogonal to V, denoted V^⊥.
The row space is orthogonal to the nullspace (in R^n), and the column space is orthogonal to the left nullspace (in R^m). (proof)

  9. Finally...
Fundamental Theorem of Linear Algebra, Part II:
1. R(A^T)^⊥ = N(A)
2. R(A)^⊥ = N(A^T)
Any vector can be expressed as

    x = x_1 b_1 + ... + x_r b_r + x_{r+1} b_{r+1} + ... + x_n b_n    (1)
        \------ row-space part x_r ------/ \-- nullspace part x_n --/
      = x_r + x_n                                                    (2)

Every matrix transforms its row space to its column space. (Comments about pseudo-inverse and invertibility)
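The split x = x_r + x_n can be computed with the pseudo-inverse, since A^+ A projects onto the row space; a sketch with an illustrative rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2., 2.],
              [2., 4., 4.]])        # rank 1: dim R(A^T) = 1, dim N(A) = 2
x = np.array([3., 1., 2.])

# A^+ A is the orthogonal projector onto the row space of A.
x_r = np.linalg.pinv(A) @ (A @ x)   # row-space component
x_n = x - x_r                       # nullspace component

assert np.allclose(A @ x_n, 0)      # x_n really is in N(A)
assert np.allclose(x_r + x_n, x)
assert np.isclose(x_r @ x_n, 0)     # the two parts are orthogonal
```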

  10. Gram-Schmidt Orthogonalization
(Projection) of b on a is (a^T b / a^T a) a; for a unit vector a, simply (a^T b) a.
(Schwarz inequality) |a^T b| ≤ ||a|| ||b||
(Orthogonal matrix) Q = [q_1 ... q_n], Q^T Q = I. (proof)
(Length preservation) ||Q x|| = ||x|| (proof)
Given vectors {a_k}, construct orthogonal vectors {q_k}:
1. q_1 = a_1 / ||a_1||
2. for each j: a'_j = a_j − (q_1^T a_j) q_1 − ... − (q_{j−1}^T a_j) q_{j−1}
3. q_j = a'_j / ||a'_j||
QR decomposition (example).
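The three steps above translate almost directly into code; a minimal classical Gram-Schmidt sketch (the input vectors are illustrative):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt following the slide's three steps."""
    qs = []
    for a in vectors:
        a_prime = a.astype(float).copy()
        for q in qs:                   # subtract projections onto earlier q's
            a_prime -= (q @ a) * q
        qs.append(a_prime / np.linalg.norm(a_prime))
    return np.column_stack(qs)

vecs = [np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
Q = gram_schmidt(vecs)
assert np.allclose(Q.T @ Q, np.eye(3))   # orthonormal columns: Q^T Q = I
```

In practice one uses `np.linalg.qr`, which is numerically more stable than the classical procedure sketched here.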

  11. Eigenvalues and Eigenvectors
(Invariance) A x = λ x.
(Characteristic equation) (A − λI) x = 0 has nonzero solutions when det(A − λI) = 0; the eigenvectors for λ span the nullspace of A − λI.
λ_1 + ... + λ_n = a_11 + ... + a_nn (the trace); λ_1 · ... · λ_n = det(A).
(A = S Λ S^{-1}) Suppose there exist n linearly independent eigenvectors of A. If S is the matrix whose columns are those eigenvectors, then A = S Λ S^{-1}, where Λ = diag(λ_1, ..., λ_n). Diagonalizability is concerned with the eigenvectors; invertibility is concerned with the eigenvalues.
(Real symmetric matrix) The eigenvectors are orthogonal, so A = Q Λ Q^T. (Spectral Theorem)
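All of these facts can be verified with NumPy's eigensolvers; the matrices below are illustrative:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])
lam, S = np.linalg.eig(A)            # eigenvalues, eigenvectors (as columns)

# Trace = sum of eigenvalues, determinant = product of eigenvalues.
assert np.isclose(lam.sum(), np.trace(A))
assert np.isclose(lam.prod(), np.linalg.det(A))

# Diagonalization A = S Lambda S^{-1} (eigenvectors independent here).
assert np.allclose(S @ np.diag(lam) @ np.linalg.inv(S), A)

# Real symmetric case: orthonormal eigenvectors, A = Q Lambda Q^T.
B = np.array([[2., 1.],
              [1., 2.]])
w, Q = np.linalg.eigh(B)
assert np.allclose(Q @ np.diag(w) @ Q.T, B)
```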

  12. Singular Value Decomposition
Any matrix can be factorized as A = U Σ V^T. Insightful? Finish.
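A short NumPy sketch of the factorization on a non-square (illustrative) matrix:

```python
import numpy as np

# Any matrix, square or not, factors as A = U Sigma V^T.
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)

# The number of nonzero singular values equals the rank.
assert np.sum(s > 1e-10) == np.linalg.matrix_rank(A)
```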

  13. Finish
Thanks to Maria (Marisol Flores Gorrido) for helping me with this tutorial.
