Math 221: Linear Algebra
§2-2. Equations, Matrices, and Transformations

Le Chen¹
Emory University, 2020 Fall (last updated on 09/02/2020)
Creative Commons License (CC BY-NC-SA)

¹ Slides are adapted from those by Karen Seyffarth from the University of Calgary.
Vectors

Definitions
A row matrix or column matrix is often called a vector, and such matrices are referred to as row vectors and column vectors, respectively. If $\vec{x}$ is a row vector of size $1 \times n$, and $\vec{y}$ is a column vector of size $m \times 1$, then we write
$$\vec{x} = \begin{bmatrix} x_1 & x_2 & \cdots & x_n \end{bmatrix}
\quad\text{and}\quad
\vec{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix}.$$
Vector form of a system of linear equations

Definition
Consider the system of linear equations
$$\begin{array}{ccccccccc}
a_{11}x_1 &+& a_{12}x_2 &+& \cdots &+& a_{1n}x_n &=& b_1 \\
a_{21}x_1 &+& a_{22}x_2 &+& \cdots &+& a_{2n}x_n &=& b_2 \\
\vdots & & \vdots & & & & \vdots & & \vdots \\
a_{m1}x_1 &+& a_{m2}x_2 &+& \cdots &+& a_{mn}x_n &=& b_m
\end{array}$$
Such a system can be expressed in vector form, or as a vector equation, by using linear combinations of column vectors:
$$x_1 \begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}
+ x_2 \begin{bmatrix} a_{12} \\ a_{22} \\ \vdots \\ a_{m2} \end{bmatrix}
+ \cdots
+ x_n \begin{bmatrix} a_{1n} \\ a_{2n} \\ \vdots \\ a_{mn} \end{bmatrix}
= \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}$$
Vector form of a system of linear equations

Problem
Express the following system of linear equations in vector form:
$$\begin{array}{rcrcrcr}
2x_1 &+& 4x_2 &-& 3x_3 &=& -6 \\
 & & -x_2 &+& 5x_3 &=& 0 \\
x_1 &+& x_2 &+& 4x_3 &=& 1
\end{array}$$

Solution
$$x_1 \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix}
+ x_2 \begin{bmatrix} 4 \\ -1 \\ 1 \end{bmatrix}
+ x_3 \begin{bmatrix} -3 \\ 5 \\ 4 \end{bmatrix}
= \begin{bmatrix} -6 \\ 0 \\ 1 \end{bmatrix}$$
Matrix-vector multiplication

Definition
Let $A = [a_{ij}]$ be an $m \times n$ matrix with columns $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n$, written
$A = \begin{bmatrix} \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n \end{bmatrix}$,
and let $\vec{x}$ be an $n \times 1$ column vector,
$$\vec{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$$
Then the product of the matrix $A$ and the (column) vector $\vec{x}$ is the $m \times 1$ column vector given by
$$A\vec{x} = \begin{bmatrix} \vec{a}_1 & \vec{a}_2 & \cdots & \vec{a}_n \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}
= x_1 \vec{a}_1 + x_2 \vec{a}_2 + \cdots + x_n \vec{a}_n
= \sum_{j=1}^{n} x_j \vec{a}_j,$$
that is, $A\vec{x}$ is a linear combination of the columns of $A$.
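A minimal computational sketch of this definition, assuming NumPy is available: it forms $A\vec{x}$ as the linear combination $x_1\vec{a}_1 + \cdots + x_n\vec{a}_n$ of the columns of $A$ and checks the result against NumPy's built-in matrix-vector product.

```python
import numpy as np

def matvec_by_columns(A, x):
    """Compute A @ x as the linear combination x_1*a_1 + ... + x_n*a_n
    of the columns a_j of A."""
    m, n = A.shape
    result = np.zeros(m)
    for j in range(n):
        result += x[j] * A[:, j]   # add x_j times the j-th column of A
    return result

# Compare against NumPy's own product on a small random example.
rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 4)).astype(float)
x = rng.integers(-3, 4, size=4).astype(float)
assert np.allclose(matvec_by_columns(A, x), A @ x)
```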
Problem
Compute the product $A\vec{x}$ for
$$A = \begin{bmatrix} 1 & 4 \\ 5 & 0 \end{bmatrix}
\quad\text{and}\quad
\vec{x} = \begin{bmatrix} 2 \\ 3 \end{bmatrix}$$

Solution
$$A\vec{x} = \begin{bmatrix} 1 & 4 \\ 5 & 0 \end{bmatrix}\begin{bmatrix} 2 \\ 3 \end{bmatrix}
= 2\begin{bmatrix} 1 \\ 5 \end{bmatrix} + 3\begin{bmatrix} 4 \\ 0 \end{bmatrix}
= \begin{bmatrix} 2 \\ 10 \end{bmatrix} + \begin{bmatrix} 12 \\ 0 \end{bmatrix}
= \begin{bmatrix} 14 \\ 10 \end{bmatrix}$$
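A quick numerical check of this computation, as a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1, 4],
              [5, 0]])
x = np.array([2, 3])

print(A @ x)                       # [14 10]
print(2 * A[:, 0] + 3 * A[:, 1])   # [14 10], as a combination of the columns
```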
Problem
Compute $A\vec{y}$ for
$$A = \begin{bmatrix} 1 & 0 & 2 & -1 \\ 2 & -1 & 0 & 1 \\ 3 & 1 & 3 & 1 \end{bmatrix}
\quad\text{and}\quad
\vec{y} = \begin{bmatrix} 2 \\ -1 \\ 1 \\ 4 \end{bmatrix}$$

Solution
$$A\vec{y} = 2\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
+ (-1)\begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix}
+ 1\begin{bmatrix} 2 \\ 0 \\ 3 \end{bmatrix}
+ 4\begin{bmatrix} -1 \\ 1 \\ 1 \end{bmatrix}
= \begin{bmatrix} 0 \\ 9 \\ 12 \end{bmatrix}$$
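The same kind of check for this example, again as a NumPy sketch:

```python
import numpy as np

A = np.array([[1, 0, 2, -1],
              [2, -1, 0, 1],
              [3, 1, 3, 1]])
y = np.array([2, -1, 1, 4])

print(A @ y)   # [ 0  9 12]
```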
Matrix form of a system of linear equations

Definition
Consider the system of linear equations
$$\begin{array}{ccccccccc}
a_{11}x_1 &+& a_{12}x_2 &+& \cdots &+& a_{1n}x_n &=& b_1 \\
a_{21}x_1 &+& a_{22}x_2 &+& \cdots &+& a_{2n}x_n &=& b_2 \\
\vdots & & \vdots & & & & \vdots & & \vdots \\
a_{m1}x_1 &+& a_{m2}x_2 &+& \cdots &+& a_{mn}x_n &=& b_m
\end{array}$$
Such a system can be expressed in matrix form using matrix-vector multiplication:
$$\begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}
= \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}$$
Thus a system of linear equations can be expressed as a matrix equation $A\vec{x} = \vec{b}$, where $A$ is the coefficient matrix, $\vec{b}$ is the constant matrix, and $\vec{x}$ is the matrix of variables.
Problem
Express the following system of linear equations in matrix form:
$$\begin{array}{rcrcrcr}
2x_1 &+& 4x_2 &-& 3x_3 &=& -6 \\
 & & -x_2 &+& 5x_3 &=& 0 \\
x_1 &+& x_2 &+& 4x_3 &=& 1
\end{array}$$

Solution
$$\begin{bmatrix} 2 & 4 & -3 \\ 0 & -1 & 5 \\ 1 & 1 & 4 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
= \begin{bmatrix} -6 \\ 0 \\ 1 \end{bmatrix}$$
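Once a system is in matrix form, it can be handed to a numerical solver. The sketch below (assuming NumPy) solves $A\vec{x} = \vec{b}$ for this $3 \times 3$ system and confirms that the computed $\vec{x}$ satisfies the equation.

```python
import numpy as np

A = np.array([[2.0, 4.0, -3.0],
              [0.0, -1.0, 5.0],
              [1.0, 1.0, 4.0]])
b = np.array([-6.0, 0.0, 1.0])

x = np.linalg.solve(A, b)      # solve the matrix equation A x = b
print(x)                       # approximately [-71., 40., 8.]
assert np.allclose(A @ x, b)   # the computed x satisfies the system
```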
Theorem
1. Every system of $m$ linear equations in $n$ variables can be written in the form $A\vec{x} = \vec{b}$, where $A$ is the coefficient matrix, $\vec{x}$ is the matrix of variables, and $\vec{b}$ is the constant matrix.
Theorem (continued)
2. The system $A\vec{x} = \vec{b}$ is consistent (i.e., has at least one solution) if and only if $\vec{b}$ is a linear combination of the columns of $A$.
Theorem (continued)
3. The vector $\vec{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$ is a solution to the system $A\vec{x} = \vec{b}$ if and only if $x_1, x_2, \ldots, x_n$ are a solution to the vector equation
$$x_1 \vec{a}_1 + x_2 \vec{a}_2 + \cdots + x_n \vec{a}_n = \vec{b},$$
where $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_n$ are the columns of $A$.
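Part 2 of the theorem can be tested numerically via the rank criterion, which is equivalent to it: $A\vec{x} = \vec{b}$ is consistent exactly when $\operatorname{rank}(A) = \operatorname{rank}\!\left(\begin{bmatrix} A & \vec{b} \end{bmatrix}\right)$, i.e. appending $\vec{b}$ as a column does not increase the rank. A sketch assuming NumPy, using the matrix and vector from the next problem:

```python
import numpy as np

def is_consistent(A, b):
    """A x = b has a solution iff b is a linear combination of the columns
    of A, i.e. iff appending b as a column does not increase the rank."""
    augmented = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

A = np.array([[1.0, 0.0, 2.0, -1.0],
              [2.0, -1.0, 0.0, 1.0],
              [3.0, 1.0, 3.0, 1.0]])
b = np.array([1.0, 1.0, 1.0])

print(is_consistent(A, b))   # True: b is a linear combination of the columns of A
```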
Problem
Let
$$A = \begin{bmatrix} 1 & 0 & 2 & -1 \\ 2 & -1 & 0 & 1 \\ 3 & 1 & 3 & 1 \end{bmatrix}
\quad\text{and}\quad
\vec{b} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$$
Express $\vec{b}$ as a linear combination of the columns $\vec{a}_1, \vec{a}_2, \vec{a}_3, \vec{a}_4$ of $A$, or show that this is impossible.

Solution
Solve the system $A\vec{x} = \vec{b}$, where $\vec{x}$ is a column vector with four entries. Do so by putting the augmented matrix $\begin{bmatrix} A & \vec{b} \end{bmatrix}$ in reduced row-echelon form:
$$\left[\begin{array}{rrrr|r}
1 & 0 & 2 & -1 & 1 \\
2 & -1 & 0 & 1 & 1 \\
3 & 1 & 3 & 1 & 1
\end{array}\right]
\rightarrow \cdots \rightarrow
\left[\begin{array}{rrrr|r}
1 & 0 & 0 & 1 & 1/7 \\
0 & 1 & 0 & 1 & -5/7 \\
0 & 0 & 1 & -1 & 3/7
\end{array}\right]$$
Since there are infinitely many solutions ($x_4$ is assigned a parameter), choose any value for $x_4$. For instance, taking $x_4 = 0$ gives $x_1 = \tfrac{1}{7}$, $x_2 = -\tfrac{5}{7}$, $x_3 = \tfrac{3}{7}$, so
$$\vec{b} = \tfrac{1}{7}\vec{a}_1 - \tfrac{5}{7}\vec{a}_2 + \tfrac{3}{7}\vec{a}_3.$$
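The row reduction and the resulting linear combination can be reproduced in exact arithmetic with SymPy; the following is a sketch assuming SymPy is available.

```python
from sympy import Matrix, Rational

A = Matrix([[1, 0, 2, -1],
            [2, -1, 0, 1],
            [3, 1, 3, 1]])
b = Matrix([1, 1, 1])

# Reduced row-echelon form of the augmented matrix [A | b]
R, pivots = A.row_join(b).rref()
print(R)        # rows (1, 0, 0, 1, 1/7), (0, 1, 0, 1, -5/7), (0, 0, 1, -1, 3/7)
print(pivots)   # (0, 1, 2): x4 is a free variable

# Choosing x4 = 0 gives one particular solution, i.e. one linear combination.
x = Matrix([Rational(1, 7), Rational(-5, 7), Rational(3, 7), 0])
assert A * x == b   # b = (1/7) a1 - (5/7) a2 + (3/7) a3
```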