Linear Algebra and Applications
MATRIX ALGEBRA
Basic definitions
• A scalar is a complex number (unless specified otherwise)
• n-tuples of complex numbers are denoted by $\mathbb{C}^n$
• n-tuples of real numbers are denoted by $\mathbb{R}^n$ ($\mathbb{R}^2$ is the space of points in the plane, etc.)
• $\mathbb{R}^{m \times n}$ and $\mathbb{C}^{m \times n}$ denote the sets of real and complex m × n matrices, respectively
• Equality A = B iff $[A]_{ij} = [B]_{ij}$ and the shapes agree! For example,
$$v = \begin{pmatrix} 1 \\ 2 \end{pmatrix} \neq u = \begin{pmatrix} 1 & 2 \end{pmatrix}$$
Addition
• Addition A + B: $[A + B]_{ij} = [A]_{ij} + [B]_{ij}$ (same shape)
• Matrix addition vs. scalar addition, e.g.
$$\begin{pmatrix} 2 & 1 - x \\ 3 & 4 + x \end{pmatrix} + \begin{pmatrix} -2 & 1 \\ -3 & 4 + y \end{pmatrix} = \begin{pmatrix} 0 & 2 - x \\ 0 & 8 + x + y \end{pmatrix}$$
• Additive inverse −A: $[-A]_{ij} = -[A]_{ij}$
• Matrix difference A − B: $[A - B]_{ij} = [A]_{ij} - [B]_{ij}$ (same shape!)
Addition properties
• Let A, B, and C be m × n matrices
• Closure: A + B is still an m × n matrix
• Associative: (A + B) + C = A + (B + C)
• Commutative: A + B = B + A
• Additive identity: $[0]_{ij} = 0 \Rightarrow A + 0 = A$
• Additive inverse: A + (−A) = 0
Scalar multiplication and properties
• Scalar multiplication αA: $[\alpha A]_{ij} = \alpha [A]_{ij}$
• Closure: αA is still an m × n matrix
• Associative: (αβ)A = α(βA)
• Distributive: α(A + B) = αA + αB
• Distributive: (α + β)A = αA + βA
• Identity: 1A = A, where 1 is a scalar!
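As an illustration (not part of the original slides), a minimal NumPy sketch checking the addition and scalar-multiplication properties on randomly chosen matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 4)) for _ in range(3))
alpha, beta = 2.0, -1.5

# Addition properties
assert np.allclose((A + B) + C, A + (B + C))      # associative
assert np.allclose(A + B, B + A)                  # commutative
assert np.allclose(A + np.zeros((3, 4)), A)       # additive identity
assert np.allclose(A + (-A), np.zeros((3, 4)))    # additive inverse

# Scalar multiplication properties
assert np.allclose((alpha * beta) * A, alpha * (beta * A))    # associative
assert np.allclose(alpha * (A + B), alpha * A + alpha * B)    # distributive over matrix addition
assert np.allclose((alpha + beta) * A, alpha * A + beta * A)  # distributive over scalar addition
assert np.allclose(1 * A, A)                                  # scalar identity
```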
Transpose of a matrix
• Transpose $A^T$: $[A^T]_{ij} = [A]_{ji}$; if A is m × n, then $A^T$ is n × m
$$\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}$$
• Conjugate matrix $\bar{A}$: $[\bar{A}]_{ij} = \overline{[A]_{ij}}$
• Conjugate transpose $A^* = \bar{A}^T = \overline{A^T}$, also called the adjoint
$$\begin{pmatrix} 1 - 4i & i & 2 \\ 3 & 2 + i & 0 \end{pmatrix}^* = \begin{pmatrix} 1 + 4i & 3 \\ -i & 2 - i \\ 2 & 0 \end{pmatrix}$$
Properties of transpose
• $(A^T)^T = A$ and $(A^*)^* = A$
• $(A + B)^T = A^T + B^T$ and $(A + B)^* = A^* + B^*$
• $(\alpha A)^T = \alpha A^T$ and $(\alpha A)^* = \bar{\alpha} A^*$, since
$$[(\alpha A)^*]_{ij} = \overline{[(\alpha A)^T]_{ij}} = \overline{[\alpha A]_{ji}} = \overline{\alpha [A]_{ji}} = \bar{\alpha}\,\overline{[A]_{ji}} = \bar{\alpha}\,[\bar{A}]_{ji} = \bar{\alpha}\,[\bar{A}^T]_{ij} = \bar{\alpha}\,[A^*]_{ij}$$
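A small NumPy check (illustrative, not from the slides) of these transpose and conjugate-transpose rules, using `.T`, `.conj()` and a complex scalar:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
B = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
alpha = 2.0 - 3.0j

adj = lambda M: M.conj().T   # conjugate transpose (adjoint): A* = (Ā)^T

assert np.allclose(A.T.T, A)                                 # (A^T)^T = A
assert np.allclose(adj(adj(A)), A)                           # (A*)* = A
assert np.allclose((A + B).T, A.T + B.T)                     # (A+B)^T = A^T + B^T
assert np.allclose(adj(A + B), adj(A) + adj(B))              # (A+B)* = A* + B*
assert np.allclose((alpha * A).T, alpha * A.T)               # (αA)^T = α A^T
assert np.allclose(adj(alpha * A), np.conj(alpha) * adj(A))  # (αA)* = ᾱ A*
```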
Symmetries
• Let $A = [a_{ij}]$ be a matrix
• A is a symmetric matrix if $A = A^T$, i.e. $a_{ij} = a_{ji}$
• A is a skew-symmetric matrix if $A = -A^T$, i.e. $a_{ij} = -a_{ji}$
• A is a Hermitian matrix if $A = A^*$, i.e. $a_{ij} = \overline{a_{ji}}$ (the complex analogue of symmetry)
• A is a skew-Hermitian matrix if $A = -A^*$, i.e. $a_{ij} = -\overline{a_{ji}}$
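A sketch of helper predicates for the four kinds of symmetry (the function names and tolerances are my own, not from the slides):

```python
import numpy as np

def is_symmetric(A, tol=1e-12):
    return A.shape[0] == A.shape[1] and np.allclose(A, A.T, atol=tol)

def is_skew_symmetric(A, tol=1e-12):
    return A.shape[0] == A.shape[1] and np.allclose(A, -A.T, atol=tol)

def is_hermitian(A, tol=1e-12):
    return A.shape[0] == A.shape[1] and np.allclose(A, A.conj().T, atol=tol)

def is_skew_hermitian(A, tol=1e-12):
    return A.shape[0] == A.shape[1] and np.allclose(A, -A.conj().T, atol=tol)

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
print(is_hermitian(H), is_symmetric(H.real))   # True True
```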
Example of symmetry in action
• Hooke's law: the force exerted by a spring is proportional to the displacement from the rest position, f = −kx
[Figure: nodes 1, 2, 3 connected in a chain by springs k1 (between nodes 1 and 2) and k2 (between nodes 2 and 3), with displacements x1, x2, x3 and end forces F1, −F1, −F3, F3]
The stiffness matrix
• For the two-spring chain above, the end forces are
$$F_1 = k_1 (x_1 - x_2), \qquad F_3 = k_2 (x_3 - x_2), \qquad F_2 = -F_1 - F_3$$
• Written out as a system in the displacements,
$$\begin{aligned} k_1 x_1 - k_1 x_2 &= F_1 \\ -k_1 x_1 + (k_1 + k_2) x_2 - k_2 x_3 &= F_2 \\ -k_2 x_2 + k_2 x_3 &= F_3 \end{aligned}$$
• The stiffness matrix collecting the coefficients is symmetric:
$$K = \begin{pmatrix} k_1 & -k_1 & 0 \\ -k_1 & k_1 + k_2 & -k_2 \\ 0 & -k_2 & k_2 \end{pmatrix} = K^T$$
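A minimal NumPy sketch (my own, with assumed spring constants and displacements) that assembles K for the two-spring chain and evaluates the forces F = Kx; note K comes out symmetric:

```python
import numpy as np

k1, k2 = 3.0, 5.0   # assumed spring constants

# Stiffness matrix of the two-spring chain (nodes 1-2-3)
K = np.array([[ k1,     -k1,      0.0],
              [-k1,  k1 + k2,     -k2],
              [ 0.0,    -k2,       k2]])

assert np.allclose(K, K.T)          # K is symmetric

x = np.array([0.1, 0.0, -0.2])      # assumed nodal displacements
F = K @ x                           # forces at the three nodes
print(F)                            # F1, F2, F3
print(F.sum())                      # the forces balance: sum ~ 0
```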
LINEARITY
Definition of linear function
• Let D and R be sets equipped with addition and multiplication by a scalar
• A function f from D to R is said to be linear iff
$$f(x + y) = f(x) + f(y) \qquad \text{and} \qquad f(\alpha x) = \alpha f(x)$$
• Equivalently, iff $f(\alpha x + y) = \alpha f(x) + f(y)$
More explicitly
• Lines or planes through the origin: $y = f(x) = \alpha x$ and $z = f(x, y) = \alpha x + \beta y$
• Not going through the origin? Then it is not a linear function but an affine function: $y = f(x) = \alpha x + \beta$ and $z = f(x, y) = \alpha x + \beta y + \gamma$ (see the sketch below)
• In higher dimensions, $f(x_1, x_2, \ldots, x_n) = \alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_n x_n$: harder to visualize, hyperplanes through the origin
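An illustrative numeric check (not from the slides): a line through the origin satisfies the linearity condition, while an affine line y = αx + β does not:

```python
import numpy as np

alpha, beta = 2.0, 1.0
f = lambda x: alpha * x            # linear: line through the origin
g = lambda x: alpha * x + beta     # affine: shifted line

x, y, c = 3.0, -1.0, 4.0
print(np.isclose(f(c * x + y), c * f(x) + f(y)))   # True  -> f is linear
print(np.isclose(g(c * x + y), c * g(x) + g(y)))   # False -> g fails the test
```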
Linearity in other contexts
• Derivative operator: $\frac{d}{dx}(\alpha f + g) = \alpha \frac{df}{dx} + \frac{dg}{dx}$
• Integral operator: $\int (\alpha f + g)\, dx = \alpha \int f\, dx + \int g\, dx$
• Matrix transpose operator: $(\alpha A + B)^T = \alpha A^T + B^T$
Linear systems and linear functions
• Think of the linear system
$$\begin{aligned} a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= u_1 \\ a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= u_2 \\ &\;\;\vdots \\ a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= u_m \end{aligned}$$
• as an equation f(x) = u from $\mathbb{R}^n$ to $\mathbb{R}^m$, where
$$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \qquad u = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{pmatrix}, \qquad \text{and} \qquad [f(x)]_i = \sum_{j=1}^{n} a_{ij} [x]_j = [u]_i$$
Linear systems and linear functions
• Is f linear, where $[f(x)]_i = \sum_{j=1}^{n} a_{ij}[x]_j = [u]_i$?
$$[f(\alpha x + y)]_i = \sum_{j=1}^{n} a_{ij} [\alpha x + y]_j = \sum_{j=1}^{n} a_{ij} (\alpha [x]_j + [y]_j) = \alpha \sum_{j=1}^{n} a_{ij} [x]_j + \sum_{j=1}^{n} a_{ij} [y]_j = [\alpha f(x) + f(y)]_i$$
• It is linear
• Solving a linear system is the inverse of applying a linear function: what is the x such that f(x) = u?
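A short NumPy sketch (illustrative, with a made-up A) of the two directions: applying the linear function x ↦ Ax, and inverting it by solving Ax = u with np.linalg.solve:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

f = lambda x: A @ x        # the linear function induced by A

x_true = np.array([1.0, -2.0])
u = f(x_true)              # forward direction: apply the linear function

x = np.linalg.solve(A, u)  # inverse problem: which x gives f(x) = u?
print(x)                   # recovers [1, -2]
```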
Linear functions ($\mathbb{R}^n$ to $\mathbb{R}^m$) and matrices
• Let
$$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} \qquad \text{and} \qquad e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \; e_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \; \ldots, \; e_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}$$
• Then $x = \sum_{j=1}^{n} x_j e_j$ and $f(x) = f\!\left(\sum_{j=1}^{n} x_j e_j\right) = \sum_{j=1}^{n} x_j f(e_j)$
• Now write $f(e_j) = \begin{pmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{mj} \end{pmatrix}$ and organize all of them in a matrix:
$$A = \begin{pmatrix} f(e_1) & f(e_2) & \cdots & f(e_n) \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$
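A sketch (my own construction) that recovers the matrix of a linear function from $\mathbb{R}^3$ to $\mathbb{R}^2$ by applying it to the standard basis vectors $e_j$ and stacking the results as columns:

```python
import numpy as np

def matrix_of(f, n):
    """Column j of the matrix is f(e_j), for the standard basis e_1..e_n."""
    cols = [f(np.eye(n)[:, j]) for j in range(n)]
    return np.column_stack(cols)

# Some linear function from R^3 to R^2 (here defined via a known matrix)
A_true = np.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
f = lambda x: A_true @ x

A = matrix_of(f, 3)
print(np.allclose(A, A_true))   # True: the columns are exactly f(e_j)
```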
MATRIX MULTIPLICATION
What about matrix multiplication?
• An m × n matrix is a neat way to specify a linear function from $\mathbb{R}^n$ to $\mathbb{R}^m$
• Our definitions of addition and multiplication by scalar work well with this interpretation: $(f + g)(x) = f(x) + g(x)$ and $(\alpha f)(x) = \alpha f(x)$
• How would you define matrix multiplication?
• The point-wise definition $(fg)(x) = f(x)\,g(x)$ is not very useful: fg is not even linear anymore!
• Matrix element-wise multiplication is not good either.
“Multiplication” as composition
• Proposed by Arthur Cayley in 1855. Very recent!
• Linear functions are closed under composition:
$$(g \circ f)(\alpha x + y) = g\big(f(\alpha x + y)\big) = g\big(\alpha f(x) + f(y)\big) = \alpha\, g\big(f(x)\big) + g\big(f(y)\big) = \alpha (g \circ f)(x) + (g \circ f)(y)$$
• And composition is associative
• This is promising
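A quick numerical illustration (not in the original slides): composing two linear maps gives another linear map, and composition, realized here through matrix products, is associative:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))   # g: R^3 -> R^2
B = rng.standard_normal((3, 4))   # f: R^4 -> R^3
C = rng.standard_normal((4, 5))   # h: R^5 -> R^4

g = lambda y: A @ y
f = lambda x: B @ x
gf = lambda x: g(f(x))            # the composition g ∘ f

x, y, alpha = rng.standard_normal(4), rng.standard_normal(4), 1.7

# g ∘ f is still linear
assert np.allclose(gf(alpha * x + y), alpha * gf(x) + gf(y))

# composition of the underlying matrices is associative
assert np.allclose((A @ B) @ C, A @ (B @ C))
```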
Potential issues
• Function composition is not commutative: $g \circ f \neq f \circ g$, so matrix multiplication will not be commutative either
• Not all function compositions make sense: for $g \circ f$ we need $f : A \to B$ and $g : B \to C$
• Matrices must be compatible: an m × n matrix is a function from $\mathbb{R}^n$ to $\mathbb{R}^m$, so the product between m × n and n × p matrices is OK
• But what is the resulting matrix?
Matrix multiplication
• Let g correspond to $A = [a_{ij}]_{m \times n}$ and f to $B = [b_{jk}]_{n \times p}$, so that
$$[g(y)]_i = \sum_{j=1}^{n} a_{ij} [y]_j \qquad \text{and} \qquad [f(x)]_j = \sum_{k=1}^{p} b_{jk} [x]_k$$
• Then g ∘ f corresponds to $C = [c_{ik}]_{m \times p}$ with $c_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}$, since
$$[(g \circ f)(x)]_i = \sum_{j=1}^{n} a_{ij} [f(x)]_j = \sum_{j=1}^{n} a_{ij} \sum_{k=1}^{p} b_{jk} [x]_k = \sum_{k=1}^{p} \left( \sum_{j=1}^{n} a_{ij} b_{jk} \right) [x]_k = \sum_{k=1}^{p} c_{ik} [x]_k$$
Step by step multiplication
• $A = [a_{ij}]_{m \times n}$, $B = [b_{jk}]_{n \times p}$, $C = AB = [c_{ij}]_{m \times p}$
$$c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$$
• Entry $c_{ij}$ combines the n entries $a_{ik}$ of row i of A with the n entries $b_{kj}$ of column j of B
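A minimal triple-loop implementation of this entrywise formula (for clarity only; in practice one would just call A @ B), checked against NumPy and against composition:

```python
import numpy as np

def matmul_naive(A, B):
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            for k in range(n):          # c_ij = sum_k a_ik * b_kj
                C[i, j] += A[i, k] * B[k, j]
    return C

rng = np.random.default_rng(3)
A, B = rng.standard_normal((2, 3)), rng.standard_normal((3, 4))
x = rng.standard_normal(4)

C = matmul_naive(A, B)
assert np.allclose(C, A @ B)            # matches the built-in product
assert np.allclose(C @ x, A @ (B @ x))  # (AB)x = A(Bx): product = composition
```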
Example #1
• Multiplication of a row and a column:
$$r^T = \begin{pmatrix} r_1 & r_2 & \cdots & r_n \end{pmatrix}, \qquad c = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}, \qquad r^T c = r_1 c_1 + r_2 c_2 + \cdots + r_n c_n = \sum_{i=1}^{n} r_i c_i$$
• The result is a scalar value
• Also known as the scalar product of r and c
• Also called the standard inner product, or dot product
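A small sketch of the row-times-column (inner) product in NumPy; np.dot, the @ operator on 1-D arrays, and the explicit sum all agree:

```python
import numpy as np

r = np.array([1.0, 2.0, 3.0])
c = np.array([4.0, -1.0, 0.5])

s1 = np.dot(r, c)                              # inner (dot) product
s2 = r @ c                                     # same thing with the @ operator
s3 = sum(r[i] * c[i] for i in range(len(r)))   # the explicit sum

print(s1, s2, s3)                              # all equal 3.5
```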
More insight into multiplication
• $A = [a_{ij}]_{m \times n}$, $B = [b_{jk}]_{n \times p}$, $C = AB = [c_{ik}]_{m \times p}$
$$c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj} = A_{i*} B_{*j}$$
• That is, each entry $c_{ij}$ is the inner product of row i of A ($A_{i*}$) with column j of B ($B_{*j}$)
Example #2
• Multiplication of a column and a row:
$$c = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_m \end{pmatrix}, \qquad r^T = \begin{pmatrix} r_1 & r_2 & \cdots & r_n \end{pmatrix}, \qquad c\, r^T = \begin{pmatrix} c_1 r_1 & c_1 r_2 & \cdots & c_1 r_n \\ c_2 r_1 & c_2 r_2 & \cdots & c_2 r_n \\ \vdots & \vdots & & \vdots \\ c_m r_1 & c_m r_2 & \cdots & c_m r_n \end{pmatrix}$$
• The result is an m × n matrix
• Also known as the outer product of c and r
• What is the rank of the result?
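An illustrative NumPy check of the column-times-row (outer) product; for nonzero vectors the result has rank 1, which answers the question above:

```python
import numpy as np

c = np.array([1.0, 2.0, 3.0])     # column vector (m = 3)
r = np.array([4.0, 5.0])          # row vector    (n = 2)

M = np.outer(c, r)                # 3 x 2 matrix with entries c_i * r_j
print(M)
print(np.linalg.matrix_rank(M))   # 1: every column is a multiple of c
```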
Example #3
• Two matrices:
$$F = \begin{pmatrix} \alpha & \beta \\ \gamma & \delta \end{pmatrix}, \qquad G = \begin{pmatrix} a & b & c \\ d & e & f \end{pmatrix}$$
$$H = FG = \begin{pmatrix} \alpha a + \beta d & \alpha b + \beta e & \alpha c + \beta f \\ \gamma a + \delta d & \gamma b + \delta e & \gamma c + \delta f \end{pmatrix}$$
How expensive is multiplication?
• Let $A = [a_{ij}]_{m \times n}$, $B = [b_{jk}]_{n \times p}$, $C = AB = [c_{ij}]_{m \times p}$ with $c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$
• n multiplications and (n − 1) additions per $c_{ij}$
• There are m × p entries $c_{ij}$
• Total cost: m × n × p multiplications and m × (n − 1) × p additions
• Or roughly $n^3$ operations of each kind for two square matrices of size n
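A small helper (my own) that counts the scalar operations of the naive algorithm for given shapes, matching the m·n·p multiplications and m·(n − 1)·p additions stated above:

```python
def matmul_cost(m, n, p):
    """Scalar operation counts for the naive m×n by n×p product."""
    mults = m * n * p
    adds = m * (n - 1) * p
    return mults, adds

print(matmul_cost(2, 3, 4))           # (24, 16)
print(matmul_cost(1000, 1000, 1000))  # ~1e9 multiplications, i.e. about n^3
```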