Try it yourself! Find a basis for the orthogonal complement in R^3 of the subspace spanned by the vector (1, 2, 3). This is the same as finding the kernel of the matrix [1 2 3]. It's already row reduced. The kernel is spanned by (−2, 1, 0) and (−3, 0, 1).
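The answer can be sanity-checked in a few lines of Python: a vector lies in the kernel of the 1×3 matrix [1 2 3] exactly when its dot product with (1, 2, 3) is zero. (The `dot` helper below is our own, not from any library.)

```python
# Check that each proposed kernel basis vector is orthogonal to (1, 2, 3).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

row = (1, 2, 3)
basis = [(-2, 1, 0), (-3, 0, 1)]

for b in basis:
    assert dot(row, b) == 0  # orthogonal, so b is in the kernel of [1 2 3]
```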
Orthogonal complements

(For notation, recall V was a subspace of R^n.) V⊥ is the kernel of a matrix with dim V linearly independent rows and n columns, so dim V⊥ = n − dim V, i.e.

dim V + dim V⊥ = dim R^n.

Recall from before, V ∩ V⊥ = {0}.
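The dimension formula can be checked numerically for the example above. As an illustrative sketch, the `rank` helper below (our own, written with exact rational arithmetic to avoid rounding) computes the dimension of a row span by Gaussian elimination; dim V⊥ is then the number of kernel basis vectors found earlier.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination over Q."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        p = m[r][col]
        m[r] = [x / p for x in m[r]]          # normalize the pivot row
        for i in range(len(m)):
            if i != r and m[i][col] != 0:     # clear the column elsewhere
                m[i] = [a - m[i][col] * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

n = 3
dim_V = rank([(1, 2, 3)])                    # rows spanning V
dim_V_perp = rank([(-2, 1, 0), (-3, 0, 1)])  # the kernel basis found earlier
assert dim_V + dim_V_perp == n               # dim V + dim V_perp = dim R^n
```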
Orthogonal complements

V⊥ = "everything orthogonal to everything in V"

(V⊥)⊥ = "everything orthogonal to everything orthogonal to everything in V"

Everything in V is orthogonal to V⊥, so V ⊂ (V⊥)⊥. But from the previous slide, we have:

dim V + dim V⊥ = n

and similarly dim V⊥ + dim (V⊥)⊥ = n.

So we learn dim V = dim (V⊥)⊥, hence V = (V⊥)⊥.
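The identity V = (V⊥)⊥ can be illustrated with the earlier example in R^3 (the `matvec` helper below is our own): (V⊥)⊥ is the kernel of the matrix whose rows are the basis of V⊥, and the original spanning vector (1, 2, 3) lies in that kernel.

```python
# V = span{(1, 2, 3)} in R^3; V_perp = span{(-2, 1, 0), (-3, 0, 1)}.
# (V_perp)_perp is the kernel of the matrix B with those rows.

def matvec(rows, v):
    return [sum(a * b for a, b in zip(row, v)) for row in rows]

B = [(-2, 1, 0), (-3, 0, 1)]   # rows: a basis of V_perp
v = (1, 2, 3)                  # spans V

assert matvec(B, v) == [0, 0]  # v is in the kernel, i.e. in (V_perp)_perp

# Dimension count: dim (V_perp)_perp = 3 - dim V_perp = 3 - 2 = 1 = dim V,
# so the containment V ⊂ (V_perp)_perp is in fact an equality.
```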
Orthogonal complements

In words, V = (V⊥)⊥ is telling you that: everything orthogonal to everything orthogonal to everything in V is already in V.
Orthogonal complements

Theorem. Let V ⊂ R^n be a linear subspace. Any vector in R^n has a unique expression as a sum of a vector in V and a vector in V⊥.

In fact, something more general is true:

Theorem. Given any two subspaces V, W such that dim V + dim W = n and V ∩ W = {0}, any vector in R^n can be written uniquely as a sum of a vector in V and a vector in W.
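A small sketch of the more general theorem, with our own illustrative choice of subspaces: in R^2 take V = span{(1, 0)} and W = span{(1, 1)}. Then dim V + dim W = 2 and V ∩ W = {0}, so every vector splits uniquely, even though W is not the orthogonal complement of V.

```python
# Split (x, y) as a*(1, 0) + b*(1, 1), i.e. a vector in V plus one in W.

def decompose(x, y):
    # The second coordinate forces b = y; the first then forces a = x - y.
    b = y
    a = x - b
    return (a, 0), (b, b)  # (component in V, component in W)

v_part, w_part = decompose(5, 2)
assert v_part == (3, 0) and w_part == (2, 2)            # the unique split
assert tuple(p + q for p, q in zip(v_part, w_part)) == (5, 2)
```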
Example. Consider the subspace V ⊂ R^4 spanned by e_1, e_2. Its orthogonal complement V⊥ is spanned by e_3, e_4. Any vector (w, x, y, z) can be written as (w, x, 0, 0) + (0, 0, y, z).
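In this example the unique decomposition is just a coordinate split, which a few lines of Python make concrete (the `split` helper is our own):

```python
# V = span{e1, e2}, V_perp = span{e3, e4} in R^4: the decomposition of
# (w, x, y, z) keeps the first two coordinates in V, the last two in V_perp.

def split(vec):
    w, x, y, z = vec
    return (w, x, 0, 0), (0, 0, y, z)  # (part in V, part in V_perp)

v_part, p_part = split((1, 2, 3, 4))
assert v_part == (1, 2, 0, 0)
assert p_part == (0, 0, 3, 4)
assert sum(a * b for a, b in zip(v_part, p_part)) == 0  # parts are orthogonal
```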
Proof.

Take bases v_1, . . . , v_v of V and w_1, . . . , w_w of W. Suppose ∑ a_i v_i + ∑ b_j w_j = 0. Then ∑ a_i v_i = −∑ b_j w_j. One side is in V and the other is in W, so both are in V ∩ W, hence zero. {v_i} and {w_j} were bases, so the a_i and b_j must all be zero.
Thus {v_1, . . . , v_v, w_1, . . . , w_w} is linearly independent. This linearly independent set has n elements, so it's a basis for R^n. So any vector can be written as ∑ a_i v_i + ∑ b_j w_j. This is a sum of a vector in V and a vector in W.
Such an expression is unique: if v, v′ ∈ V and w, w′ ∈ W and v + w = v′ + w′, then v − v′ = w − w′ is in V and in W, hence is zero. So v = v′ and w = w′.
Orthogonal projection