
Linear algebra and differential equations (Math 54): Lecture 16
Vivek Shende
March 14, 2019

Hello and welcome to class! Last time we discussed complex eigenvalues and …


Try it yourself! Find a basis for the orthogonal complement in ℝ³ of the subspace spanned by the vector (1, 2, 3). This is the same as finding the kernel of the matrix [1 2 3], which is already row reduced. The kernel is spanned by (−2, 1, 0) and (−3, 0, 1).
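If you want to check this on a computer, here is a minimal sketch, assuming NumPy and SciPy are available; it computes the orthogonal complement as the kernel (null space) of the 1×3 matrix [1 2 3]:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0]])   # row spans V = span{(1, 2, 3)}
N = null_space(A)                 # columns: an orthonormal basis of V⊥ = ker(A)

print(N.shape)                    # (3, 2): the complement is 2-dimensional
print(np.round(A @ N, 10))        # ~0: each column is orthogonal to (1, 2, 3)
```

Note that null_space returns an orthonormal basis, so its columns won't literally be (−2, 1, 0) and (−3, 0, 1), but they span the same plane.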

Orthogonal complements

(For notation, recall V was a subspace of ℝⁿ.) V⊥ is the kernel of a matrix with dim V linearly independent rows and n columns, so dim V⊥ = n − dim V, i.e. dim V + dim V⊥ = dim ℝⁿ. Recall from before, V ∩ V⊥ = {0}.
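A minimal numerical sketch of the dimension count, again assuming NumPy and SciPy, with a randomly chosen V (spanned by the rows of a hypothetical matrix A):

```python
import numpy as np
from scipy.linalg import null_space

n = 6
A = np.random.randn(3, n)             # rows span V (almost surely 3-dimensional)

dim_V = np.linalg.matrix_rank(A)
dim_V_perp = null_space(A).shape[1]   # V⊥ = ker(A)

print(dim_V, dim_V_perp, dim_V + dim_V_perp == n)   # expect 3 3 True
```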

Orthogonal complements

V⊥ = "everything orthogonal to everything in V"
(V⊥)⊥ = "everything orthogonal to everything orthogonal to everything in V"

Everything in V is orthogonal to V⊥, so V ⊂ (V⊥)⊥. But from the previous slide we have dim V + dim V⊥ = n, and similarly dim V⊥ + dim (V⊥)⊥ = n. So we learn dim V = dim (V⊥)⊥; since V sits inside (V⊥)⊥ and has the same dimension, V = (V⊥)⊥.
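Here is a sketch of the same fact done numerically (assuming NumPy/SciPy): take V to be the row space of a hypothetical matrix A, compute V⊥ as ker(A), take the complement of that, and check it spans the same subspace as V.

```python
import numpy as np
from scipy.linalg import null_space

A = np.random.randn(2, 5)             # rows span V inside ℝ⁵
N = null_space(A)                     # columns span V⊥
M = null_space(N.T)                   # columns span (V⊥)⊥

# Same subspace <=> stacking the two spanning sets does not increase the rank.
r = np.linalg.matrix_rank(A)
print(np.linalg.matrix_rank(np.vstack([A, M.T])) == r)   # expect True
```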

Orthogonal complements

In words, V = (V⊥)⊥ is telling you that: everything orthogonal to everything orthogonal to everything in V is already in V.

Orthogonal complements

Theorem. Let V ⊂ ℝⁿ be a linear subspace. Any vector in ℝⁿ has a unique expression as a sum of a vector in V and a vector in V⊥.

In fact, something more general is true:

Theorem. Given any two subspaces V, W such that dim V + dim W = n and V ∩ W = {0}, any vector in ℝⁿ can be written uniquely as a sum of a vector in V and a vector in W.
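A minimal sketch of how the second theorem looks in practice (assuming NumPy, with a hypothetical choice of V and W in ℝ³ satisfying the hypotheses): put bases of V and W as the columns of a square matrix; solving one linear system splits any x uniquely into a V-part plus a W-part.

```python
import numpy as np

# Hypothetical example: dim V = 2, dim W = 1, dim V + dim W = 3, V ∩ W = {0}.
BV = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])          # columns: basis of V
BW = np.array([[1.0],
               [2.0],
               [0.0]])               # column: basis of W

M = np.hstack([BV, BW])              # square and invertible: its columns form a basis of ℝ³
x = np.array([3.0, 1.0, 4.0])

c = np.linalg.solve(M, x)            # unique coefficients
x_V = BV @ c[:2]                     # component in V
x_W = BW @ c[2:]                     # component in W

print(np.allclose(x_V + x_W, x))     # expect True
```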

Example

Consider the subspace V ⊂ ℝ⁴ spanned by e₁, e₂. Its orthogonal complement V⊥ is spanned by e₃, e₄. Any vector (w, x, y, z) can be written as (w, x, 0, 0) + (0, 0, y, z).
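A tiny sketch of this ℝ⁴ example (assuming NumPy): with V = span{e₁, e₂} and V⊥ = span{e₃, e₄}, the split just zeroes out the complementary coordinates.

```python
import numpy as np

v = np.array([5.0, -2.0, 7.0, 3.0])             # an arbitrary (w, x, y, z)

v_in_V     = np.array([v[0], v[1], 0.0, 0.0])   # (w, x, 0, 0)
v_in_Vperp = np.array([0.0, 0.0, v[2], v[3]])   # (0, 0, y, z)

print(np.allclose(v_in_V + v_in_Vperp, v))      # expect True
print(np.dot(v_in_V, v_in_Vperp))               # 0.0: the two pieces are orthogonal
```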

Proof

Take bases v₁, …, vₖ of V and w₁, …, wₘ of W (so k = dim V and m = dim W). Suppose ∑ aᵢvᵢ + ∑ bⱼwⱼ = 0. Then ∑ aᵢvᵢ = −∑ bⱼwⱼ. One side is in V and the other is in W, so both are in V ∩ W, hence zero. Since {vᵢ} and {wⱼ} are bases, all the aᵢ and bⱼ must be zero.
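A minimal numerical sketch of this step (assuming NumPy/SciPy, reusing the hypothetical V and W from above): when V ∩ W = {0}, the matrix whose columns are v₁, …, vₖ, w₁, …, wₘ has trivial kernel, so the only coefficients giving ∑ aᵢvᵢ + ∑ bⱼwⱼ = 0 are all zero.

```python
import numpy as np
from scipy.linalg import null_space

BV = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])          # columns: basis of V in ℝ³
BW = np.array([[1.0],
               [2.0],
               [0.0]])               # column: basis of W, with V ∩ W = {0}

M = np.hstack([BV, BW])              # columns are v₁, v₂, w₁
print(null_space(M).shape[1])        # 0: no nonzero coefficient vector c solves M @ c = 0
```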

Proof

Thus {v₁, …, vₖ, w₁, …, wₘ} is linearly independent. This linearly independent set has n elements, so it's a basis for ℝⁿ. So any vector can be written as ∑ aᵢvᵢ + ∑ bⱼwⱼ. This is a sum of a vector in V and a vector in W.

Proof

Such an expression is unique: if v, v′ ∈ V and w, w′ ∈ W and v + w = v′ + w′, then v − v′ = w − w′ is in V and in W, hence is zero. So v = v′ and w = w′.

Orthogonal projection
