Announcements
Monday, November 19

◮ You should already have the link to view your graded midterm online.
◮ Course grades will be curved at the end of the semester. The percentages of A's, B's, and C's awarded depend on many factors and will not be determined until all grades are in.
◮ Individual exam grades are not curved.
◮ Send regrade requests by tomorrow.
◮ WeBWorK 6.6, 7.1, and 7.2 are due the Wednesday after Thanksgiving.
◮ No more quizzes!
◮ My office is Skiles 244 and Rabinoffice hours are Mondays 12–1pm and Wednesdays 1–3pm (but not this Wednesday).
Section 7.2 Orthogonal Complements
Orthogonal Complements: Definition

Let W be a subspace of R^n. Its orthogonal complement is

    W⊥ = { v in R^n | v · w = 0 for all w in W },

read "W perp". (Notation: W⊥ is the orthogonal complement of W; A^T is the transpose of A.)

Pictures:
◮ The orthogonal complement W⊥ of a line W in R^2 is the perpendicular line. [interactive]
◮ The orthogonal complement W⊥ of a line W in R^3 is the perpendicular plane. [interactive]
◮ The orthogonal complement W⊥ of a plane W in R^3 is the perpendicular line. [interactive]
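A minimal NumPy sketch of the definition (the spanning vectors and test vectors are illustrative, not from the lecture): a vector lies in W⊥ exactly when it is orthogonal to every vector in a spanning set for W, which is Fact 4 on the next slide.

```python
import numpy as np

# An illustrative spanning set for W.
w1 = np.array([1.0, 1.0, -1.0])
w2 = np.array([1.0, 1.0, 1.0])

def in_W_perp(v, spanning_set, tol=1e-12):
    """v is in W-perp exactly when v . w = 0 for every spanning vector w of W."""
    return all(abs(np.dot(v, w)) < tol for w in spanning_set)

print(in_W_perp(np.array([1.0, -1.0, 0.0]), [w1, w2]))  # True: orthogonal to both
print(in_W_perp(np.array([1.0, 0.0, 0.0]), [w1, w2]))   # False
```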
Poll
Orthogonal Complements: Basic properties

Let W be a subspace of R^n.

Facts:
1. W⊥ is also a subspace of R^n.
2. (W⊥)⊥ = W.
3. dim W + dim W⊥ = n.
4. If W = Span{v_1, v_2, ..., v_m}, then

       W⊥ = { all vectors orthogonal to each of v_1, v_2, ..., v_m }
          = { x in R^n | x · v_i = 0 for all i = 1, 2, ..., m }
          = Nul A, where A is the m × n matrix with rows v_1^T, v_2^T, ..., v_m^T.
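Facts 3 and 4 can be checked numerically. A short sketch (assuming NumPy and SciPy are available; the random subspace is just an illustration): the rank of the row matrix gives dim W, scipy.linalg.null_space gives a basis of W⊥, and the dimensions add up to n.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((3, n))      # rows v_1^T, v_2^T, v_3^T span W inside R^5

dim_W = np.linalg.matrix_rank(A)     # dim W = rank of the row matrix
W_perp_basis = null_space(A)         # columns form a basis of W-perp = Nul A (Fact 4)
dim_W_perp = W_perp_basis.shape[1]

print(dim_W + dim_W_perp == n)       # True (Fact 3)
```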
Orthogonal Complements: Computation

Problem: If W = Span{ (1, 1, −1), (1, 1, 1) }, compute W⊥. [interactive]

Recall: Span{v_1, v_2, ..., v_m}⊥ = Nul A, where A is the matrix with rows v_1^T, v_2^T, ..., v_m^T.
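A sketch of this computation (assuming SciPy; null_space returns an orthonormal basis, so expect a unit multiple of (−1, 1, 0)):

```python
import numpy as np
from scipy.linalg import null_space

# W-perp is the null space of the matrix whose rows are the spanning vectors.
A = np.array([[1.0, 1.0, -1.0],
              [1.0, 1.0,  1.0]])

basis = null_space(A)              # one column, a unit multiple of (-1, 1, 0)
print(basis.ravel())
print(np.allclose(A @ basis, 0))   # True: every row of A is orthogonal to W-perp
```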
Orthogonal Complements: Row space, column space, null space

Definition: The row space of an m × n matrix A is the span of the rows of A. It is denoted Row A. Equivalently, it is the column space of A^T: Row A = Col A^T. It is a subspace of R^n.

We showed before that if A has rows v_1^T, v_2^T, ..., v_m^T, then Span{v_1, v_2, ..., v_m}⊥ = Nul A. Hence we have shown:

Fact: (Row A)⊥ = Nul A.

Replacing A by A^T, and remembering Row A^T = Col A:

Fact: (Col A)⊥ = Nul A^T.

Using property 2 and taking the orthogonal complements of both sides, we get:

Fact: (Nul A)⊥ = Row A and Col A = (Nul A^T)⊥.
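A quick numerical check of the fact (Col A)⊥ = Nul A^T (the matrix is an arbitrary illustration, assuming SciPy is available):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])      # Col A is a plane in R^3

N = null_space(A.T)             # basis of Nul A^T (a single column here)
print(np.allclose(A.T @ N, 0))  # True: every column of A is orthogonal to Nul A^T
```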
Orthogonal Complements: Reference sheet

Orthogonal complements of most of the subspaces we've seen.

For any vectors v_1, v_2, ..., v_m:

    Span{v_1, v_2, ..., v_m}⊥ = Nul A, where A is the matrix with rows v_1^T, v_2^T, ..., v_m^T.

For any matrix A:

    Row A = Col A^T
    (Row A)⊥ = Nul A      Row A = (Nul A)⊥
    (Col A)⊥ = Nul A^T    Col A = (Nul A^T)⊥

For any other subspace W, first find a basis v_1, ..., v_m, then use the above trick to compute W⊥ = Span{v_1, ..., v_m}⊥.
Section 7.3 Orthogonal Projections
Best Approximation

Suppose you measure a data point x which you know for theoretical reasons must lie on a subspace W. Due to measurement error, though, the measured x is not actually in W.

Best approximation: y is the closest point to x on W.

How do you know that y is the closest point? The vector from y to x is orthogonal to W: it is in the orthogonal complement W⊥.

[Picture: x above the plane W, its closest point y on W, and the perpendicular vector x − y.]
Orthogonal Decomposition

Theorem: Let W be a subspace of R^n. Every vector x in R^n can be written as

    x = x_W + x_W⊥

for unique vectors x_W in W and x_W⊥ in W⊥.

The equation x = x_W + x_W⊥ is called the orthogonal decomposition of x (with respect to W). The vector x_W is the orthogonal projection of x onto W, and it is the closest vector to x on W. [interactive 1] [interactive 2]
Orthogonal Decomposition: Justification

Theorem: Every vector x in R^n can be written as x = x_W + x_W⊥ for unique vectors x_W in W and x_W⊥ in W⊥.

Why? (Sketch: for uniqueness, if x = y + z = y' + z' with y, y' in W and z, z' in W⊥, then y − y' = z' − z lies in both W and W⊥, so it is orthogonal to itself and must be zero. Existence will follow from the A^T A trick later in this section.)
Orthogonal Decomposition: Example

Let W be the xy-plane in R^3. Then W⊥ is the z-axis.

    x = (2, 1, 3)  ⇒  x_W = (2, 1, 0)  and  x_W⊥ = (0, 0, 3).
    x = (a, b, c)  ⇒  x_W = (a, b, 0)  and  x_W⊥ = (0, 0, c).

This is just decomposing a vector into a "horizontal" component (in the xy-plane) and a "vertical" component (on the z-axis). [interactive]
Orthogonal Decomposition: Computation?

Problem: Given x and W, how do you compute the decomposition x = x_W + x_W⊥?

Observation: It is enough to compute x_W, because x_W⊥ = x − x_W.
The A^T A Trick

Theorem (The A^T A Trick): Let W be a subspace of R^n, let v_1, v_2, ..., v_m be a spanning set for W (e.g., a basis), and let A be the n × m matrix with columns v_1, v_2, ..., v_m. Then for any x in R^n, the matrix equation

    A^T A v = A^T x    (in the unknown vector v)

is consistent, and x_W = Av for any solution v.

Recipe for computing x = x_W + x_W⊥:
◮ Write W as the column space of a matrix A.
◮ Find a solution v of A^T A v = A^T x (by row reducing).
◮ Then x_W = Av and x_W⊥ = x − x_W.
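A sketch of this recipe in NumPy (the matrix and vector are arbitrary illustrations; lstsq is used so the sketch still works when the columns of A are linearly dependent and A^T A is singular):

```python
import numpy as np

def orthogonal_decomposition(A, x):
    """Return (x_W, x_W_perp), where W = Col A, via the A^T A trick."""
    # Solve A^T A v = A^T x.  lstsq also handles a singular A^T A,
    # which happens when the columns of A are linearly dependent.
    v, *_ = np.linalg.lstsq(A.T @ A, A.T @ x, rcond=None)
    x_W = A @ v
    return x_W, x - x_W

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 1.0]])               # columns span W
x = np.array([3.0, 0.0, 1.0])

x_W, x_W_perp = orthogonal_decomposition(A, x)
print(x_W, x_W_perp)
print(np.allclose(A.T @ x_W_perp, 0))    # True: x_W_perp is in W-perp = Nul A^T
```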
The A^T A Trick: Example

Problem: Compute the orthogonal projection of a vector x = (x_1, x_2, x_3) in R^3 onto the xy-plane.
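For the xy-plane the recipe gives back the expected answer (x_1, x_2, 0); a quick check (the sample vector is illustrative):

```python
import numpy as np

# The xy-plane is the column space of A = [e_1 e_2].
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([7.0, -2.0, 5.0])

v = np.linalg.solve(A.T @ A, A.T @ x)   # A^T A is the 2x2 identity here
print(A @ v)        # [ 7. -2.  0.]  = (x_1, x_2, 0)
print(x - A @ v)    # [ 0.  0.  5.]  = the component on the z-axis
```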
The A^T A Trick: Another Example

Problem: Let

    x = (1, 2, 3),    W = { (x_1, x_2, x_3) in R^3 | x_1 − x_2 + x_3 = 0 }.

Compute the distance from x to W.
The A^T A Trick: Another Example, Continued

Problem (continued): With x = (1, 2, 3) and W = { (x_1, x_2, x_3) in R^3 | x_1 − x_2 + x_3 = 0 }, compute the distance from x to W. [interactive]
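A sketch of this computation in NumPy. The basis (1, 1, 0), (0, 1, 1) of W is one convenient choice (both vectors satisfy x_1 − x_2 + x_3 = 0); the distance comes out to 2/√3 ≈ 1.1547.

```python
import numpy as np

# W = {x_1 - x_2 + x_3 = 0} = Col A for this choice of basis vectors.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
x = np.array([1.0, 2.0, 3.0])

v = np.linalg.solve(A.T @ A, A.T @ x)   # solve A^T A v = A^T x
x_W = A @ v
x_W_perp = x - x_W

print(x_W)                          # the closest point to x on W
print(np.linalg.norm(x_W_perp))     # distance from x to W: 1.1547 ≈ 2/sqrt(3)
```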
The A^T A Trick: Proof

Theorem (The A^T A Trick): Let W be a subspace of R^n, let v_1, v_2, ..., v_m be a spanning set for W (e.g., a basis), and let A be the n × m matrix with columns v_1, v_2, ..., v_m. Then for any x in R^n, the matrix equation

    A^T A v = A^T x    (in the unknown vector v)

is consistent, and x_W = Av for any solution v.

Proof: Write x = x_W + x_W⊥. Since x_W is in W = Col A, we have x_W = Av for some vector v. Then

    A^T x = A^T (Av + x_W⊥) = A^T A v + A^T x_W⊥ = A^T A v,

because x_W⊥ is in W⊥ = (Col A)⊥ = Nul A^T. So v is a solution, and the equation is consistent. Conversely, if A^T A v = A^T x, then A^T (x − Av) = 0, so x − Av is in Nul A^T = W⊥; since Av is in W and x = Av + (x − Av), uniqueness of the orthogonal decomposition gives Av = x_W.
Orthogonal Projection onto a Line

Problem: Let L = Span{u} be a line in R^n and let x be a vector in R^n. Compute x_L.

Projection onto a Line: The projection of x onto the line L = Span{u} is

    x_L = (u · x)/(u · u) u,    and    x_L⊥ = x − x_L.

[Picture: x, its projection x_L along u, and the perpendicular component x_L⊥.]
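A one-line NumPy version of this formula (the vectors are illustrative):

```python
import numpy as np

def project_onto_line(x, u):
    """Orthogonal projection of x onto L = Span{u}; u must be nonzero."""
    return (np.dot(u, x) / np.dot(u, u)) * u

x = np.array([1.0, 2.0, 3.0])
u = np.array([1.0, 1.0, 1.0])
x_L = project_onto_line(x, u)      # (6/3) * u = (2, 2, 2)
print(x_L, np.dot(x - x_L, u))     # the remainder x - x_L is orthogonal to u
```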
Orthogonal Projection onto a Line: Example

Problem: Compute the orthogonal projection of x = (−6, 4) onto the line L spanned by u = (3, 2), and find the distance from x to L.

    x_L = (u · x)/(u · u) u = (−18 + 8)/(9 + 4) (3, 2) = −10/13 (3, 2) = (−30/13, −20/13).

The distance from x to L is ||x − x_L|| = ||(−48/13, 72/13)|| = 24/√13.

[interactive]
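The same numbers in NumPy, as a check:

```python
import numpy as np

x = np.array([-6.0, 4.0])
u = np.array([3.0, 2.0])

x_L = (np.dot(u, x) / np.dot(u, u)) * u   # (-10/13) * (3, 2)
print(x_L)                                # [-2.3077 -1.5385] = (-30/13, -20/13)
print(np.linalg.norm(x - x_L))            # 6.6564 ≈ 24/sqrt(13), the distance to L
```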
Summary

Let W be a subspace of R^n.
◮ The orthogonal complement W⊥ is the set of all vectors orthogonal to everything in W.
◮ We have (W⊥)⊥ = W and dim W + dim W⊥ = n.
◮ Row A = Col A^T, (Row A)⊥ = Nul A, Row A = (Nul A)⊥, (Col A)⊥ = Nul A^T, Col A = (Nul A^T)⊥.
◮ Orthogonal decomposition: any vector x in R^n can be written in a unique way as x = x_W + x_W⊥ for x_W in W and x_W⊥ in W⊥. The vector x_W is the orthogonal projection of x onto W.
◮ The vector x_W is the closest point to x in W: it is the best approximation.
◮ The distance from x to W is ||x_W⊥||.
◮ If W = Col A, then to compute x_W, solve the equation A^T A v = A^T x; then x_W = Av.
◮ If W = L = Span{u} is a line, then x_L = (u · x)/(u · u) u.