[9] Orthogonalization: Finding the closest point in a plane (PowerPoint presentation)


  1. [9] Orthogonalization

  2. Finding the closest point in a plane. Goal: Given a point b and a plane, find the point in the plane closest to b.

  3. Finding the closest point in a plane. Goal: Given a point b and a plane, find the point in the plane closest to b. By translation, we can assume the plane includes the origin, so the plane is a vector space V. Let {v1, v2} be a basis for V. Restated goal: Given a point b, find the point in Span {v1, v2} closest to b. Example: v1 = [8, −2, 2], v2 = [4, 2, 4], b = [5, −5, 2]; the point in the plane closest to b is [6, −3, 0].
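The example can be checked numerically. This is a sketch using NumPy's least-squares solver, standing in for the algorithm these slides go on to develop; the variable names are my own.

```python
import numpy as np

# Columns of A are the basis vectors v1 and v2 of the plane V.
A = np.array([[8, 4],
              [-2, 2],
              [2, 4]], dtype=float)
b = np.array([5, -5, 2], dtype=float)

# Solve min_x ||A x - b||; A @ x is then the point in Span {v1, v2} closest to b.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
closest = A @ x
print(closest)  # [ 6. -3.  0.]
```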

  4. Closest-point problem in higher dimensions. Goal: An algorithm that, given a vector b and vectors v1, ..., vn, finds the vector in Span {v1, ..., vn} that is closest to b. Special case: We can use the algorithm to determine whether b lies in Span {v1, ..., vn}: if the vector in Span {v1, ..., vn} closest to b is b itself, then clearly b is in the span; if not, then b is not in the span. Let A = [v1 ··· vn], the matrix whose columns are v1, ..., vn. Using the linear-combinations interpretation of matrix-vector multiplication, a vector in Span {v1, ..., vn} can be written A x. Thus testing whether b is in Span {v1, ..., vn} is equivalent to testing whether the equation A x = b has a solution. More generally: Even if A x = b has no solution, we can use the algorithm to find the point in {A x : x ∈ Rⁿ} closest to b. Moreover: We hope to extend the algorithm to also find the best solution x.
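The special case can be sketched directly: compute the closest point and compare it to b. A hypothetical helper using NumPy (again standing in for the algorithm to come; the name in_span and the tolerance are my own):

```python
import numpy as np

def in_span(b, vectors, tol=1e-9):
    """Return True if b lies in the span of the given vectors."""
    A = np.column_stack(vectors)                 # columns are the generators
    x, *_ = np.linalg.lstsq(A, b, rcond=None)    # closest point is A @ x
    return np.allclose(A @ x, b, atol=tol)       # does the closest point equal b?

v1 = np.array([8., -2., 2.])
v2 = np.array([4., 2., 4.])
print(in_span(v1 + 0.5 * v2, [v1, v2]))          # True: a combination of v1, v2
print(in_span(np.array([5., -5., 2.]), [v1, v2]))  # False: this b is off the plane
```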

  5. Closest point and coefficients. It is not enough to find the point p in Span {v1, ..., vn} closest to b; we need an algorithm to find the representation of p in terms of v1, ..., vn. Goal: find the coefficients x1, ..., xn so that x1 v1 + ··· + xn vn is the vector in Span {v1, ..., vn} closest to b.
     Equivalent: Find the vector x that minimizes ‖b − [v1 ··· vn] x‖.
     Equivalent: Find the vector x that minimizes ‖b − [v1 ··· vn] x‖².
     Equivalent: Writing a1, ..., am for the rows of the matrix, find the vector x that minimizes ‖b − [a1; ...; am] x‖².
     Equivalent: Find the vector x that minimizes (b[1] − a1 · x)² + ··· + (b[m] − am · x)².
     This last problem was addressed using gradient descent in the Machine Learning lab.
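The gradient-descent approach from the Machine Learning lab can be sketched as follows; the step size and iteration count are assumptions tuned for this small example, not values from the slides.

```python
import numpy as np

# Columns are v1, v2 from the earlier example; rows are a1, ..., am.
A = np.array([[8, 4], [-2, 2], [2, 4]], dtype=float)
b = np.array([5, -5, 2], dtype=float)

x = np.zeros(2)
step = 0.005                            # small enough for this particular A
for _ in range(2000):
    residual = b - A @ x                # entries are b[i] - a_i . x
    x = x + step * 2 * A.T @ residual   # gradient of the sum of squares is -2 A^T residual
print(x)        # close to [1, -0.5]
print(A @ x)    # close to the closest point [6, -3, 0]
```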

  6. Closest point and least squares. Find the vector x that minimizes ‖b − [v1 ··· vn] x‖². Equivalent: Find the vector x that minimizes (b[1] − a1 · x)² + ··· + (b[m] − am · x)². This problem is called least squares ("méthode des moindres carrés", due to Adrien-Marie Legendre but often attributed to Gauss). Equivalent: Given a matrix equation A x = b that might have no solution, find the best solution available in the sense that the norm of the error b − A x is as small as possible.
     ◮ There is an algorithm based on Gaussian elimination.
     ◮ We will develop an algorithm based on orthogonality (used in solver). Much faster and more reliable than gradient descent.
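The Gaussian-elimination route mentioned above typically means solving the normal equations AᵀA x = Aᵀb. A minimal sketch, assuming A has linearly independent columns so that AᵀA is invertible:

```python
import numpy as np

A = np.array([[8, 4], [-2, 2], [2, 4]], dtype=float)
b = np.array([5, -5, 2], dtype=float)

# Normal equations: A^T A x = A^T b, solved here by Gaussian elimination.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)      # [ 1.  -0.5]
print(A @ x)  # [ 6. -3.  0.], the closest point
```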

  7. High-dimensional projection onto/orthogonal to. For any vector b and any vector a, define vectors b || a and b ⊥ a so that b = (b || a) + (b ⊥ a), there is a scalar σ ∈ R such that b || a = σ a, and b ⊥ a is orthogonal to a. Definition: For a vector b and a vector space V, we define the projection of b onto V (written b ||V) and the projection of b orthogonal to V (written b ⊥V) so that b = (b ||V) + (b ⊥V), b ||V is in V, and b ⊥V is orthogonal to every vector in V. [Figure: b decomposed as b = (b ||V) + (b ⊥V), showing the projection onto V and the projection orthogonal to V.]
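For a single vector a, the projection is given by the familiar formula b || a = ((b · a)/(a · a)) a. A sketch of project_along on plain Python lists (the course's own version works on its Vec class, so this list-based form is an assumption):

```python
def project_along(b, a, eps=1e-20):
    """Projection of b onto the line through a: ((b.a)/(a.a)) * a."""
    aa = sum(x * x for x in a)
    sigma = 0 if aa < eps else sum(x * y for x, y in zip(b, a)) / aa
    return [sigma * x for x in a]

b = [1.0, 1.0]
a = [1.0, 0.0]
b_par = project_along(b, a)
b_perp = [x - y for x, y in zip(b, b_par)]
print(b_par)   # [1.0, 0.0], the projection b || a
print(b_perp)  # [0.0, 1.0], the projection b ⊥ a, orthogonal to a
```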

  8. High-Dimensional Fire Engine Lemma. Definition: For a vector b and a vector space V, we define the projection of b onto V (written b ||V) and the projection of b orthogonal to V (written b ⊥V) so that b = (b ||V) + (b ⊥V), b ||V is in V, and b ⊥V is orthogonal to every vector in V. One-dimensional Fire Engine Lemma: The point in Span {a} closest to b is b || a, and the distance is ‖b ⊥ a‖. High-Dimensional Fire Engine Lemma: The point in a vector space V closest to b is b ||V, and the distance is ‖b ⊥V‖.

  9. Finding the projection of b orthogonal to Span {v1, ..., vn}. High-Dimensional Fire Engine Lemma: Let b be a vector and let V be a vector space. The vector in V closest to b is b ||V, and the distance is ‖b ⊥V‖. Suppose V is specified by generators v1, ..., vn. Goal: An algorithm for computing b ⊥V in this case.
     ◮ input: a vector b and vectors v1, ..., vn
     ◮ output: the projection of b orthogonal to V = Span {v1, ..., vn}
     We already know how to solve this when n = 1:
         def project_orthogonal_1(b, v):
             return b - project_along(b, v)
     Let's try to generalize...

  10. project_orthogonal(b, vlist)
      def project_orthogonal_1(b, v):
          return b - project_along(b, v)
      ⇓
      def project_orthogonal(b, vlist):
          for v in vlist:
              b = b - project_along(b, v)
          return b
      Reviews are in... "Short, elegant, ... and flawed." "Beautiful—if only it worked!" "A tragic failure."

  11. project_orthogonal(b, vlist) doesn't work. Try it on b = [1, 1] and vlist = [[1, 0], [√2/2, √2/2]]. Let b_i be the value of the variable b after i iterations.
      b1 = b0 − (projection of [1, 1] along [1, 0]) = b0 − [1, 0] = [0, 1]
      b2 = b1 − (projection of [0, 1] along [√2/2, √2/2]) = b1 − [1/2, 1/2] = [−1/2, 1/2]
      which is not orthogonal to [1, 0].
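Running the procedure confirms the failure. Here project_along is the standard ((b · v)/(v · v)) v formula, written inline on lists so the sketch is self-contained:

```python
import math

def project_along(b, v):
    sigma = sum(x * y for x, y in zip(b, v)) / sum(x * x for x in v)
    return [sigma * x for x in v]

def project_orthogonal(b, vlist):
    for v in vlist:
        b = [x - y for x, y in zip(b, project_along(b, v))]
    return b

r = math.sqrt(2) / 2
result = project_orthogonal([1.0, 1.0], [[1.0, 0.0], [r, r]])
print(result)                 # approximately [-0.5, 0.5]
print(result[0])              # -0.5: the dot product with [1, 0] is not 0
```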


  16. How to repair project_orthogonal(b, vlist)? On b = [1, 1], vlist = [[1, 0], [√2/2, √2/2]], the final vector is not orthogonal to [1, 0]. Maybe the problem will go away if the algorithm
      ◮ first finds the projection of b along each of the vectors in vlist, and
      ◮ only afterwards subtracts all these projections from b:
      def classical_project_orthogonal(b, vlist):
          w = all-zeroes-vector
          for v in vlist:
              w = w + project_along(b, v)
          return b - w
      Alas, this procedure also does not work. For the inputs b = [1, 1], vlist = [[1, 0], [√2/2, √2/2]], the output vector is [−1, 0], which is orthogonal to neither of the two vectors in vlist.
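The classical variant's failure can be checked the same way, with project_along again written inline as the standard formula:

```python
import math

def project_along(b, v):
    sigma = sum(x * y for x, y in zip(b, v)) / sum(x * x for x in v)
    return [sigma * x for x in v]

def classical_project_orthogonal(b, vlist):
    w = [0.0] * len(b)
    for v in vlist:                       # project the ORIGINAL b along each v...
        p = project_along(b, v)
        w = [x + y for x, y in zip(w, p)]
    return [x - y for x, y in zip(b, w)]  # ...then subtract all projections at once

r = math.sqrt(2) / 2
result = classical_project_orthogonal([1.0, 1.0], [[1.0, 0.0], [r, r]])
print(result)  # approximately [-1.0, 0.0]: orthogonal to neither input vector
```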

  17. What to do with project_orthogonal(b, vlist)? Try it with two vectors v1 and v2 that are orthogonal: v1 = [1, 2, 1], v2 = [−1, 0, 1], b = [1, 1, 2].
      b1 = b0 − ((b0 · v1)/(v1 · v1)) v1 = [1, 1, 2] − (5/6)[1, 2, 1] = [1/6, −4/6, 7/6]
      b2 = b1 − ((b1 · v2)/(v2 · v2)) v2 = [1/6, −4/6, 7/6] − (1/2)[−1, 0, 1] = [2/3, −2/3, 2/3]
      and note that b2 is orthogonal to both v1 and v2.
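Numerically, with mutually orthogonal input vectors the procedure does produce a vector orthogonal to both of them. This sketch reuses the same inline project_along:

```python
def project_along(b, v):
    sigma = sum(x * y for x, y in zip(b, v)) / sum(x * x for x in v)
    return [sigma * x for x in v]

def project_orthogonal(b, vlist):
    for v in vlist:
        b = [x - y for x, y in zip(b, project_along(b, v))]
    return b

v1 = [1.0, 2.0, 1.0]
v2 = [-1.0, 0.0, 1.0]   # note v1 . v2 = 0: the vectors are orthogonal
b2 = project_orthogonal([1.0, 1.0, 2.0], [v1, v2])
dot = lambda u, w: sum(x * y for x, y in zip(u, w))
print(b2)                         # approximately [2/3, -2/3, 2/3]
print(dot(b2, v1), dot(b2, v2))   # both approximately 0
```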
