  1. CSE101: Algorithm Design and Analysis Russell Impagliazzo Sanjoy Dasgupta Ragesh Jaiswal (Thanks for slides: Miles Jones) Lecture 24: Divide and Conquer (Tree and Computational Geometry)

  2. Divide and Conquer Trees • Let’s say we have a full and balanced binary tree (all parents have two children and all leaves are on the bottom level.)

  3. Divide and Conquer Trees • Notice that each child’s subtree is half of the problem so we get a nice divide and conquer structure.

  4. Divide and Conquer Trees • If the tree is uneven, we can still use the same strategy but we need to take a bit of care when calculating runtime.

  5. Least common ancestor • Given a binary tree with n vertices, we wish to compute LCA(u, v) for each pair of vertices u, v. • LCA(u, v) is the least common ancestor of u and v, or in other words, the "youngest" common ancestor of u and v. • For example, the LCA of me and my brother is our parent. The LCA of me and my uncle is my grandparent (his parent). A vertex can be its own ancestor, so the LCA of me and my father is my father.

  6. Least common ancestor • What pairs of vertices will have the root r as their least common ancestor?

  7. Least common ancestor • What pairs of vertices will have the root r as their least common ancestor? • For each vertex v, set lca(v, r) = r. • For each pair of vertices u, v such that u is in the left subtree and v is in the right subtree, set lca(u, v) = r. • Now what? Are we done? • Recurse on the left and right subtrees!!!!!

  8. Pseudocode
     Def LCA(r):
         Lsubtree = explore(r.lc)
         Rsubtree = explore(r.rc)
         for all vertices u in Lsubtree: lca(u, r) = r
         for all vertices v in Rsubtree: lca(r, v) = r
         for all vertices u in Lsubtree:
             for all vertices v in Rsubtree:
                 lca(u, v) = r
         LCA(r.lc)
         LCA(r.rc)
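
A minimal runnable Python version of the pseudocode above (the Node class, the collect helper, and the dictionary used to store answers are illustrative choices, not fixed by the slides):

    class Node:
        def __init__(self, lc=None, rc=None):
            self.lc, self.rc = lc, rc    # left and right children (None if absent)

    def collect(r):
        # return every vertex in the subtree rooted at r (the "explore" step)
        if r is None:
            return []
        return [r] + collect(r.lc) + collect(r.rc)

    def compute_lca(r, lca):
        # fill lca[(u, v)] for every pair whose least common ancestor lies in r's subtree
        if r is None:
            return
        left, right = collect(r.lc), collect(r.rc)
        lca[(r, r)] = r                          # a vertex is its own ancestor
        for v in left + right:
            lca[(v, r)] = lca[(r, v)] = r        # pairs involving the root itself
        for u in left:
            for w in right:
                lca[(u, w)] = lca[(w, u)] = r    # pairs that straddle the root
        compute_lca(r.lc, lca)                   # recurse on the left subtree
        compute_lca(r.rc, lca)                   # recurse on the right subtree

    # usage: table = {}; compute_lca(root, table)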

  9. Pseudocode (runtime) • (Same pseudocode as on slide 8.) If the binary tree is balanced, then each recursive call is of size (n−1)/2, or roughly half. • How long does the non-recursive part take?

  10. Pseudocode (runtime) • (Same pseudocode as on slide 8.) If the binary tree is balanced, then each recursive call is of size (n−1)/2, or roughly half, and the non-recursive part does O(n²) work, so T(n) = 2T((n−1)/2) + O(n²). • Using the master theorem with a = 2, b = 2, d = 2: T(n) = O(n²).
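
Writing out the master-theorem step for the balanced case (a standard case check, not shown on the slide):

    T(n) = 2T((n−1)/2) + O(n²),   with a = 2, b = 2, d = 2.
    Since d = 2 > log_b a = log_2 2 = 1, the theorem gives T(n) = O(n^d) = O(n²).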

  11. Pseudocode (runtime, uneven) • (Same pseudocode as on slide 8.) If the binary tree is uneven, then the runtime recurrence is T(n) = T(L) + T(R) + O(LR), where L is the size of the left subtree and R is the size of the right subtree. • What do you think the total runtime will be? Take a guess and we can check it!!!

  12. Uneven DC runtime • T(n) = T(L) + T(R) + O(LR) • We guess that it will take O(n²) time. So let's try to prove this using induction. • Claim: T(n) ≤ cn² for all n ≥ 1 and for some constant c that is bigger than T(1) and bigger than the coefficient in the O(LR) term.

  13. Uneven DC runtime • Base case: T(1) ≤ c·1². True by choice of c. • Suppose that for some n > 1, T(k) ≤ ck² for all k such that 1 ≤ k < n. • Then T(n) ≤ T(L) + T(R) + cLR ≤ cL² + cR² + cLR < cL² + cR² + 2cLR = c(L + R)² = c(n − 1)² < cn².

  14. Make Heap • Problem: Given a list of n elements, form a heap containing all elements.

  15. Divide and conquer strategy • Assume n = 2^k − 1. (Add blank elements if needed.) • Divide the list into two lists of size (n−1)/2 and a left-over element • Make heaps with both (in the subtrees of the root) • Put the left-over element at the root. • "Trickle down" the top element to reinstate the heap property
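
A sketch of this strategy in Python, using the usual array layout of a binary heap (a min-heap here; the array indexing and the names make_heap and sift_down are my assumptions, not fixed by the slides). The two subtrees of index i play the role of the two half-size lists, and a[i] is the left-over element that gets trickled down:

    def sift_down(a, i, n):
        # trickle a[i] down until the min-heap property holds below it
        while True:
            smallest, l, r = i, 2 * i + 1, 2 * i + 2
            if l < n and a[l] < a[smallest]:
                smallest = l
            if r < n and a[r] < a[smallest]:
                smallest = r
            if smallest == i:
                return
            a[i], a[smallest] = a[smallest], a[i]
            i = smallest

    def make_heap(a, i=0, n=None):
        # heapify both subtrees of index i, then trickle the left-over element a[i] down
        if n is None:
            n = len(a)
        if i >= n:
            return
        make_heap(a, 2 * i + 1, n)   # left subtree: one half-size subproblem
        make_heap(a, 2 * i + 2, n)   # right subtree: the other half
        sift_down(a, i, n)           # "trickle down" the top element

    # usage: a = [5, 3, 8, 1, 9, 2, 7]; make_heap(a); a[0] is now the minimum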

  16. Time analysis • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree. • T(n) = T( ) + O( )

  17. Time analysis • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree. • T(n) = 2T(n/2) + O(log n) • Doesn't fit the master theorem.

  18. Time analysis: sandwiching • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree. • T(n) = 2T(n/2) + O(log n) • Define L(n) = 2L(n/2) + O(1) and H(n) = 2H(n/2) + O(√n) • L(n) ≤ T(n) ≤ H(n) • Apply the master theorem: both L(n) and H(n) are O(n), • so T(n) is O(n).
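
Checking the two bounding recurrences with the master theorem (a quick sketch; any exponent strictly between 0 and 1 would work in place of 1/2):

    L(n) = 2L(n/2) + O(1):   a = 2, b = 2, d = 0 < log_2 2 = 1   ⇒  L(n) = O(n^(log_2 2)) = O(n)
    H(n) = 2H(n/2) + O(√n):  a = 2, b = 2, d = 1/2 < log_2 2 = 1  ⇒  H(n) = O(n^(log_2 2)) = O(n)

Since, for large n, the combine cost O(log n) lies between O(1) and O(√n), we get L(n) ≤ T(n) ≤ H(n), and therefore T(n) = O(n).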

  19. Minimum distance • Given a list of coordinates [(x_1, y_1), …, (x_n, y_n)], find the distance between the closest pair. • Brute force solution?
     min = ∞
     for i from 1 to n−1:
         for j from i+1 to n:
             if min > distance((x_i, y_i), (x_j, y_j)):
                 min = distance((x_i, y_i), (x_j, y_j))
     return min
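
The same brute force as a small runnable Python function (representing the input as a list of (x, y) tuples is an assumption of mine):

    from math import hypot, inf

    def min_distance_brute(points):
        # compare all O(n^2) pairs and keep the smallest distance
        best = inf
        for i in range(len(points) - 1):
            for j in range(i + 1, len(points)):
                (x1, y1), (x2, y2) = points[i], points[j]
                best = min(best, hypot(x2 - x1, y2 - y1))
        return best

    # e.g. min_distance_brute([(0, 0), (3, 4), (1, 1)]) returns sqrt(2) ≈ 1.414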

  20. Example • [Figure: example point set in the x-y plane]

  21. Example • [Figure: example point set in the x-y plane]

  22. Divide and conquer • Partition the points by x, according to whether they are to the left or right of the median • Recursively find the minimum distance among the points on each of the two sides • Need to compare to the smallest "cross distance" between a point on the left and a point on the right • Only need to look at "close" points

  23. Combine • How will we use this information to find the distance of the closest pair in the whole set? • We must consider if there is a closest pair where one point is in the left half and one is in the right half. • How do we do this? • Let d = min(d_L, d_R) (the minimum distances found on the left and on the right) and compare only the points (x_i, y_i) such that x_m − d ≤ x_i and x_i ≤ x_m + d, where x_m is the x-coordinate of the dividing line.

  24. Example • [Figure: the point set with the vertical strip of width 2d around the dividing line highlighted]

  25. Combine • How will we use this information to find the distance of the closest pair in the whole set? • We must consider if there is a closest pair where one point is in the left half and one is in the right half. • How do we do this? • Let d = min(d_L, d_R) and compare only the points (x_i, y_i) such that x_m − d ≤ x_i and x_i ≤ x_m + d. • Worst case, how many points could this be?

  26. Combine step • Given a point (x, y) in the strip, let's look in a 2d × d rectangle with that point at its upper boundary: • [Figure: a 2d × d rectangle divided into eight d/2 × d/2 squares] • There could not be more than 8 points total, because if we divide the rectangle into 8 squares of side d/2, then there can never be more than one point per square. Why???
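
One way to answer the "why" (the standard packing argument, not spelled out on the slide): any two points inside the same d/2 × d/2 square are at distance at most the square's diagonal,

    √((d/2)² + (d/2)²) = d/√2 < d.

But the four squares to the left of the dividing line contain only left-half points, whose pairwise distances are at least d_L ≥ d, and the four squares to the right contain only right-half points, at pairwise distance at least d_R ≥ d. So no square can contain two points, and the rectangle contains at most 8.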

  27. Combine step • So instead of comparing (x, y) with every other point in the strip, we only have to compare it with at most a constant c points lower than it (smaller y) • To gain quick access to these points, let's sort the points in the strip by y values. • Those candidate points must be among the c points just before our current point in this sorted list • Now, if there are k vertices in the strip, we have to sort the vertices in O(k log k) time and make at most ck comparisons in O(k) time, for a total combine step of O(k log k). • But we said that in the worst case there are n vertices in the strip, and so in the worst case the combine step takes O(n log n) time.
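
A sketch of this combine step in Python (assuming strip is the list of points within distance d of the dividing line, as (x, y) tuples; scanning forward while the y-gap is below the current best plays the role of "at most c nearby points"):

    def combine(strip, d):
        # strip: points within distance d of the dividing line; d = min(d_L, d_R)
        strip = sorted(strip, key=lambda p: p[1])        # sort by y value
        best = d
        for i in range(len(strip)):
            j = i + 1
            # only points whose y is within best of strip[i] can improve the answer
            while j < len(strip) and strip[j][1] - strip[i][1] < best:
                dx = strip[j][0] - strip[i][0]
                dy = strip[j][1] - strip[i][1]
                best = min(best, (dx * dx + dy * dy) ** 0.5)
                j += 1
        return best

Scanning forward (toward larger y) instead of backward is equivalent: each pair is still examined from one side, and the packing argument bounds the inner loop by a constant.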

  28. Time analysis • But we said that in the worst case there are n vertices in the strip, and so in the worst case the combine step takes O(n log n) time. • Runtime recurrence: T(n) = 2T(n/2) + O(n log n). This gives T(n) = O(n (log n)²). • Pre-processing: sort by both x and y, and keep pointers between the sorted lists. Maintaining the sorted order inside the recursive calls reduces the recurrence to T(n) = 2T(n/2) + O(n), so T(n) is O(n log n).
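
To see where these bounds come from (a recursion-tree sketch, not spelled out on the slide): the recursion has about log n levels, and at each level the combine work summed over all subproblems is O(n log n) for the first recurrence and O(n) for the second:

    T(n) = 2T(n/2) + O(n log n)  ⇒  T(n) = O(n (log n)²)
    T(n) = 2T(n/2) + O(n)        ⇒  T(n) = O(n log n)   (master theorem with a = 2, b = 2, d = 1)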
