  1. CS141: Intermediate Data Structures and Algorithms – Greedy Algorithms (Amr Magdy)

  2–5. Activity Selection Problem
  Given a set of activities S = {a_1, a_2, …, a_n}, where each activity a_i has a start time s_i and a finish time f_i, with 0 ≤ s_i < f_i < ∞. An activity a_i happens in the half-open time interval [s_i, f_i).
  Activities compete for a single resource, e.g., a CPU.
  Two activities are said to be compatible if they do not overlap.
  The problem is to find a maximum-size compatible subset, i.e., one with the maximum number of activities.
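
  A minimal Python sketch of the compatibility test for half-open intervals. The activity set used here and in later sketches is a hypothetical example, since the concrete activities from the following figure slides are not preserved in this transcript.

    def compatible(a, b):
        # [s1, f1) and [s2, f2) do not overlap iff one finishes before the other starts.
        (s1, f1), (s2, f2) = a, b
        return f1 <= s2 or f2 <= s1

    print(compatible((1, 4), (4, 7)))  # True: touching endpoints are still compatible
    print(compatible((1, 4), (3, 5)))  # False: the intervals overlap on [3, 4)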

  6. Example (figure)

  7. A Compatible Set (figure)

  8. A Better Compatible Set (figure)

  9. An Optimal Solution (figure)

  10. Another Optimal Solution (figure)

  11–12. Activity Selection Problem
  Solution algorithm?
  Brute force (naïve): try all possible combinations → O(2^n).
  Can we do better? A dividing line for divide & conquer is not clear.
  Does the problem have optimal substructure? I.e., does the optimal solution of a bigger problem contain optimal solutions for its subproblems?
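
  For illustration of the O(2^n) bound, a brute-force sketch in Python on the hypothetical instance from above (exponential time, small inputs only):

    from itertools import combinations

    def compatible(a, b):
        # Half-open intervals [s, f): touching endpoints do not overlap.
        return a[1] <= b[0] or b[1] <= a[0]

    def max_compatible_bruteforce(activities):
        # Try subset sizes from largest to smallest; return the first
        # pairwise-compatible subset found.
        for r in range(len(activities), 0, -1):
            for subset in combinations(activities, r):
                if all(compatible(x, y)
                       for i, x in enumerate(subset) for y in subset[i + 1:]):
                    return list(subset)
        return []

    activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
    print(max_compatible_bruteforce(activities))  # [(1, 4), (5, 7), (8, 11)], size 3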

  13–14. Activity Selection Problem
  Does the problem have optimal substructure?
  Assume A is an optimal solution for S.
  Is A' = A − {a_i} an optimal solution for S' = S − {a_i and its incompatible activities}?
  If A' is not an optimal solution, then there is an optimal solution A'' for S' such that |A''| > |A'|.
  Then B = A'' ∪ {a_i} is a solution for S, with |B| = |A''| + 1 and |A| = |A'| + 1.
  Then |B| > |A|, i.e., A is not an optimal solution, a contradiction.
  Then A' must be an optimal solution for S'.
  Proof by contradiction: assume the opposite of your goal; derive a contradiction from that assumption, and your goal is proved.

  15–17. Activity Selection Problem
  What does having optimal substructure mean?
  We can solve smaller problems, then expand to larger ones, similar to dynamic programming.
  Instead, can we make a greedy choice? I.e., take the best choice so far, reduce the problem size, and solve a single subproblem later.
  Candidate greedy choices: longest first, shortest first, earliest start first, earliest finish first, …?

  18–19. Activity Selection Problem
  Greedy choice: earliest finish first. Why? It leaves as much of the resource as possible for the other tasks.
  Solution:
  Include the earliest-finishing activity a_m in the solution A.
  Remove all activities incompatible with a_m.
  Repeat with the remaining activity that finishes earliest.

  20–27. Activity Selection Problem: Greedy Solution (figures: step-by-step illustration of the earliest-finish-first selection; not preserved in this transcript)

  28. Activity Selection Problem: Pseudocode?

  29. Activity Selection Problem: Pseudocode
  findMaxSet(Array a, int n) {
      sort a by finish time (earliest first)
      result ← {}
      for i = 1 to n
          valid ← true
          for j = 1 to result.size
              if a[i] is incompatible with result[j]
                  valid ← false
          if valid
              result ← result ∪ {a[i]}
      return result
  }
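
  A minimal runnable sketch of the pseudocode above in Python, on the hypothetical instance used earlier. It uses a common simplification (not spelled out on the slide): once the activities are sorted by finish time, each candidate only needs to be compared with the last activity added to the result, giving O(n log n) overall.

    def find_max_set(activities):
        # activities: list of (start, finish) pairs over half-open intervals [s, f)
        ordered = sorted(activities, key=lambda act: act[1])  # earliest finish first
        result = []
        last_finish = float("-inf")
        for start, finish in ordered:
            if start >= last_finish:      # compatible with everything chosen so far
                result.append((start, finish))
                last_finish = finish
        return result

    activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
    print(find_max_set(activities))  # [(1, 4), (5, 7), (8, 11)]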

  30–32. Activity Selection Problem
  Is the greedy choice enough to get an optimal solution?
  Greedy choice property: prove that if a_m has the earliest finish time, it must be included in some optimal solution.
  Assume a set S and an optimal solution A where a_m ∉ A.
  Let a_j be the activity with the earliest finish time in A (not in S).
  Compose another set A' = A − {a_j} ∪ {a_m}.
  A' still has all its activities disjoint (a_m has the globally earliest finish time, and the activities of A are already disjoint), and |A'| = |A|.
  Then A' is an optimal solution.
  Then a_m is always included in some optimal solution.
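
  A hypothetical sanity check (not from the slides): on small random instances, the earliest-finish-first greedy should always match the brute-force optimum in size, which is what the greedy choice property plus optimal substructure guarantee.

    import random
    from itertools import combinations

    def compatible(a, b):
        return a[1] <= b[0] or b[1] <= a[0]

    def brute_force_size(acts):
        for r in range(len(acts), 0, -1):
            for sub in combinations(acts, r):
                if all(compatible(x, y) for i, x in enumerate(sub) for y in sub[i + 1:]):
                    return r
        return 0

    def greedy_size(acts):
        last, count = float("-inf"), 0
        for s, f in sorted(acts, key=lambda act: act[1]):
            if s >= last:
                count, last = count + 1, f
        return count

    for _ in range(200):
        acts = [(s, s + random.randint(1, 5)) for s in random.sample(range(20), 8)]
        assert greedy_size(acts) == brute_force_size(acts)
    print("greedy matched brute force on all random instances")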

  33. Elements of a Greedy Algorithm
  1. Optimal Substructure
  2. Greedy Choice Property

  34–37. Greedy vs. Dynamic Programming
  Solving the bigger problem involves:
  One choice (greedy) vs. multiple possible choices (DP).
  One subproblem (greedy) vs. a lot of overlapping subproblems (DP).
  Both have optimal substructure.
  Elements:
  Greedy: optimal substructure, greedy choice property.
  DP: optimal substructure, overlapping subproblems.

  38. Knapsack Problem (figure: example instance, knapsack capacity 45)

  39. Knapsack Problem (figure: instance with capacity 45)
  0-1 Knapsack: each item is either included or not.
  Greedy choices:
  Take the most valuable first → does not lead to an optimal solution.
  Take the most valuable per weight unit → works in this example.

  40. Knapsack Problem (figure: modified instance with capacity 30)
  0-1 Knapsack: each item is either included or not.
  Greedy choices:
  Take the most valuable first → does not lead to an optimal solution.
  Take the most valuable per weight unit → does not work here either.
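
  The figure itself is not preserved; the sketch below uses a classic textbook-style instance (not necessarily the one on the slide) where the value-per-weight greedy fails for 0-1 knapsack:

    from itertools import combinations

    capacity = 50
    items = [(10, 60), (20, 100), (30, 120)]  # (weight, value)

    # Greedy by value per weight unit, taking whole items only.
    greedy_value, remaining = 0, capacity
    for w, v in sorted(items, key=lambda it: it[1] / it[0], reverse=True):
        if w <= remaining:
            greedy_value += v
            remaining -= w

    # Optimal 0-1 answer by checking every subset (fine for 3 items).
    best = max(sum(v for _, v in sub)
               for r in range(len(items) + 1)
               for sub in combinations(items, r)
               if sum(w for w, _ in sub) <= capacity)

    print(greedy_value, best)  # 160 vs 220: the greedy choice is not optimal here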

  41–42. Knapsack Problem (figure: same instance, capacity 30)
  Fractional Knapsack: parts of items can be included.
  Greedy choices:
  Take the most valuable first → does not lead to an optimal solution.
  Take the most valuable per weight unit → does work.

  43–44. Fractional Knapsack Problem
  Greedy choice property: take the most valuable per weight unit.
  Proof of optimality:
  Given the set S ordered by value per weight, taking as much as possible, x_j, of the item j with the highest value per weight leads to an optimal solution X.
  Assume we have another optimal solution X' in which we take a smaller amount of item j, say x_j' < x_j.
  Since x_j' < x_j, there must be another item k that is taken in a larger amount in X', i.e., x_k' > x_k.
  We create another solution X'' by making the following change in X': reduce the amount of item k by Δ and increase the amount of item j by Δ.
  The value of the new solution is V'' = V' + Δ·v_j/w_j − Δ·v_k/w_k = V' + Δ(v_j/w_j − v_k/w_k). Since v_j/w_j − v_k/w_k ≥ 0, we get V'' ≥ V'.
  Repeating this exchange transforms X' into the greedy solution X without ever decreasing the total value, so X is also optimal.
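
  A minimal sketch of the fractional knapsack greedy on a hypothetical instance (not the one from the slides): sort by value per weight and take as much as fits, possibly a fraction of the last item taken.

    def fractional_knapsack(items, capacity):
        # items: list of (weight, value); returns the maximum achievable total value
        total = 0.0
        remaining = capacity
        for w, v in sorted(items, key=lambda it: it[1] / it[0], reverse=True):
            if remaining <= 0:
                break
            take = min(w, remaining)          # whole item if it fits, else a fraction
            total += v * (take / w)
            remaining -= take
        return total

    items = [(10, 60), (20, 100), (30, 120)]
    print(fractional_knapsack(items, 50))  # 240.0: items 1 and 2 whole, 2/3 of item 3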

  45. Fractional Knapsack Problem: Optimal Substructure
