  1. CSC373 Weeks 9 & 10: Approximation Algorithms & Local Search (373F19, Nisarg Shah & Karan Singh)

  2. NP-Completeness
     • We saw that many problems are NP-complete
       ➢ Unlikely to have polynomial-time algorithms to solve them
       ➢ What can we do?
     • One idea:
       ➢ Instead of solving them exactly, solve them approximately
       ➢ Sometimes, we might want to use an approximation algorithm even when we can compute an exact solution in polynomial time (WHY?)

  3. Approximation Algorithms
     • We'll focus on optimization problems
       ➢ Decision problem: "Is there ... where ... ≥ k?"
         o E.g. "Is there an assignment which satisfies at least k clauses of a given formula φ?"
       ➢ Optimization problem: "Find ... which maximizes ..."
         o E.g. "Find an assignment which satisfies the maximum possible number of clauses of a given formula φ."
       ➢ Recall that if the decision problem is hard, then the optimization problem is hard too

  4. Approximation Algorithms
     • There is a function Profit we want to maximize, or a function Cost we want to minimize
     • Given an input instance I ...
       ➢ Our algorithm returns a solution ALG(I)
       ➢ An optimal solution maximizing Profit or minimizing Cost is OPT(I)
       ➢ Then the approximation ratio of ALG on instance I is Profit(OPT(I)) / Profit(ALG(I)) or Cost(ALG(I)) / Cost(OPT(I))

  5. Approximation Algorithms
     • Approximation ratio of ALG on instance I is Profit(OPT(I)) / Profit(ALG(I)) or Cost(ALG(I)) / Cost(OPT(I))
       ➢ Note: These are defined to be ≥ 1 in each case.
         o 2-approximation = half the optimal profit / twice the optimal cost
     • ALG is a worst-case c-approximation if for each instance I ... (illustrated below)
       Profit(ALG(I)) ≥ (1/c) · Profit(OPT(I))   or   Cost(ALG(I)) ≤ c · Cost(OPT(I))
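
The ≥ 1 convention above can be captured in a few lines of Python. This is only an illustrative sketch; the function name and arguments are mine, not part of the course material.

```python
def approximation_ratio(alg_value: float, opt_value: float, maximize: bool) -> float:
    """Return Profit(OPT)/Profit(ALG) for a maximization problem,
    or Cost(ALG)/Cost(OPT) for a minimization problem (both >= 1)."""
    if maximize:
        return opt_value / alg_value   # >= 1: ALG earns at most the optimal profit
    else:
        return alg_value / opt_value   # >= 1: ALG pays at least the optimal cost

# Example: earning profit 50 when the optimum is 100 is a 2-approximation.
print(approximation_ratio(50, 100, maximize=True))  # 2.0
```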

  6. Note
     • By default, when we say c-approximation, we will always mean c-approximation in the worst case
       ➢ It is also interesting to look at approximation in the average case, when your inputs are drawn from some distribution
     • Our use of approximation ratios ≥ 1 is just a convention
       ➢ Some books and papers use the ≤ 1 convention for approximation ratios
       ➢ E.g. they might say 0.5-approximation to mean that the algorithm generates at least half the optimal profit or has at most twice the optimal cost

  7. PTAS and FPTAS
     • Approximations arbitrarily close to 1
     • FPTAS: Fully polynomial-time approximation scheme
       ➢ For every ε > 0, there is a (1 + ε)-approximation algorithm that runs in time poly(n, 1/ε) on instances of size n
     • PTAS: Polynomial-time approximation scheme
       ➢ For every ε > 0, there is a (1 + ε)-approximation algorithm that runs in time poly(n) on instances of size n
         o Note: the running time could have an exponential dependence on 1/ε

  8. Approximation Landscape (n = parameter of the problem at hand; the impossibility of better approximations assumes widely held beliefs like P ≠ NP)
     ➢ An FPTAS
       o E.g. the knapsack problem
     ➢ A PTAS but no FPTAS
       o E.g. the makespan problem (we'll see)
     ➢ A c-approximation for a constant c > 1, but no PTAS
       o E.g. vertex cover and JISP (we'll see)
     ➢ A Θ(log n)-approximation, but no constant-factor approximation
       o E.g. set cover
     ➢ No n^(1−ε)-approximation for any ε > 0
       o E.g. graph coloring and maximum independent set

  9. Makespan Minimization

  10. Makespan
     • Problem
       ➢ Input: m identical machines; n jobs, where job j requires processing time t_j
       ➢ Output: an assignment of jobs to machines that minimizes the makespan
       ➢ Let S_i = set of jobs assigned to machine i in a solution
       ➢ Constraints:
         o Each job must run contiguously on one machine
         o Each machine can process at most one job at a time
       ➢ Load on machine i: L_i = Σ_{j ∈ S_i} t_j
       ➢ Goal: minimize the makespan L = max_i L_i (computed in the short sketch below)
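
As a small illustration of the objective (my code, with made-up names, not from the slides): given processing times t_j and an assignment of jobs to machines, compute each load L_i and the makespan L = max_i L_i.

```python
def makespan(times, assignment, m):
    """times[j] = processing time t_j; assignment[j] = machine of job j (0..m-1)."""
    loads = [0] * m                      # L_i for each machine i
    for j, machine in enumerate(assignment):
        loads[machine] += times[j]
    return max(loads)                    # L = max_i L_i

# Example: 2 machines, jobs of lengths 3, 3, 4 assigned as [0, 0, 1] -> makespan 6.
print(makespan([3, 3, 4], [0, 0, 1], m=2))  # 6
```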

  11. Makespan
     • Even the special case of m = 2 machines is already NP-hard, by reduction from PARTITION
     • PARTITION
       ➢ Input: A set S containing n integers
       ➢ Output: Can we partition S into two sets with equal sum (i.e. S = S_1 ∪ S_2, S_1 ∩ S_2 = ∅, and Σ_{x ∈ S_1} x = Σ_{x ∈ S_2} x)?
       ➢ Exercise! (A small illustration of the connection follows below.)
         o Show that PARTITION is NP-complete by reduction from SUBSET-SUM
         o Show that if there is a polynomial-time algorithm for MAKESPAN with 2 machines, then you can solve PARTITION in polynomial time
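
One way to see the connection the exercise points at (this is my illustration, not a solution): S has an equal-sum split exactly when the optimal 2-machine makespan on processing times S equals sum(S)/2. The brute-force search below is exponential and only checks the equivalence on a toy instance.

```python
from itertools import product

def partition_via_makespan(S):
    total = sum(S)
    if total % 2 == 1:
        return False
    best = total                                       # best makespan found so far
    for assignment in product([0, 1], repeat=len(S)):  # try every 2-machine assignment
        loads = [0, 0]
        for x, machine in zip(S, assignment):
            loads[machine] += x
        best = min(best, max(loads))
    return best == total // 2                          # equal split <=> makespan = total/2

print(partition_via_makespan([3, 1, 1, 2, 2, 1]))      # True: {3, 2} vs {1, 1, 2, 1}
```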

  12. Makespan
     • Greedy list-scheduling algorithm
       ➢ Consider the n jobs in some "nice" sorted order
       ➢ Assign each job j to a machine with the smallest load so far
     • Note
       ➢ Implementable in O(n log m) using a priority queue (see the sketch below)
     • Back to greedy ...?
       ➢ But this time, we can't hope that greedy will be optimal
       ➢ We can still hope that it is approximately optimal
     • Which order?
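
A minimal sketch of greedy list scheduling, assuming a heap keyed by current load (the function name and the example instance are mine):

```python
import heapq

def greedy_list_schedule(times, m):
    """Assign jobs, in the order given by `times`, to the currently least-loaded
    of m machines; return the resulting makespan. O(n log m) with a heap."""
    heap = [(0, i) for i in range(m)]       # (current load, machine index)
    heapq.heapify(heap)
    for t in times:
        load, i = heapq.heappop(heap)       # machine with the smallest load so far
        heapq.heappush(heap, (load + t, i))
    return max(load for load, _ in heap)

# Jobs in an arbitrary order on 3 machines:
print(greedy_list_schedule([2, 3, 7, 1, 4, 6], m=3))  # 9, while the optimum is 8
```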

  13. Makespan
     • Theorem [Graham 1966]
       ➢ Regardless of the order, greedy gives a 2-approximation
       ➢ This was the first worst-case approximation analysis
     • Let the optimal makespan be L*
     • To show that the makespan of the greedy solution is not much worse than L*, we need to show that L* isn't too low

  14. Makespan
     • Theorem [Graham 1966]
       ➢ Regardless of the order, greedy gives a 2-approximation
     • Fact 1: L* ≥ max_j t_j
       ➢ Some machine must process the job with the highest processing time
     • Fact 2: L* ≥ (1/m) · Σ_j t_j
       ➢ The total processing time is Σ_j t_j
       ➢ At least one machine must do at least 1/m of this work (pigeonhole principle; both bounds are illustrated below)
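
The two lower bounds combine into a single quantity that any schedule, including the optimal one, must reach. A tiny sketch (my naming):

```python
def makespan_lower_bound(times, m):
    """max(Fact 1, Fact 2): L* >= max_j t_j and L* >= (1/m) * sum_j t_j."""
    return max(max(times), sum(times) / m)

print(makespan_lower_bound([2, 3, 7, 1, 4, 6], m=3))  # max(7, 23/3) = 7.67 (approx.)
```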

  15. Makespan
     • Theorem [Graham 1966]
       ➢ Regardless of the order, greedy gives a 2-approximation
     • Proof:
       ➢ Suppose machine i is the bottleneck under greedy (so its load is L_i)
       ➢ Let j* = the last job scheduled on i by greedy
       ➢ Right before j* was assigned to i, machine i had the smallest load
         o The loads of the other machines can only have increased since then
         o So L_i − t_{j*} ≤ L_k for all machines k
       ➢ Averaging over all k: L_i − t_{j*} ≤ (1/m) · Σ_j t_j
       ➢ Hence L_i ≤ t_{j*} + (1/m) · Σ_j t_j ≤ L* + L* = 2L*, using Fact 1 for the first term and Fact 2 for the second (the chain is restated below)
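
Restating the chain of inequalities in one place (my compact rewrite of the slide's argument):

```latex
\begin{align*}
L_i - t_{j^*} &\le L_k \quad \text{for every machine } k
  && \text{($i$ had the least load when $j^*$ was assigned)}\\
L_i - t_{j^*} &\le \tfrac{1}{m} \sum\nolimits_j t_j
  && \text{(average over all $k$)}\\
L_i &\le t_{j^*} + \tfrac{1}{m} \sum\nolimits_j t_j \le L^* + L^* = 2L^*
  && \text{(Fact 1 and Fact 2)}
\end{align*}
```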

  16. Makespan
     • Theorem [Graham 1966]
       ➢ Regardless of the order, greedy gives a 2-approximation
     • Is our analysis tight?
       ➢ Essentially.
       ➢ There is an example where greedy really does perform this badly.
       ➢ Note: In the upcoming example, greedy is only as bad as 2 − 1/m, but you can also tighten the earlier analysis to show that greedy always gives a (2 − 1/m)-approximation.
       ➢ So 2 − 1/m is exactly tight.

  17. Makespan
     • Theorem [Graham 1966]
       ➢ Regardless of the order, greedy gives a 2-approximation
     • Is our analysis tight?
       ➢ Example:
         o m(m − 1) jobs of length 1, followed by one job of length m
         o Greedy distributes the unit-length jobs evenly across all m machines, and assigning the last heavy job then makes the makespan (m − 1) + m = 2m − 1
         o The optimal makespan is m: distribute the unit-length jobs evenly among m − 1 machines and put the single heavy job on the remaining machine
       ➢ Idea: Keeping the heavy jobs until the end seems to be what hurts, so just start with them first! (A quick check of this example follows below.)
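
A quick numerical check of this example, reusing the greedy_list_schedule sketch from slide 12 (so that function is assumed to be in scope; the choice m = 5 is arbitrary):

```python
m = 5
times = [1] * (m * (m - 1)) + [m]          # unit jobs first, heavy job last
print(greedy_list_schedule(times, m))      # 2m - 1 = 9
# Optimal: spread the unit jobs over m - 1 machines (load m each) and give the
# heavy job its own machine, for makespan m = 5.
```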

  18. Makespan
     • Longest Processing Time (LPT) First
       ➢ Run the greedy algorithm, but consider the jobs in decreasing order of their processing time (see the sketch below)
     • We need more facts about what the optimum cannot beat
     • Fact 3: If the bottleneck machine has only one job, then the solution is optimal.
       ➢ The optimal solution must also schedule that job on some machine
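
LPT is the same greedy with the jobs pre-sorted. A minimal sketch, again assuming the greedy_list_schedule function from slide 12 is in scope:

```python
def lpt_makespan(times, m):
    # Longest Processing Time first: sort decreasingly, then run plain greedy.
    return greedy_list_schedule(sorted(times, reverse=True), m)

# On the tight example from slide 17, LPT places the heavy job first and is optimal:
m = 5
print(lpt_makespan([1] * (m * (m - 1)) + [m], m))  # 5
```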

  19. Makespan
     • Longest Processing Time (LPT) First
       ➢ Run the greedy algorithm, but consider the jobs in decreasing order of their processing time
       ➢ Suppose t_1 ≥ t_2 ≥ ⋯ ≥ t_n
     • Fact 4: If there are more than m jobs, then L* ≥ 2 · t_{m+1}
       ➢ Consider the first m + 1 jobs
       ➢ All of them require processing time at least t_{m+1}
       ➢ By the pigeonhole principle, in the optimal solution at least two of them end up on the same machine

  20. Makespan
     • Theorem
       ➢ Greedy with longest processing time first gives a 3/2-approximation
     • Proof:
       ➢ Similar to the proof for an arbitrary ordering
       ➢ Consider the bottleneck machine i and the job j* that greedy scheduled last on that machine
       ➢ Case 1: Machine i has only the one job j*
         o By Fact 3, greedy is optimal in this case (i.e. a 1-approximation)
