CS4102 Algorithms

  1. CS4102 Algorithms Summer 2020 Warm up: Why is an algorithm’s space complexity (how much memory it uses) important? Why might a memory-intensive algorithm be a “bad” one?

  2. Why lots of memory is “bad”

  3. Greedy Algorithms • Require Optimal Substructure – Solution to larger problem contains the solution to a smaller one – Only one subproblem to consider! • Idea: 1. Identify a greedy choice property • How to make a choice guaranteed to be included in some optimal solution 2. Repeatedly apply the choice property until no subproblems remain
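
A rough Python illustration of this two-step recipe (a minimal sketch, not from the slides; choose_greedily and shrink are hypothetical problem-specific helpers):

    # Generic shape of a greedy algorithm: repeatedly make the choice that the
    # greedy choice property guarantees is in some optimal solution, then
    # recurse on the single remaining subproblem (optimal substructure).
    def greedy(problem, choose_greedily, shrink):
        solution = []
        while problem:                          # some subproblem remains
            choice = choose_greedily(problem)   # greedy choice property
            solution.append(choice)
            problem = shrink(problem, choice)   # one smaller subproblem
        return solution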

  4. Exchange argument • Shows correctness of a greedy algorithm • Idea: – Show that exchanging an item from an arbitrary optimal solution with your greedy choice makes the new solution no worse – How to show my sandwich is at least as good as yours: • Show: “I can remove any item from your sandwich, replace it with the same item from my sandwich, and your sandwich is no worse”

  5. Von Neumann Bottleneck • Named for John von Neumann • Inventor of modern computer architecture • Other notable influences include: – Mathematics – Physics – Economics – Computer Science

  6. Von Neumann Bottleneck • Reading from memory is VERY slow • Big memory = slow memory • Solution: hierarchical memory • Takeaway for Algorithms: memory is time, and more memory is a lot more time [Diagram: memory hierarchy. CPU/registers (access time: 1 cycle, hopefully your data is in here), cache (10 cycles, if not, look here), disk (1,000,000 cycles, hope it’s not here)]
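
A back-of-the-envelope Python sketch of the takeaway (the miss probability below is a made-up illustrative number; only the cycle counts come from the diagram above):

    # Expected access time when most accesses hit the cache but a small
    # fraction must go all the way to disk.
    cache_cycles, disk_cycles = 10, 1_000_000   # access times from the slide
    p_disk = 0.0001                             # hypothetical: 1 in 10,000 accesses goes to disk
    expected = (1 - p_disk) * cache_cycles + p_disk * disk_cycles
    print(expected)                             # about 110 cycles: the rare disk accesses dominate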

  7. Caching Problem • Cache misses are very expensive • When we load something new into cache, we must eliminate something already there • We want the best cache “schedule” to minimize the number of misses

  8. Caching Problem Definition • Input: – k = size of the cache – M = m_1, m_2, …, m_n = the memory access pattern • Output: – a “schedule” for the cache (the list of items in the cache at each time) which minimizes cache fetches
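
To make the objective concrete, here is a minimal Python sketch (my own helper, not from the slides) that counts the fetches of a schedule represented as the list of cache contents at each access:

    def count_fetches(schedule):
        """Count fetches: any item in the cache at some step that was not
        there at the previous step had to be fetched (initial loads count)."""
        fetches, prev = 0, set()
        for contents in schedule:       # contents = items cached at this access
            fetches += len(set(contents) - prev)
            prev = set(contents)
        return fetches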

  9. Example • Access pattern (cache size 3): A B C A B C D A D E A D B A E C E A

  10. Example • The first two accesses load A and B into the cache

  11. Example • After the first three accesses the cache holds A, B, C, and the next three accesses (A, B, C) are hits

  12. Example • On the access to D the cache {A, B, C} is full: we must evict something to make room for D

  13. Example • If we evict A, the cache becomes {D, B, C}

  14. Example • If we evict C, the cache becomes {A, B, D}

  15. Our Problem vs Reality • Assuming we know the entire access pattern • Cache is fully associative • Counting # of fetches (not necessarily misses) • “Reduced” schedule: an address is only loaded on the cycle it’s required – Reduced == unreduced (by number of fetches) [Diagram: an unreduced and a reduced schedule for the access pattern A B C A B C D A D E A D B A E C E A; leaving A in the cache longer does not save fetches]

  16. Greedy Algorithms • Require Optimal Substructure – Solution to larger problem contains the solution to a smaller one – Only one subproblem to consider! • Idea: 1. Identify a greedy choice property • How to make a choice guaranteed to be included in some optimal solution 2. Repeatedly apply the choice property until no subproblems remain

  17. Greedy choice property • Belady evict rule: – Evict the item accessed farthest in the future • On the miss for D (cache {A, B, C}), C’s next access is farthest away: evict C

  18. Greedy choice property • Belady evict rule: – Evict the item accessed farthest in the future • On the miss for E (cache {A, B, D}), B’s next access is farthest away: evict B

  19. Greedy choice property • Belady evict rule: – Evict the item accessed farthest in the future • On the miss for B (cache {A, D, E}), D is never accessed again: evict D

  20. Greedy choice property • Belady evict rule: – Evict the item accessed farthest in the future • On the miss for C (cache {A, E, B}), B is never accessed again: evict B

  21. Greedy choice property • Belady evict rule: – Evict the item accessed farthest in the future • Completing the trace of A B C A B C D A D E A D B A E C E A gives 4 cache misses (not counting the initial loads of A, B, C)
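
A minimal Python sketch of the evict rule itself (the helper name and signature are my own; i is the index of the current access in the pattern):

    def farthest_in_future(cache, pattern, i):
        """Belady evict rule: among cached items, return the one whose next
        access after position i is farthest away (or never happens)."""
        def next_use(item):
            for t in range(i + 1, len(pattern)):
                if pattern[t] == item:
                    return t
            return float('inf')          # never accessed again: ideal victim
        return max(cache, key=next_use)

For example, farthest_in_future(['A', 'B', 'C'], list('ABCABCDADEADBAECEA'), 6) returns 'C', matching the first eviction in the trace above.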

  22. Greedy Algorithms • Require Optimal Substructure – Solution to larger problem contains the solution to a smaller one – Only one subproblem to consider! • Idea: 1. Identify a greedy choice property • How to make a choice guaranteed to be included in some optimal solution 2. Repeatedly apply the choice property until no subproblems remain

  23. Caching Greedy Algorithm
      Initialize cache = first k accesses                O(k)
      For each m_j ∈ M:                                  n times
          if m_j ∈ cache:                                O(k)
              print cache                                O(k)
          else:
              m = farthest-in-future item in cache       O(kn)
              evict m, load m_j                          O(1)
              print cache                                O(k)
      Total: O(kn²)
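
A direct Python translation of this pseudocode (a sketch under the slides’ assumptions: the first k accesses are distinct and are pre-loaded, and only fetches after that initial load are counted):

    def greedy_caching(pattern, k):
        """Farthest-in-future caching. Returns the schedule (cache contents
        after each access) and the number of fetches after the initial load."""
        def next_use(item, i):
            # Index of the next access of `item` strictly after position i.
            for t in range(i + 1, len(pattern)):
                if pattern[t] == item:
                    return t
            return float('inf')                      # never needed again

        cache = list(dict.fromkeys(pattern))[:k]     # first k (distinct) accesses, O(k)
        schedule, fetches = [], 0
        for j, m in enumerate(pattern):              # n times
            if m not in cache:                       # O(k) membership test
                fetches += 1
                victim = max(cache, key=lambda x: next_use(x, j))  # O(kn) scan
                cache[cache.index(victim)] = m       # evict victim, load m: O(k)
            schedule.append(list(cache))             # "print cache": O(k)
        return schedule, fetches                     # total O(kn^2)

On the slides’ example with k = 3, greedy_caching(list('ABCABCDADEADBAECEA'), 3) reports 4 fetches, matching the trace on slides 17 through 21.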

  24. Exchange argument • Shows correctness of a greedy algorithm • Idea: – Show that exchanging an item from an arbitrary optimal solution with your greedy choice makes the new solution no worse – How to show my sandwich is at least as good as yours: • Show: “I can remove any item from your sandwich, replace it with the same item from my sandwich, and your sandwich is no worse”

  25. Belady Exchange Lemma Let S_ff be the schedule chosen by our greedy algorithm. Let S_j be a schedule which agrees with S_ff for the first j memory accesses. We will show: there is a schedule S_{j+1} which agrees with S_ff for the first j + 1 memory accesses and has no more misses than S_j, i.e. misses(S_{j+1}) ≤ misses(S_j) [Diagram: a chain of schedules starting from an optimal S* (agrees with S_ff on the first 0 accesses), then S_1 (agrees on the first access), S_2 (agrees on the first 2 accesses), …, ending with a schedule that agrees with S_ff on all n accesses] Applying the lemma n times gives misses(S_ff) ≤ … ≤ misses(S_1) ≤ misses(S*), so the greedy schedule misses no more than an optimal schedule

  26. Belady Exchange Proof Idea [Diagram: timelines for S_j, S_{j+1}, and S_ff; the first j + 1 accesses of S_{j+1} must agree with S_ff, and we need to fill in the rest of S_{j+1} so that it has no more misses than S_j]

  27. Proof of Lemma Goal: find S_{j+1} s.t. misses(S_{j+1}) ≤ misses(S_j). Since S_j agrees with S_ff for the first j accesses, the state of the cache at access j + 1 will be the same (say both caches hold d, e, f). Consider access m_{j+1} = d. Case 1: if d is in the cache, then neither S_j nor S_ff evicts from the cache; use the same cache for S_{j+1}

  28. Proof of Lemma Goal: find S_{j+1} s.t. misses(S_{j+1}) ≤ misses(S_j). Since S_j agrees with S_ff for the first j accesses, the state of the cache at access j + 1 will be the same. Consider access m_{j+1} = d. Case 2: if d isn’t in the cache, and both S_j and S_ff evict f from the cache, then evict f for d in S_{j+1} as well

  29. Proof of Lemma Goal: find S_{j+1} s.t. misses(S_{j+1}) ≤ misses(S_j). Since S_j agrees with S_ff for the first j accesses, the state of the cache at access j + 1 will be the same. Consider access m_{j+1} = d. Case 3: if d isn’t in the cache, S_j evicts e and S_ff evicts f from the cache; after access j + 1 the two caches differ (S_j’s cache contains f but not e, S_ff’s cache contains e but not f)

  30. Case 3 [Diagram: as on slide 26, we need to fill in the rest of S_{j+1} (whose first j + 1 accesses agree with S_ff) so that it has no more misses than S_j]

  31. Case 3 • Copy S_j into S_{j+1} after access j + 1, up to access m_t • m_t = the first access after j + 1 in which S_j deals with e or f • 3 options: m_t = e, or m_t = f, or m_t = x ≠ e, f

  32. Case 3, m_t = e [Diagram: copy S_j into S_{j+1} up to m_t, the first place S_j uses e or f] • m_t = the first access after j + 1 in which S_j deals with e or f • 3 options: m_t = e, or m_t = f, or m_t = x ≠ e, f
