CSE 326: Data Structures
Lecture #4: Heaps & more Priority Qs
Bart Niswonger, Summer Quarter 2001



  1. Today's Outline
     • Return quizzes
     • Things Bart Didn't Finish on Friday (insert & d-Heaps)
     • Leftist Heaps
     • Skew Heaps
     • Comparing Heaps

  2. Priority Queue ADT
     • Priority Queue operations
       – create
       – destroy
       – insert
       – deleteMin
       – is_empty
     • Priority Queue property: for two elements in the queue, x and y,
       if x has a lower priority value than y, x will be deleted before y
     [Figure: a priority queue with insert feeding in F(7), E(5), G(9),
     D(100), A(4), C(3), B(6) and deleteMin coming out]

     Nifty Storage Trick
     • Calculations (for the example heap stored root-first in the array
       2 4 5 7 6 10 8 11 9 12 14 20, with index 0 unused):
       – child: 2i and 2i + 1
       – parent: i / 2 (integer division)
       – root: index 1
       – next free: size + 1
     [Figure: the same heap drawn as a complete binary tree, each node
     shown with its value and its array index]

  3. DeleteMin
     pqueue.deleteMin()
     [Figure: the heap 2 4 5 7 6 10 8 11 9 12 14 20; removing the root
     2 leaves a hole "?" at the top to be filled]

     Insert
     pqueue.insert(3)
     [Figure: the same heap with a new hole "?" opened at the next free
     array position for the value 3]

  4. Percolate Up
     [Figure: inserting 3 – the hole bubbles up past 10 and then 5
     until its parent (2) is no larger, and 3 drops into the hole]

     Insert Code
     void insert(Object o) {
       assert(!isFull());
       size++;
       newPos = percolateUp(size, o);
       Heap[newPos] = o;
     }

     int percolateUp(int hole, Object val) {
       while (hole > 1 && val < Heap[hole/2]) {
         Heap[hole] = Heap[hole/2];
         hole /= 2;
       }
       return hole;
     }

     runtime:

  5. Other Priority Queue Operations
     • decreaseKey – given the position of an object in the queue,
       reduce its priority value
     • increaseKey – given the position of an object in the queue,
       increase its priority value
     • remove – given the position of an object in the queue, remove it
     • buildHeap – given a set of items, build a heap

     DecreaseKey, IncreaseKey, and Remove
     void decreaseKey(int obj) {
       assert(size >= obj);
       temp = Heap[obj];
       newPos = percolateUp(obj, temp);
       Heap[newPos] = temp;
     }

     void increaseKey(int obj) {
       assert(size >= obj);
       temp = Heap[obj];
       newPos = percolateDown(obj, temp);
       Heap[newPos] = temp;
     }

     void remove(int obj) {
       assert(size >= obj);
       percolateUp(obj, NEG_INF_VAL);
       deleteMin();
     }

  6. BuildHeap
     Floyd's Method. Thank you, Floyd.
     12 5 11 3 10 6 9 4 8 1 7 2
     pretend it's a heap and fix the heap-order property!
     [Figure: the array drawn as a complete binary tree]

     Build(this)Heap
     [Figure: four snapshots of the tree as heap order is fixed
     bottom-up, one subtree at a time, ending with 1 at the root]

  7. Finally…
     1 3 2 4 5 6 9 12 8 10 7 11
     runtime:

     Thinking about Heaps
     • Observations
       – finding a child/parent index is a multiply/divide by two
       – operations jump widely through the heap
       – each operation looks at only two new nodes
       – inserts are at least as common as deleteMins
     • Realities
       – division and multiplication by powers of two are fast
       – looking at one new piece of data sucks in a cache line
       – with huge data sets, disk accesses dominate

  8. Solution: d-Heaps
     • Each node has d children
     • Still representable by array
     • Good choices for d:
       – optimize performance based on # of inserts/removes
       – choose a power of two for efficiency
       – fit one set of children in a cache line
       – fit one set of children on a memory page/disk block
     [Figure: a 3-heap with root 1, stored in the array
     1 3 7 2 4 8 5 12 11 10 6 9]

     One More Operation
     • Merge two heaps. Ideas?

  9. Merge
     Given two heaps, merge them into one heap
     – first attempt: insert each element of the smaller heap into the
       larger.
       runtime:
     – second attempt: concatenate heaps' arrays and run buildHeap.
       runtime:
     How about O(log n) time?

     Idea: Hang a New Tree
     [Figure: heaps rooted at 2 and 1 hung under a new "?" root]
     Now, just percolate down!

  10. Idea: Hang a New Tree
      [Figure: hanging one heap (rooted at 2, with subtrees 5 11 and
      6 10 13 12) under the other can leave a long, unbalanced path]
      Problem?

      Leftist Heaps
      • Idea: make it so that all the work you have to do in
        maintaining a heap is in one small part
      • Leftist heap:
        – almost all nodes are on the left
        – all the merging work is on the right

  11. Random Definition: Null Path Length
      the null path length (npl) of a node is the number of nodes
      between it and a null in the tree
      • npl(null) = -1
      • npl(leaf) = 0
      • npl(single-child node) = 0
      another way of looking at it: npl is the height of the complete
      subtree rooted at this node
      [Figure: a tree with each node labeled by its npl, root 2]

      Leftist Heap Properties
      • Heap-order property
        – parent's priority value is ≤ children's priority values
        – result: minimum element is at the root
      • Leftist property
        – null path length of left subtree is ≥ npl of right subtree
        – result: tree is at least as "heavy" on the left as the right
      Are leftist trees complete? Balanced?

  12. Leftist tree examples
      [Figure: three trees labeled with npl values – two leftist, one
      NOT leftist] every subtree of a leftist tree is leftist, comrade!

      Right Path in a Leftist Tree is Short
      • If the right path has length at least r, the tree has at least
        2^r - 1 nodes
      • Proof by induction
        Basis: r = 1. Tree has at least one node: 2^1 - 1 = 1
        Inductive step: assume true for r' < r. The right subtree has a
        right path of at least r - 1 nodes, so it has at least
        2^(r-1) - 1 nodes. The left subtree must also have a right path
        of at least r - 1 (otherwise, there is a null path of r - 3,
        less than the right subtree). Again, the left has 2^(r-1) - 1
        nodes. All told then, there are at least:
        2^(r-1) - 1 + 2^(r-1) - 1 + 1 = 2^r - 1
      • So, a leftist tree with at least n nodes has a right path of at
        most log n nodes

  13. Whew!

      To Do
      • Unix development Tutorial – Tuesday – 10:50am – Sieg 322
      • Finish Project I for Wednesday
      • Read chapters 1 & 2

  14. Coming Up
      • Theory!
      • Proof by Induction
      • Asymptotic Analysis
      • Quiz #2 (Thursday)
