Algorithms in a Nutshell
Session 4 Recap: Algorithm Themes (11:20 – 11:40)
Outline
• Data Structures
  – Array, Linked List, Queue, Heap, Priority Queue, Tree, Graph
• Space vs. time tradeoff
• Approaches
  – Divide and conquer
  – Greedy algorithm
Common Data Structures
• Basic structures (Get/Set columns are indexed access)

    Structure     Insert    Delete    Get i-th   Set i-th   Find
    Array         O(n)      O(n)      O(1)       O(1)       O(n)
    Linked List   O(1)      O(n)      O(n)       O(n)       O(n)
    Stack         O(1)      O(1)      --         --         --
    Queue         O(1)      O(1)      --         --         --

• Linked List: insert adds to the front; remove can extract from any location
• Stack: insert is push; remove is pop
• Queue: insert adds to one end (the tail); remove extracts from the other end
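A small usage sketch (my own illustration, not from the slides) of the stack and queue disciplines above, using java.util.ArrayDeque:

    import java.util.ArrayDeque;

    public class StackQueueDemo {
        public static void main(String[] args) {
            // Stack discipline: insert is push, remove is pop (last in, first out).
            ArrayDeque<String> stack = new ArrayDeque<>();
            stack.push("a");
            stack.push("b");
            System.out.println(stack.pop());        // b

            // Queue discipline: insert at the tail, remove from the head (first in, first out).
            ArrayDeque<String> queue = new ArrayDeque<>();
            queue.addLast("a");
            queue.addLast("b");
            System.out.println(queue.pollFirst());  // a
        }
    }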
Dynamic vs. Static Sizes
• Fixed-size allocation via arrays
  [Figure: a Stack in an array of n=8 slots (idx=4, holding a, t, e, g) and a Queue in an array of n=8 slots (head=6, tail=2).]
• Increase size by allocating more memory
  – Don't increase by a fixed amount; double the capacity instead
  – If you only add a linear amount each time, resizing is too inefficient

    int oldCapacity = table.length;
    Entry[] oldMap = table;
    int newCapacity = oldCapacity * 2 + 1;
    Entry[] newMap = new Entry[newCapacity];
    table = newMap;
    ...
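Below is a hedged, self-contained sketch of the doubling strategy; the DynamicArray class and grow() method are hypothetical names of my own, not the slide's code:

    import java.util.Arrays;

    public class DynamicArray {
        private Object[] table = new Object[8];
        private int size = 0;

        // Double (plus one) the capacity, as in the slide's fragment,
        // rather than growing by a fixed amount each time.
        private void grow() {
            int newCapacity = table.length * 2 + 1;
            table = Arrays.copyOf(table, newCapacity);
        }

        public void add(Object element) {
            if (size == table.length) {
                grow();               // doubling makes add amortized O(1)
            }
            table[size++] = element;
        }
    }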
Binary Heap
• A heap can be stored in an array
  – Fixed maximum size
  – Assumes you only remove elements
  [Figure: max-heap with 16 at level 0; 10 and 14 at level 1; 02, 03, 05 at level 2; stored in the array 16 10 14 02 03 05.]

    Structure     Insert     Remove Max   Find
    Binary Heap   O(log n)   O(log n)     --
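A minimal sketch of an array-backed max-heap under the slide's assumptions (fixed capacity, only insert and remove max); this is my own illustration, not the book's implementation:

    // Fixed-capacity max-heap stored in an array, as on the slide.
    // Parent of index i is (i - 1) / 2; children are 2*i + 1 and 2*i + 2.
    public class MaxHeap {
        private final int[] heap;
        private int size = 0;

        public MaxHeap(int capacity) { heap = new int[capacity]; }

        public void insert(int value) {                      // O(log n); assumes capacity not exceeded
            int i = size++;
            heap[i] = value;
            while (i > 0 && heap[(i - 1) / 2] < heap[i]) {   // sift up while larger than parent
                swap(i, (i - 1) / 2);
                i = (i - 1) / 2;
            }
        }

        public int removeMax() {                             // O(log n); assumes heap is non-empty
            int max = heap[0];
            heap[0] = heap[--size];
            int i = 0;
            while (true) {                                   // sift down
                int largest = i, l = 2 * i + 1, r = 2 * i + 2;
                if (l < size && heap[l] > heap[largest]) largest = l;
                if (r < size && heap[r] > heap[largest]) largest = r;
                if (largest == i) break;
                swap(i, largest);
                i = largest;
            }
            return max;
        }

        private void swap(int a, int b) { int t = heap[a]; heap[a] = heap[b]; heap[b] = t; }
    }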
Priority Queue
• Most implementations provide only
  – insert(element, priority)
  – getMinimum()
• If you only need these two operations, a Binary Heap can be used
• Often you need one more method
  – decreaseKey(element, newPriority)
  – If you need this one as well, you must adjust the data structure

    Structure        Insert     Remove Max   Contains   DecreaseKey
    Priority Queue   O(log n)   O(log n)     O(log n)   O(log n)
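One hedged sketch of why decreaseKey forces you to adjust the data structure: java.util.PriorityQueue has no decreaseKey, so a common workaround is remove-and-reinsert with an external priority map, but PriorityQueue.remove(Object) scans the heap in O(n). Specialized structures (indexed heaps, Fibonacci heaps) exist precisely to avoid that. The class name and sample priorities below are hypothetical:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.PriorityQueue;

    public class DecreaseKeyDemo {
        public static void main(String[] args) {
            Map<String, Integer> priority = new HashMap<>();
            PriorityQueue<String> pq =
                new PriorityQueue<>((a, b) -> Integer.compare(priority.get(a), priority.get(b)));

            priority.put("a", 5); pq.offer("a");
            priority.put("b", 9); pq.offer("b");

            // decreaseKey("b", 1): remove the element (O(n)), lower its priority, re-add (O(log n)).
            pq.remove("b");
            priority.put("b", 1);
            pq.offer("b");

            System.out.println(pq.poll());   // b, now the minimum
        }
    }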
Balanced Binary Tree
• Ideal dynamic data structure
  – No need to know the maximum size in advance
  – Red/Black Tree implementations are quite common
• Avoids worst-case behavior
  – An unbalanced tree might degenerate to O(n) for all operations

    Structure              Insert     Delete     Find
    Tree                   O(log n)   O(log n)   O(log n)
    Balanced Binary Tree   O(log n)   O(log n)   O(log n)
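A short illustration (mine, not the book's) using java.util.TreeMap, which is implemented as a red/black tree, so the balanced-tree bounds above hold even when keys arrive in sorted order:

    import java.util.TreeMap;

    public class BalancedTreeDemo {
        public static void main(String[] args) {
            TreeMap<Integer, String> tree = new TreeMap<>();
            // Inserting keys in sorted order would degenerate a naive binary search
            // tree to O(n) per operation; the red/black tree stays balanced.
            for (int k = 1; k <= 1000; k++) {
                tree.put(k, "value-" + k);              // O(log n) insert
            }
            System.out.println(tree.get(500));          // O(log n) find
            tree.remove(500);                           // O(log n) delete
            System.out.println(tree.containsKey(500));  // false
        }
    }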
Implementation Tradeoff
• Algorithm designers have developed innovative data structures
  – Fibonacci heaps
  – Skip lists
  – Splay trees
• The theoretical improvement is offset by more complicated implementations
  – Also, the improvement is "amortized" over the life of use
  – Some individual operations may be worse than expected
Divide and Conquer
• Intuition for why it works so well
  – Looking for a word in a dictionary
  – Each iteration discards half of the remaining words during the search
  [Figure: the number of words to search halves at each step, from 1,048,576 through 524,288; 262,144; ...; 2; 1.]
• Number of iterations
  – log₂ n, written simply as log n throughout the book
  – The O(log n) family
• Clearly much better than a linear scan of n elements
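A minimal binary-search sketch (my own illustration, not the book's code) of the dictionary intuition: each probe discards half of the remaining words, so roughly one million sorted entries need at most about 20 probes:

    public class BinarySearchDemo {
        static int search(String[] sortedWords, String target) {
            int low = 0, high = sortedWords.length - 1;
            while (low <= high) {
                int mid = (low + high) >>> 1;            // midpoint without overflow
                int cmp = sortedWords[mid].compareTo(target);
                if (cmp == 0) return mid;                // found
                if (cmp < 0) low = mid + 1;              // discard lower half
                else high = mid - 1;                     // discard upper half
            }
            return -1;                                   // not present
        }

        public static void main(String[] args) {
            String[] words = { "apple", "banana", "cherry", "date", "fig" };
            System.out.println(search(words, "cherry"));  // 2
        }
    }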
Divide and Conquer
• Also applies to composed problems
  – QUICKSORT
  [Figure: partitioning 15 elements (j t u d m o p e w a h r b c x) splits the work into subproblems of size 6 and size 8, so SortTime(15) = TimePartition(15) + SortTime(6) + SortTime(8).]
• In general:

    T(n) = O(n) + 2·T(n/2)
    T(n) = 2·O(n) + 4·T(n/4)
    T(n) = 3·O(n) + 8·T(n/8)
    ...
    T(n) = k·O(n) + 2^k·T(n/2^k)

  This continues for k = log n levels, so
    T(n) = log n·O(n) + O(n) = O(n log n)
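A compact quicksort sketch (assumed implementation, not the book's code) showing the structure behind the recurrence: O(n) partitioning at each level, then recursion on the two sides:

    import java.util.Arrays;

    public class QuickSortDemo {
        static void quickSort(char[] a, int low, int high) {
            if (low >= high) return;
            int p = partition(a, low, high);   // O(n) work at this level
            quickSort(a, low, p - 1);          // sort the left subproblem
            quickSort(a, p + 1, high);         // sort the right subproblem
        }

        // Lomuto partition around the last element as the pivot.
        static int partition(char[] a, int low, int high) {
            char pivot = a[high];
            int i = low;
            for (int j = low; j < high; j++) {
                if (a[j] < pivot) { char t = a[i]; a[i] = a[j]; a[j] = t; i++; }
            }
            char t = a[i]; a[i] = a[high]; a[high] = t;
            return i;
        }

        public static void main(String[] args) {
            char[] letters = "jtudmopewahrbcx".toCharArray();
            quickSort(letters, 0, letters.length - 1);
            System.out.println(Arrays.toString(letters));
        }
    }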
Greedy Algorithm
• Goal is to solve a problem of size n
  – Single-Source Shortest Path from s to all vertices v_i
  – DIJKSTRA'S Algorithm
• Make the locally optimal decision at each stage
  – Apply repeatedly until the result yields a globally optimal solution
  [Figure: Dijkstra's algorithm on a five-vertex example; dist starts at (0, ∞, ∞, ∞, ∞) and, as vertices are marked visited, evolves through (0, 2, ∞, ∞, 4), (0, 2, 5, ∞, 4), and (0, 2, 5, 11, 4) to the final (0, 2, 5, 10, 4).]
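A minimal Dijkstra sketch (my own illustration, not the book's code): it greedily settles the unvisited vertex with the smallest tentative distance, then relaxes its outgoing edges; edge weights are assumed non-negative. The adjacency matrix is hypothetical, loosely modeled on the slide's five-vertex example:

    import java.util.Arrays;
    import java.util.PriorityQueue;

    public class DijkstraDemo {
        // weight[u][v] == 0 means there is no edge from u to v.
        static int[] shortestPaths(int[][] weight, int source) {
            int n = weight.length;
            int[] dist = new int[n];
            boolean[] visited = new boolean[n];
            Arrays.fill(dist, Integer.MAX_VALUE);
            dist[source] = 0;

            PriorityQueue<int[]> pq =
                new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
            pq.offer(new int[] { source, 0 });
            while (!pq.isEmpty()) {
                int u = pq.poll()[0];
                if (visited[u]) continue;             // skip stale queue entries
                visited[u] = true;                    // greedy choice: u is now settled
                for (int v = 0; v < n; v++) {
                    if (weight[u][v] > 0 && !visited[v]
                            && dist[u] + weight[u][v] < dist[v]) {
                        dist[v] = dist[u] + weight[u][v];   // relax edge (u, v)
                        pq.offer(new int[] { v, dist[v] });
                    }
                }
            }
            return dist;
        }

        public static void main(String[] args) {
            int[][] w = {
                { 0, 2, 0, 0, 4 },
                { 0, 0, 3, 0, 0 },
                { 0, 0, 0, 5, 1 },
                { 8, 0, 0, 0, 0 },
                { 0, 0, 0, 7, 0 },
            };
            System.out.println(Arrays.toString(shortestPaths(w, 0)));  // [0, 2, 5, 10, 4]
        }
    }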
Dynamic Programming
• Goal is to solve a problem of size n
  – All-Pairs Shortest Path between every pair of vertices (v_i, v_j)
  – FLOYD-WARSHALL Algorithm
• Solve the most constrained problems first
  – Relax the constraints systematically until done
  [Figure: dist[u][v] is computed in stages; each stage allows one more vertex (first 0, then 0 and 1, then 0, 1 and 2, ...) to appear as an intermediate vertex on a shortest path.]

  Shortest distances considering just the initial edges:

         0   1   2   3   4
    0    0   2   ∞   ∞   4
    1    ∞   0   3   ∞   ∞
    2    ∞   ∞   0   5   1
    3    8   ∞   ∞   0   ∞
    4    ∞   ∞   ∞   7   0

  Final result:

         0   1   2   3   4
    0    0   2   5  10   4
    1   16   0   3   8   4
    2   13  15   0   5   1
    3    8  10  13   0  12
    4   15  17  20   7   0
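A minimal Floyd-Warshall sketch (my own illustration, not the book's code): dist[u][v] starts as the direct edge weight and is progressively relaxed as each vertex k is allowed as an intermediate, mirroring the stages above:

    public class FloydWarshallDemo {
        static final int INF = Integer.MAX_VALUE / 2;   // avoids overflow when adding

        static int[][] allPairsShortestPaths(int[][] dist) {
            int n = dist.length;
            for (int k = 0; k < n; k++) {               // allow vertex k as an intermediate
                for (int u = 0; u < n; u++) {
                    for (int v = 0; v < n; v++) {
                        if (dist[u][k] + dist[k][v] < dist[u][v]) {
                            dist[u][v] = dist[u][k] + dist[k][v];
                        }
                    }
                }
            }
            return dist;
        }

        public static void main(String[] args) {
            // Initial edge weights from the slide's example (INF = no direct edge).
            int[][] dist = {
                { 0,   2,   INF, INF, 4   },
                { INF, 0,   3,   INF, INF },
                { INF, INF, 0,   5,   1   },
                { 8,   INF, INF, 0,   INF },
                { INF, INF, INF, 7,   0   },
            };
            allPairsShortestPaths(dist);
            System.out.println(dist[0][3]);   // 10, matching the final matrix on the slide
        }
    }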
Summary
• Various data structures investigated
• Various approaches described
  – Divide and conquer
  – Greedy algorithm
  – Dynamic programming