
CSE373: Data Structures and Algorithms - Asymptotic Analysis (10/5/2016)



CSE373: Data Structures and Algorithms
Asymptotic Analysis (Big O, Ω, and Θ)
Steve Tanimoto, Autumn 2016
This lecture material represents the work of multiple instructors at the University of Washington. Thank you to all who have contributed!

Big-O: Common Names
• O(1)        constant (same as O(k) for constant k)
• O(log n)    logarithmic
• O(n)        linear
• O(n log n)  "n log n"
• O(n²)       quadratic
• O(n³)       cubic
• O(n^k)      polynomial (where k is any constant)
• O(k^n)      exponential (where k is any constant > 1)
• O(n!)       factorial

Comparing Function Growth (e.g., for Running Times)
• For a processor capable of one million instructions per second (the sketch after the "Gauging Performance" slide below works this out for each growth rate)

Efficiency
• What does it mean for an algorithm to be efficient?
  – We care about time (and sometimes space)
• Is the following a good definition?
  – "An algorithm is efficient if, when implemented, it runs quickly on real input instances"

Gauging Performance
• Uh, why not just run the program and time it?
  – Too much variability, not reliable or portable:
    • Hardware: processor(s), memory, etc.
    • Software: OS, Java version, libraries, drivers
    • Other programs running
    • Implementation dependent
  – Choice of input
    • Testing (inexhaustive) may miss worst-case input
    • Timing does not explain relative timing among inputs (what happens when n doubles in size)
• Often want to evaluate an algorithm, not an implementation
  – Even before creating the implementation ("coding it up")
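To make the "Comparing Function Growth" slide concrete, here is a small illustrative sketch. It is not code from the lecture: the class name, the chosen input sizes, and the table layout are assumptions; only the one-million-instructions-per-second rate comes from the slide's framing. It tabulates how long n, n log n, n², and 2ⁿ instructions would take at that rate.

    // Illustrative sketch (not from the lecture): how long f(n) instructions take
    // on a processor that executes one million instructions per second.
    public class GrowthDemo {
        public static void main(String[] args) {
            final double INSTRUCTIONS_PER_SECOND = 1_000_000.0; // rate taken from the slide's framing
            int[] sizes = {10, 20, 50, 100, 1_000, 1_000_000};  // arbitrary example sizes
            System.out.printf("%10s %16s %16s %16s %16s%n",
                              "n", "n (sec)", "n log n (sec)", "n^2 (sec)", "2^n (sec)");
            for (int n : sizes) {
                double linear      = n / INSTRUCTIONS_PER_SECOND;
                double nLogN       = n * (Math.log(n) / Math.log(2)) / INSTRUCTIONS_PER_SECOND;
                double quadratic   = (double) n * n / INSTRUCTIONS_PER_SECOND;
                double exponential = Math.pow(2, n) / INSTRUCTIONS_PER_SECOND; // becomes Infinity very quickly
                System.out.printf("%10d %16.6f %16.6f %16.3e %16.3e%n",
                                  n, linear, nLogN, quadratic, exponential);
            }
        }
    }

Even for modest n the quadratic and exponential columns dominate, which is the point of the common-names list above.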

Comparing Algorithms
• When is one algorithm (not implementation) better than another?
  – Various possible answers (clarity, security, ...)
  – But a big one is performance: for sufficiently large inputs, runs in less time (our focus) or less space
• Large inputs, because probably any algorithm is "plenty good" for small inputs (if n is 5, probably anything is fast)
• Answer will be independent of CPU speed, programming language, coding tricks, etc.
• Answer is general and rigorous, complementary to "coding it up and timing it on some test cases"

Analyzing Code ("worst case")
Basic operations take "some amount of" constant time:
  – Arithmetic (fixed-width)
  – Assignment
  – Access one Java field or array index
  – Etc.
(This is an approximation of reality but practical.)

  Control flow               Time required
  Consecutive statements     sum of times of the statements
  Conditionals               time of the test plus the slower branch
  Loops                      number of iterations × time of the body
  Calls                      time of the called function's body
  Recursion                  solve a recurrence equation

Analyzing Code
1. Add up time for all parts of the algorithm (e.g. number of iterations = (n² + n)/2)
2. Eliminate low-order terms (i.e. eliminate n: n²/2)
3. Eliminate coefficients (i.e. eliminate 1/2: n²)
Examples:
  – 4n + 5 → O(n)
  – 0.5 n log n + 2n + 7 → O(n log n)
  – n³ + 2ⁿ + 3n → O(2ⁿ)  EXPONENTIAL GROWTH!
  – n log(10n²) + 2n log(n) → O(n log n)

Example
2 3 5 16 37 50 73 75 126
Find an integer in a sorted array

  // requires array is sorted
  // returns whether k is in array
  boolean find(int[] arr, int k){
    ???
  }

Linear Search
2 3 5 16 37 50 73 75 126
Find an integer in a sorted array

  // requires array is sorted
  // returns whether k is in array
  boolean find(int[] arr, int k){
    for(int i=0; i < arr.length; ++i)
      if(arr[i] == k)
        return true;
    return false;
  }

Best case: about 6 steps = O(1)
Worst case: about 6 * arr.length, i.e. O(arr.length)

Binary Search
2 3 5 16 37 50 73 75 126
Find an integer in a sorted array
  – Can also be done non-recursively (see the iterative sketch below)

  // requires array is sorted
  // returns whether k is in array
  boolean find(int[] arr, int k){
    return help(arr,k,0,arr.length);
  }
  boolean help(int[] arr, int k, int lo, int hi) {
    int mid = (hi+lo)/2; // i.e., lo+(hi-lo)/2
    if(lo==hi)      return false;
    if(arr[mid]==k) return true;
    if(arr[mid]< k) return help(arr,k,mid+1,hi);
    else            return help(arr,k,lo,mid);
  }
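The Binary Search slide notes the search "can also be done non-recursively." One way to do that, shown as a sketch rather than code from the lecture (the name findIterative is made up here), keeps the same half-open range [lo, hi) as the recursive help method:

  // Possible iterative binary search (a sketch, not from the slides).
  // Same half-open range [lo, hi) convention as the recursive help above.
  boolean findIterative(int[] arr, int k) {
    int lo = 0, hi = arr.length;
    while (lo < hi) {
      int mid = lo + (hi - lo) / 2;       // same midpoint as (hi+lo)/2, written to avoid int overflow
      if (arr[mid] == k)      return true;
      else if (arr[mid] < k)  lo = mid + 1;   // k, if present, is in the upper half
      else                    hi = mid;       // k, if present, is in the lower half
    }
    return false;                         // range is empty, so k is not in the array
  }

Each iteration halves hi - lo, so the worst case is still O(log n); the recurrence behind that bound is solved on the next slide.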

Binary Search
• Best case: about 8 steps: O(1)
• Worst case: T(n) = 10 + T(n/2), where n is hi - lo
• Solve the recurrence equation to know that...
  – T(n) ∈ O(log n), where n is array.length

  // requires array is sorted
  // returns whether k is in array
  boolean find(int[] arr, int k){
    return help(arr,k,0,arr.length);
  }
  boolean help(int[] arr, int k, int lo, int hi) {
    int mid = (hi+lo)/2;
    if(lo==hi)      return false;
    if(arr[mid]==k) return true;
    if(arr[mid]< k) return help(arr,k,mid+1,hi);
    else            return help(arr,k,lo,mid);
  }

Solving Recurrence Relations
1. Determine the recurrence relation. What is the base case?
   – T(n) = 10 + T(n/2), T(1) = 10
2. "Expand" the original relation to find an equivalent general expression in terms of the number of expansions.
   – T(n) = 10 + 10 + T(n/4)
          = 10 + 10 + 10 + T(n/8)
          = ...
          = 10k + T(n/(2^k))
3. Find a closed-form expression by setting the number of expansions to a value which reduces the problem to a base case.
   – n/(2^k) = 1 implies n = 2^k implies k = log₂ n
   – So T(n) = 10 log₂ n + 10 (get to the base case and do it)
   – So T(n) is in O(log n)

Ignoring Constant Factors
• So binary search's runtime is in O(log n) and linear search's is in O(n)
  – But which is faster?
• Could depend on constant factors
  – How many assignments, additions, etc. for each n
    • E.g. T(n) = 5,000,000n vs. T(n) = 5n²
  – And could depend on the size of n
    • E.g. T(n) = 5,000,000 + log n vs. T(n) = 10 + n
• But there exists some n₀ such that for all n > n₀ binary search wins
• Let's play with a couple plots to get some intuition...

Example: Let's try to "help" linear search
• Run it on a computer 100x as fast (say 2015 model vs. 1990)
• Use a new compiler/language that is 3x as fast
• Be a clever programmer to eliminate half the work
• So doing each iteration is 600x as fast as in binary search
• Note: 600x still helpful for problems without logarithmic algorithms!
[Plot: "Runtime for (1/600)n vs. log(n) with Various Input Sizes"; x-axis: Input Size (N), 100 to 2500; series: log(n) and sped-up linear]

[Plot: "Runtime for (1/600)n vs. log(n) with Various Input Sizes" over a larger range of input sizes; series: log(n) and sped-up linear]

Another Example: sum array
Two "obviously" linear algorithms: T(n) = c + T(n-1)

Iterative:
  int sum(int[] arr){
    int ans = 0;
    for(int i=0; i<arr.length; ++i)
      ans += arr[i];
    return ans;
  }

Recursive:
  – Recurrence is k + k + ... + k for n times
  int sum(int[] arr){
    return help(arr,0);
  }
  int help(int[] arr, int i) {
    if(i==arr.length)
      return 0;
    return arr[i] + help(arr,i+1);
  }
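The recursive sum's recurrence T(n) = c + T(n-1) can be expanded with the same three steps used for binary search above. This worked expansion is not on the slides and assumes the base case costs one more constant c:

  \begin{align*}
  T(n) &= c + T(n-1) \\
       &= c + c + T(n-2) \\
       &= \dots \\
       &= kc + T(n-k) && \text{after } k \text{ expansions} \\
       &= nc + T(0) = (n+1)c && \text{setting } k = n \text{ reaches the base case}
  \end{align*}

Eliminating the low-order term and the coefficient gives T(n) in O(n), i.e. "k + k + ... + k for n times," matching the iterative version.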

What About a Recursive Version?

  int sum(int[] arr){
    return help(arr,0,arr.length);
  }
  int help(int[] arr, int lo, int hi) {
    if(lo==hi)   return 0;
    if(lo==hi-1) return arr[lo];
    int mid = (hi+lo)/2;
    return help(arr,lo,mid) + help(arr,mid,hi);
  }

Recurrence is T(n) = 1 + 2T(n/2); T(1) = 1
  – 1 + 2 + 4 + 8 + ... for log n times: 2^(log n) - 1, which is proportional to n (definition of logarithm)
Easier explanation: it adds each number once while doing little else
"Obvious": we can't do better than O(n); we have to read the whole array

Parallelism Teaser
• But suppose we could do two recursive calls at the same time
  – Like having a friend do half the work for you!

  int sum(int[] arr){
    return help(arr,0,arr.length);
  }
  int help(int[] arr, int lo, int hi) {
    if(lo==hi)   return 0;
    if(lo==hi-1) return arr[lo];
    int mid = (hi+lo)/2;
    return help(arr,lo,mid) + help(arr,mid,hi);
  }

• If you have as many "friends of friends" as needed, the recurrence is now T(n) = c + 1T(n/2)
  – O(log n): the same recurrence as for find

Common Recurrences
Should know how to solve recurrences, but also recognize some really common ones:

  T(n) = c + T(n-1)      linear
  T(n) = c + 2T(n/2)     linear
  T(n) = c + T(n/2)      logarithmic: O(log n)
  T(n) = c + 2T(n-1)     exponential
  T(n) = c·n + T(n-1)    quadratic
  T(n) = c·n + T(n/2)    linear
  T(n) = c·n + 2T(n/2)   O(n log n)

Asymptotic Notation
About to show the formal definition, which amounts to saying:
1. Eliminate low-order terms
2. Eliminate coefficients
Examples:
  – 4n + 5
  – 0.5 n log n + 2n + 7
  – n³ + 2ⁿ + 3n
  – n log(10n²)
• Note: big-O can also use more than one variable
  – Example: can sum all elements of an n-by-m matrix in O(nm)

Big-O Relates Functions
We use O on a function f(n) (for example n²) to mean the set of functions with asymptotic behavior "less than or equal to" f(n)
For example, (3n² + 17) is in O(n²)
Confusingly, some people also say/write:
  – (3n² + 17) is O(n²)
  – (3n² + 17) = O(n²)
But we should never say O(n²) = (3n² + 17)

Big-O, Formally
Definition: f(n) is in O(g(n)) if there exist constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀
• To show f(n) is in O(g(n)), pick a c large enough to "cover the constant factors" and an n₀ large enough to "cover the lower-order terms"
  – Example: Let f(n) = 3n² + 17 and g(n) = n²; c = 5 and n₀ = 10 will work (see the check below)
• This is "less than or equal to"
  – So 3n² + 17 is also in O(n⁵) and in O(2ⁿ), etc.
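A quick check of the claim that c = 5 and n₀ = 10 work for f(n) = 3n² + 17 (a worked verification, not on the slide): whenever n ≥ 10 we have 17 ≤ 2n², so

  \[
    f(n) = 3n^2 + 17 \le 3n^2 + 2n^2 = 5n^2 = c \cdot g(n) \quad \text{for all } n \ge 10,
  \]

which is exactly the inequality the definition requires. In fact any n₀ ≥ 3 already works, since 2·3² = 18 ≥ 17; the definition only asks that some pair (c, n₀) exists.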
