Analysis of Algorithms
Chapter 2
CPTR 318

Algorithm
An algorithm is a clearly specified set of instructions a computer follows to solve a problem.
The number of instructions is finite.
Each instruction must be executable in a finite amount of time.
Each instruction must be unambiguous.

Algorithm Analysis: Technique #1
Performance could be analyzed by using actual time requirements.
This method depends on the particular compiler as well as the specific computer on which the program is run.

Algorithm Analysis: Technique #2
Performance could be analyzed by using actual space requirements: instruction and data space.

Algorithm Analysis: Technique #3
One way to analyze algorithms is to count all the instructions or steps in the algorithm.
Generally we discuss the algorithm's efficiency as a function of the number of elements to be processed.
The general format that we will use is f(n) = efficiency.

Counting Steps
If the algorithm does not have loops that depend on the number of elements to be processed, then the number of steps is a constant:
f(n) = c, where c is a constant.
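As an illustration (a sketch added here, not part of the original slides; the function name firstPlusLast and its parameters are made up), the function below performs the same fixed number of steps no matter how large the array is, so f(n) = c:

    // Hypothetical example: no loop depends on n (assumes n >= 1),
    // so the step count is a constant, f(n) = c.
    int firstPlusLast(const int a[], int n) {
        int first = a[0];       // 1 step
        int last  = a[n - 1];   // 1 step
        return first + last;    // 1 step
    }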
Counting Steps: Example #1

    Instructions           Total Steps
    int f(int x)           1
    {
        int c, result;     1
        c = x + 5;         1
        if (c > 10)        1
            result = c;    0 or 1
        else
            result = x;    1 or 0
        return result;     1
    }

    f(n) = 6

Counting Steps
If the algorithm has only sequential instructions and simple counting loops, and at least one loop depends on the number of elements to be processed, then
f(n) = an + b, where a and b are constants.
Example: Sequential search

Counting Steps: Example #2

    Instructions              Total Steps
    int f(int n)              1
    {
        int i = 1,            1
            s = 0;            1
        while ( i <= n ) {    n + 1
            s += i;           n
            i++;              n
        }
        return s;             1
    }

    f(n) = 3n + 5

Counting Steps
If the algorithm contains, in addition to the previous slide, a nested counting loop where both loops depend on the number of elements to be processed, then
f(n) = an² + bn + c.
Example: Selection sort
In general a polynomial efficiency depends on the number of nested loops present:
f(n) = aₘnᵐ + aₘ₋₁nᵐ⁻¹ + aₘ₋₂nᵐ⁻² + … + a₁n + a₀

Counting Steps
• Logarithmic loops
  – These are algorithms whose efficiency contains the log function.
  – Example: Binary search

        while ( n > 0 ) {
            n = n / 2;
        }

  – f(n) = a log₂ n + c

Best, Worst, Average
When counting the steps for the efficiency function we sometimes have to consider the best, worst, and average cases.
Example: Sequential search
Application code …
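As a sketch of that example (added here, not part of the original slides; the function name sequentialSearch and its parameters are made up), a sequential search makes one pass over the array, and its comparison count is what drives the best, worst, and average cases:

    // Hypothetical sequential search: returns the index of key in a[0..n-1],
    // or -1 if it is absent.
    int sequentialSearch(const int a[], int n, int key) {
        for (int i = 0; i < n; ++i) {   // loop test runs at most n + 1 times
            if (a[i] == key)            // one comparison per element examined
                return i;               // best case: key at a[0], 1 comparison
        }
        return -1;                      // worst case: key absent, n comparisons
    }

Counting the loop test and the comparison gives f(n) = an + b in the worst case; a successful search examines roughly n/2 elements on average.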
Growth Rates

    n         f(n) = n²        f(n) = n² + 4n + 20
    10        100              160
    100       10,000           10,420
    10,000    100,000,000      100,040,020

Algorithm Analysis: Technique #4
Big-O notation gives a general order of magnitude to compare algorithms.
It captures the most dominant term in a function.
It gives us an upper bound for comparing algorithms.
It classifies algorithms as belonging to a family of algorithms.

Big-O Definition
f(n) = O(g(n)) iff positive constants c and n₀ exist such that:
f(n) ≤ c g(n) for all n ≥ n₀.

Examples
Consider f(n) = 3n + 2.
f(n) = 3n + 2 ≤ 3n + 2n = 5n, for all n ≥ 1.
Therefore f(n) = O(n).

Examples
Example 2: Is 2ⁿ⁺² = O(2ⁿ)?
Example 3: Is 3n + 2 = O(n²)?
(Worked answers are sketched after the next proof.)

Examples
Prove that 10n² + 4n + 2 ≠ O(n).
Suppose 10n² + 4n + 2 = O(n); then there exist a positive c and an n₀ such that
10n² + 4n + 2 ≤ cn, for all n ≥ n₀.
Dividing both sides by n we get 10n + 4 + 2/n ≤ c for all n ≥ n₀.
This is a false statement because as n → ∞, 10n + 4 + 2/n → ∞, which cannot remain less than c.
Therefore 10n² + 4n + 2 ≠ O(n).
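Returning to the two questions posed above, one possible set of worked answers (added here, not part of the original slides) follows directly from the definition:
Example 2: 2ⁿ⁺² = 4 · 2ⁿ, so 2ⁿ⁺² ≤ 4 · 2ⁿ for all n ≥ 1; taking c = 4 and n₀ = 1 satisfies the definition, so 2ⁿ⁺² = O(2ⁿ).
Example 3: 3n + 2 ≤ 3n² + 2n² = 5n² for all n ≥ 1, so 3n + 2 = O(n²); the bound is valid, although O(n) describes the growth more tightly.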
Helpful Theorems
Theorem 1: if f(n) = aₘnᵐ + … + a₁n + a₀ and aₘ > 0, then f(n) = O(nᵐ).
Theorem 2 (Big-O ratio theorem): Let f(n) and g(n) be such that lim n→∞ f(n)/g(n) exists.
Then f(n) = O(g(n)) iff lim n→∞ f(n)/g(n) ≤ c for some finite positive constant c.

Example
Example 1: 3n + 2 = O(n) because as n → ∞, (3n + 2)/n → 3.
Example 2: 3n² + 5 ≠ O(n) because as n → ∞, (3n² + 5)/n → ∞.

Big-Omega Definition
f(n) = Ω(g(n)) iff positive constants c and n₀ exist such that:
c g(n) ≤ f(n) for all n ≥ n₀.

Big-Theta Definition
f(n) = Θ(g(n)) iff positive constants c₁, c₂, and n₀ exist such that:
c₁ g(n) ≤ f(n) ≤ c₂ g(n) for all n ≥ n₀.

Growth Function Graphs
[Figure: graphs of the common growth functions; no further content is recoverable from this slide.]

Example
Example 1: Prove that f(n) = 3n + 2 = Θ(n).
We have already shown that f(n) = O(n). We just need to prove that f(n) is Ω(n), that is, to show that c g(n) ≤ f(n) for all n ≥ n₀.
This is easy because n ≤ 3n + 2 for all n ≥ 0.
Example 2: Prove that 3n + 3 ≠ Θ(n²).
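Example 2 is left open on the slide; one possible argument (added here, not part of the original) uses the Ω half of the Θ definition:
3n + 3 = O(n²), since 3n + 3 ≤ 6n² for all n ≥ 1. However, 3n + 3 ≠ Ω(n²): if cn² ≤ 3n + 3 held for all n ≥ n₀ with some c > 0, then dividing by n would give cn ≤ 3 + 3/n, which fails for all sufficiently large n. Because the Ω bound fails, 3n + 3 ≠ Θ(n²).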
More Helpful Theorems
Theorem: if f(n) = aₘnᵐ + … + a₁n + a₀ and aₘ > 0, then f(n) = Θ(nᵐ).
Theorem (Ratio theorem for Θ): Let f(n) and g(n) be such that lim n→∞ f(n)/g(n) and lim n→∞ g(n)/f(n) exist.
Then f(n) = Θ(g(n)) iff lim n→∞ f(n)/g(n) ≤ c and lim n→∞ g(n)/f(n) ≤ c for some finite positive constant c.

Example
3n + 2 = Θ(n) because as n → ∞, (3n + 2)/n → 3 and n/(3n + 2) → 1/3; both limits are at most the finite constant c = 3, so the ratio theorem for Θ applies.

Meaning of the Various Growth Functions

    Mathematical Expression    Relative Rates of Growth
    f(n) = O(g(n))             f(n) ≤ g(n)
    f(n) = Ω(g(n))             f(n) ≥ g(n)
    f(n) = Θ(g(n))             f(n) = g(n)
    f(n) = o(g(n))             f(n) < g(n)

Little-o Definition
f(n) = o(g(n)) iff f(n) = O(g(n)) and f(n) ≠ Θ(g(n)).

Common Asymptotic Functions
In order of magnitude:
1. 1
2. log n
3. n
4. n log n
5. n²
6. n³
7. 2ⁿ
8. n!
(A short code sketch after the next slide evaluates these functions for a few values of n.)

Graph of Asymptotic Functions
[Figure: plot of the asymptotic functions listed above; no further content is recoverable from this slide.]
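To make the ordering concrete, the small sketch below (added here, not part of the original slides) prints each function for a few values of n; the output shows how quickly 2ⁿ and n! pull away from the polynomial functions.

    #include <cmath>
    #include <cstdio>

    int main() {
        // Hypothetical demo: evaluate each common growth function for a
        // few values of n.
        const int ns[] = {10, 20, 30};
        for (int n : ns) {
            double logn  = std::log2(static_cast<double>(n));
            double two_n = std::pow(2.0, n);        // 2^n
            double fact  = std::tgamma(n + 1.0);    // n! (floating-point approximation)
            std::printf("n=%3d  log n=%5.2f  n log n=%8.2f  n^2=%6d  n^3=%8d  2^n=%10.3g  n!=%10.3g\n",
                        n, logn, n * logn, n * n, n * n * n, two_n, fact);
        }
        return 0;
    }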
Execution on a Computer That Executes 1 Billion Instructions per Second

    n      f(n) = n    f(n) = log₂ n    f(n) = n log₂ n    f(n) = n²    f(n) = 2ⁿ
    10     0.01 μs     0.003 μs         0.033 μs           0.1 μs       1 μs
    50     0.05 μs     0.006 μs         0.282 μs           2.5 μs       13 days
    100    0.10 μs     0.007 μs         0.664 μs           10 μs        4×10¹³ years

Example
Consider f(n) = 6 · 2ⁿ + n².
f(n) = 6 · 2ⁿ + n² ≤ 6 · 2ⁿ + 2ⁿ = 7 · 2ⁿ, for all n ≥ 4.
Therefore f(n) = O(2ⁿ).

Limitations of Big-O Analysis
• Its use is not appropriate for small amounts of input.
  – For small amounts of input, use the simplest algorithm.
• The constant implied by the Big-O may be too large to be practical.
• Average-case analysis is almost always much more difficult to compute than worst-case or best-case analysis.
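As an illustration of the first point (a sketch added here, not part of the original slides; the function name insertionSort is made up), a simple O(n²) insertion sort is often the right choice for very small arrays because its constant factors are tiny, which is why many hybrid sorting routines fall back to something like it below a small cutoff:

    // Hypothetical helper: insertion sort is O(n^2) in the worst case, but
    // for tiny n its low overhead typically beats asymptotically faster
    // sorts (the exact cutoff is implementation dependent).
    void insertionSort(int a[], int n) {
        for (int i = 1; i < n; ++i) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {  // shift larger elements right
                a[j + 1] = a[j];
                --j;
            }
            a[j + 1] = key;                 // insert a[i] into its place
        }
    }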