



GREAT IDEAS IN COMPUTER SCIENCE
CSC 143 Java
Program Efficiency & Analysis of Algorithmic Complexity
Introduction to Complexity Theory

Overview
• Topics
  • Measuring time and space used by algorithms
  • Machine-independent measurements
  • Costs of operations
  • Comparing algorithms
  • Asymptotic complexity – O( ) notation and complexity classes

Comparing Algorithms
• Example: We’ve seen two different list implementations
  • Dynamic expanding array
  • Linked list
• Which is “better”?
• How do we measure?
  • Stopwatch? Why or why not? (a small timing sketch follows below)
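The stopwatch question can be made concrete. Below is a minimal timing sketch (a hypothetical example, not from the slides) that races the two list implementations with System.nanoTime(). It runs, but its numbers change with the machine, JVM warmup, garbage collection, and system load, which is exactly why the rest of these slides develop machine-independent measures instead.

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class StopwatchDemo {
        public static void main(String[] args) {
            // Hypothetical stopwatch measurement: time 100,000 appends on each
            // list implementation. The results vary from run to run and from
            // machine to machine -- the motivation for machine-independent analysis.
            for (List<Integer> list : List.of(new ArrayList<Integer>(),
                                              new LinkedList<Integer>())) {
                long start = System.nanoTime();
                for (int i = 0; i < 100_000; i++) {
                    list.add(i);
                }
                long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                System.out.println(list.getClass().getSimpleName()
                                   + ": " + elapsedMs + " ms");
            }
        }
    }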

Program Efficiency & Resources
• Goal: Find a way to measure “resource” usage in a way that is independent of particular machines/implementations
• Resources
  • Execution time
  • Execution space
  • Network bandwidth
  • others
• We will focus on execution time
  • Basic techniques/vocabulary apply to other resource measures

Example
• What is the running time of the following method?

    // Return the maximum value of the elements of an array of doubles
    public double max(double[] a) {
        double max = a[0];    // why not max = 0?
        for (int k = 1; k < a.length; k++) {
            if (a[k] > max) {
                max = a[k];
            }
        }
        return max;
    }

• How do we analyze this? (a worked step count follows these slides)

Analysis of Execution Time
1. First: describe the size of the problem in terms of one or more parameters
  • For max, the size of the array makes sense
  • Often the size of a data structure, but can be the magnitude of some numeric parameter, etc.
2. Then: count the number of steps needed as a function of the problem size
  • Need to define what a “step” is
  • First approximation: one simple statement
  • More complex statements will be multiple steps

Cost of Operations: Constant-Time Ops
• Constant-time operations: each takes one abstract time “step”
  • Simple variable declaration/initialization (double x = 0.0;)
  • Assignment of numeric or reference values (var = value;)
  • Arithmetic operations (+, -, *, /, %)
  • Array subscripting (a[index])
  • Simple conditional tests (x < y, p != null)
  • Operator new itself (not including constructor cost)
    • Note: new takes significantly longer than simple arithmetic or assignment, but its cost is independent of the problem we’re trying to analyze
• Note: watch out for things like method calls or constructor invocations that look simple, but are expensive
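Applying the step-counting rules to max gives a first worked example. The annotation below is a sketch; the exact constants depend on what you choose to call one “step”, but any reasonable choice yields a linear function of the array length n.

    // Step counts for max, with n = a.length
    // (a sketch; constants depend on the chosen definition of a "step"):
    public double max(double[] a) {
        double max = a[0];                   // 1 step: subscript + initialization
        for (int k = 1; k < a.length; k++) { // init: 1 step; test: n times; increment: n-1 times
            if (a[k] > max) {                // n-1 comparisons
                max = a[k];                  // 0 (best case) to n-1 (worst case) assignments
            }
        }
        return max;                          // 1 step
    }
    // Worst case: roughly 4n steps; best case: roughly 3n steps.
    // Either way, the count is a linear function of n.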

Cost of Operations: Zero-Time Ops
• Compiler can sometimes pay the whole cost of setting up
  • Nothing left to do at runtime
• Variable declarations without initialization (double[] overdrafts;)
• Variable declarations with compile-time constant initializers (static final int maxButtons = 3;)
• Casts, of reference types at least ((Double) checkBalance)

Sequences of Statements
• The cost of the sequence S1; S2; … Sn; is the sum of the individual costs: cost(S1) + cost(S2) + … + cost(Sn)

Conditional Statements
• The two branches of an if-statement might take different times. What to do?

    if (condition) {
        S1;
    } else {
        S2;
    }

• Hint: depends on analysis goals
  • “Worst case”: the longest it could possibly take, under any circumstances
  • “Average case”: the expected or average number of steps
  • “Best case”: the shortest possible number of steps, under some special circumstance
• Generally, worst case is the most important to analyze

Analyzing Loops
• Basic analysis
  1. Calculate the cost of each iteration
  2. Calculate the number of iterations
  3. Total cost is the product of these
• Caution: sometimes you need to add up the costs differently if the cost of each iteration is not roughly the same (see the sketch below)
• Nested loops
  • Total cost is the number of iterations of the outer loop times the cost of the inner loop
  • Same caution as above
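The loop caution deserves a concrete case. In the sketch below (a hypothetical example, not from the slides), the inner loop’s bound depends on the outer index, so multiplying iterations by the worst inner cost overstates the work; summing the per-iteration costs gives the accurate count.

    // Hypothetical nested loop whose inner cost varies with the outer index.
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j <= i; j++) {  // inner loop runs i + 1 times
            count++;                    // one "step" per inner iteration
        }
    }
    // Summing the per-iteration costs: 1 + 2 + ... + n = n(n + 1)/2 steps,
    // about half of the naive product bound n * n -- still O(n^2), but the
    // sum, not the product, gives the exact count.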

Method Calls
• The cost of calling a function is the
  cost of evaluating the arguments (constant or non-constant)
  + cost of actually calling the function (constant overhead)
  + cost of passing each parameter (normally constant time in Java for both numeric and reference values)
  + cost of executing the function body (constant or non-constant?)

    System.out.print(this.lineNumber);
    System.out.println("Answer is " + Math.sqrt(3.14159));

• Terminology note: “evaluating” and “passing” an argument are two different things!

Exact Complexity Function
• Careful analysis of an algorithm leads to an algebraic formula
• The “exact complexity function” gives the number of steps as a function of the problem size
• What can we do with it?
  • Predict the running time in a particular case (given n, given type of computer)?
  • Predict comparative running times for two different n (on the same type of computer)?
  • ***** Get a general feel for the potential performance of an algorithm
  • ***** Compare the predicted running times of two different algorithms for the same problem (given the same n)

Exercise
• Analyze the running time of printMultTable (a worked analysis follows these slides)
  • Pick the problem size
  • Count the number of steps

    // print row r of a multiplication table
    void printRow(int r) {
        for (int k = 0; k <= r; k++) {
            System.out.print(r * k + " ");
        }
        System.out.println();
    }

    // print triangular multiplication table with n rows
    void printMultTable(int n) {
        for (int k = 0; k <= n; k++) {
            printRow(k);
        }
    }

A Graph is Worth a Bunch of Words
• Graphs are a good tool to illustrate, study, and compare complexity functions
• Fun math review for you
  • How do you graph a function?
  • What are the shapes of some common functions? For example, ones mentioned in these slides or the textbook.
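One way to work the printMultTable exercise (a sketch, counting only the print operations as steps; loop overhead adds a constant factor but does not change the order): printRow(r) performs r + 1 calls to print plus one println, about r + 2 steps. printMultTable(n) calls printRow(k) for k = 0, 1, …, n, so the total is

    sum over k = 0..n of (k + 2)  =  n(n + 1)/2 + 2(n + 1)  =  (n^2 + 5n + 4)/2

The highest-order term is n^2/2, so printMultTable takes time proportional to n^2: quadratic in the number of rows.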

Comparing Algorithms
• Suppose we analyze two algorithms and get these times (numbers of steps):
  • Algorithm 1: 37n + 2n^2 + 120
  • Algorithm 2: 50n + 42
• How do we compare these? What really matters?
• Answer: in the long run, the thing that is most interesting is the cost as the problem size n gets large
  • What are the costs for n = 10, n = 100, n = 1,000, n = 1,000,000?
  • Computers are so fast that how long it takes to solve small problems is rarely of interest

Orders of Growth
• Examples:

    N       log2 N   5N      N log2 N   N^2      2^N
    =================================================
    8       3        40      24         64       256
    16      4        80      64         256      65536
    32      5        160     160        1024     ~10^9
    64      6        320     384        4096     ~10^19
    128     7        640     896        16384    ~10^38
    256     8        1280    2048       65536    ~10^76
    10000   13       50000   ~10^5      10^8     ~10^3010

Asymptotic Complexity
• Asymptotic: the behavior of the complexity function as the problem size gets large
• The only thing that really matters is the higher-order term
  • Can drop low-order terms and constants
• The asymptotic complexity gives us a (partial) way to answer “which algorithm is more efficient”
  • Algorithm 1: 37n + 2n^2 + 120 is proportional to n^2
  • Algorithm 2: 50n + 42 is proportional to n
• Graphs of functions are a handy tool for comparing asymptotic behavior

Big-O Notation
• Definition: if f(n) and g(n) are two complexity functions, we say that
  f(n) = O(g(n))
  (pronounced “f(n) is O(g(n))” or “f(n) is order g(n)”)
  if there is a constant c such that
  f(n) ≤ c • g(n)
  for all sufficiently large n
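As a worked check of the definition against Algorithm 1 (a sketch, not from the slides): take f(n) = 37n + 2n^2 + 120 and g(n) = n^2. For all n ≥ 1, 37n ≤ 37n^2 and 120 ≤ 120n^2, so

    f(n) ≤ 2n^2 + 37n^2 + 120n^2 = 159n^2

Hence c = 159 witnesses f(n) = O(n^2). A smaller constant such as c = 3 also works once n ≥ 40; the definition only asks for some constant that works for all sufficiently large n.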

Exercises
• Prove that 5n + 3 is O(n)
• Prove that 5n^2 + 42n + 17 is O(n^2)

Implications
• The notation f(n) = O(g(n)) is not an equality
• Think of it as shorthand for
  • “f(n) grows at most like g(n)” or
  • “f grows no faster than g” or
  • “f is bounded by g”
• O( ) notation is a worst-case analysis
  • Generally useful in practice
  • Sometimes want an average-case or expected-time analysis if worst-case behavior is not typical (but often harder to analyze)

Complexity Classes
• Several common complexity classes (problem size n)
  • Constant time: O(k) or O(1)
  • Logarithmic time: O(log n) [Base doesn’t matter. Why?]
  • Linear time: O(n)
  • “n log n” time: O(n log n)
  • Quadratic time: O(n^2)
  • Cubic time: O(n^3)
  • …
  • Exponential time: O(k^n)
• O(n^k) is often called polynomial time

Rule of Thumb
• If the algorithm has polynomial time or better: practical
  • Typical pattern: examining all data, a fixed number of times
• If the algorithm has exponential time: impractical
  • Typical pattern: examine all combinations of data (see the sketch below)
• What to do if the algorithm is exponential?
  • Try to find a different algorithm
  • Some problems can be proved not to have a polynomial solution
  • Other problems don’t have known polynomial solutions, despite years of study and effort
  • Sometimes you settle for an approximation:
    • the correct answer most of the time, or
    • an almost-correct answer all of the time
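The “examine all combinations” pattern behind exponential time can be made concrete with brute-force subset sum (a hypothetical sketch, not from the slides). Each of the 2^n bit patterns selects one subset of the input, so the outer loop alone runs 2^n times; with the O(n) inner scan, the total is O(n • 2^n).

    // Hypothetical illustration of exponential time: brute-force subset sum.
    static boolean hasSubsetWithSum(int[] a, int target) {
        int n = a.length;                              // assumes n < 31 so (1 << n) fits in an int
        for (int mask = 0; mask < (1 << n); mask++) {  // one pass per subset: 2^n subsets
            int sum = 0;
            for (int k = 0; k < n; k++) {
                if ((mask & (1 << k)) != 0) {          // element k is in this subset
                    sum += a[k];
                }
            }
            if (sum == target) {
                return true;                           // some subset adds up to target
            }
        }
        return false;                                  // no subset works
    }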
