Algorithm Efficiency & Sorting: Overview


Algorithm Efficiency & Sorting
- Algorithm efficiency
- Big-O notation
- Searching algorithms
- Sorting algorithms

Overview
- Writing programs to solve a problem consists of a large number of decisions:
  - how to represent aspects of the problem for solution
  - which of several approaches to a given solution component to use
- If several algorithms are available for solving a given problem, the developer must choose among them
- If several ADTs can be used to represent a given set of problem data:
  - which ADT should be used?
  - how will the ADT choice affect algorithm choice?

Overview - 2
- If a given ADT (e.g. stack or queue) is attractive as part of a solution:
  - how will the ADT implementation affect the program's correctness and performance?
- Several goals must be balanced by a developer in producing a solution to a problem:
  - correctness, clarity, and efficient use of computer resources to produce the best performance
- How is solution performance best measured?
  - time and space

Overview - 3
- The order of importance is, generally:
  - correctness
  - efficiency
  - clarity
- Clarity of expression is qualitative and somewhat dependent on perception by the reader
  - developer salary costs dominate many software projects
  - time efficiency of understanding code written by others can thus have a significant monetary implication
- Focus of this chapter is execution efficiency
  - mostly run time (sometimes memory space)

Measuring Algorithmic Efficiency
- Analysis of algorithms
  - provides tools for contrasting the efficiency of different methods of solution
- Comparison of algorithms
  - should focus on significant differences in efficiency
  - should not consider reductions in computing costs due to clever coding tricks

Analyzing Algorithmic Cost
- Difficult to compare programs instead of algorithms
  - how are the algorithms coded?
  - what computer should you use?
  - what data should the programs use?

Analyzing Algorithmic Cost - 2
- Do not attempt to accumulate a precise prediction of program execution time, because there are far too many complicating factors:
  - compiler instruction output, variation with specific data sets, target hardware speed

Analyzing Algorithmic Cost - 3
- Provide an approximation, an order-of-magnitude estimate, that permits fair comparison of one algorithm's behavior against that of another

Analyzing Algorithmic Cost - 4
- Various behavior bounds are of interest
  - best case, average case, worst case
- Worst-case analysis
  - a determination of the maximum amount of time that an algorithm requires to solve problems of size n
- Average-case analysis
  - a determination of the average amount of time that an algorithm requires to solve problems of size n
- Best-case analysis
  - a determination of the minimum amount of time that an algorithm requires to solve problems of size n

Analyzing Algorithmic Cost - 5
- Complexity measures can be calculated in terms of
  - T(n): time complexity and S(n): space complexity
- Basic model of computation used
  - sequential computer (one statement at a time)
  - all data require the same amount of storage in memory
  - each datum in memory can be accessed in constant time
  - each basic operation can be executed in constant time
- Note that all of these assumptions are incorrect!
  - but good for this purpose
- Calculations we want are order of magnitude

Example - Linked List Traversal

    Node *cur = head;               // assignment op (cost C1)
    while (cur != NULL) {           // comparison op (cost C2)
        cout << cur->item << endl;  // write op (cost C3)
        cur = cur->next;            // assignment op (cost C1)
    }

- Consider the number of operations for n items
  - T(n) = (n+1)C1 + (n+1)C2 + nC3 = (C1+C2+C3)n + (C1+C2) = K1*n + K2
- This says the algorithm is of linear complexity
  - work done grows linearly with n but also involves constants

Example - Sequential Search

    Seq_Search(A: array, key: integer);
        i = 1;
        while (i <= n and A[i] != key) do
            i = i + 1
        endwhile;
        if (i <= n) then return(i)
        else return(0) endif;
    end Sequential_Search;

- Number of comparisons (under the model assumptions above)
  - best case:    T_B(n) = 1
  - worst case:   T_W(n) = n
  - average case: T_A(n) = (n+1)/2
- In general, what developers worry about most is that this is an O(n) algorithm
  - a more precise analysis is nice but rarely influences the algorithmic decision
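The operation-count formula above can be checked empirically. The sketch below is not from the slides: it builds a small list (the Node layout with item/next fields is an assumption for illustration), counts assignments, comparisons, and writes during the traversal, and compares the total against T(n) with each cost taken as one unit, i.e. T(n) = (n+1) + (n+1) + n = 3n + 2.

    // Sketch (not from the slides): count the basic operations performed by
    // the linked-list traversal and compare with the unit-cost formula 3n + 2.
    #include <iostream>

    struct Node {
        int   item;   // assumed field names, for illustration only
        Node *next;
    };

    int main() {
        const int n = 5;
        Node *head = nullptr;
        for (int i = n; i >= 1; --i)            // build a list of n nodes
            head = new Node{i, head};

        long assigns = 0, compares = 0, writes = 0;

        Node *cur = head;  ++assigns;           // initial assignment
        for (;;) {
            ++compares;                          // one comparison per loop test
            if (cur == nullptr) break;
            std::cout << cur->item << "\n"; ++writes;   // one write per node
            cur = cur->next;  ++assigns;         // one assignment per node
        }

        // assigns = n+1, compares = n+1, writes = n  =>  total = 3n + 2
        std::cout << "counted: " << (assigns + compares + writes)
                  << ", formula 3n+2: " << (3 * n + 2) << "\n";

        while (head) { Node *t = head->next; delete head; head = t; }  // cleanup
        return 0;
    }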

Bounding Functions
- (equations shown as a figure in the original slide; not recoverable from this transcript)

Asymptotic Upper Bound
- (figure; not recoverable from this transcript)

Asymptotic Upper Bound - 2
- (figure; not recoverable from this transcript)

Algorithm Growth Rates
- An algorithm's time requirements can be measured as a function of the problem size
  - number of nodes in a linked list
  - size of an array
  - number of items in a stack
  - number of disks in the Towers of Hanoi problem
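The formulas on the Bounding Functions and Asymptotic Upper Bound slides did not survive extraction. For reference, the standard definition of an asymptotic upper bound (big-O), which these slides build on, is sketched below; the worked example reuses the unit-cost count T(n) = 3n + 2 from the traversal example above.

    f(n) \text{ is } O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1
        \ \text{such that}\ f(n) \le c \cdot g(n)\ \text{for all } n \ge n_0

    \text{Example: } T(n) = 3n + 2 \le 4n \ \text{for all } n \ge 2,
        \ \text{so } T(n) \text{ is } O(n)\ (c = 4,\ n_0 = 2).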

Algorithm Growth Rates - 2
- An algorithm's growth rate enables comparison of one algorithm with another
- Algorithm A requires time proportional to n^2
- Algorithm B requires time proportional to n

Algorithm Growth Rates - 3
- Example
  - if algorithm A requires time proportional to n^2, and algorithm B requires time proportional to n
  - then algorithm B is faster than algorithm A
  - n^2 and n are growth-rate functions
  - Algorithm A is O(n^2) - order n^2
  - Algorithm B is O(n) - order n
- Growth-rate function f(n)
  - a mathematical function used to specify an algorithm's order in terms of the size of the problem

Order-of-Magnitude Analysis and Big O Notation
- Figure 9-3a: A comparison of growth-rate functions (a) in tabular form
- Figure 9-3b: A comparison of growth-rate functions (b) in graphical form
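Figure 9-3 itself is not reproduced in this transcript. As an illustrative substitute (not the textbook's figure), the short program below prints approximate values of the common growth-rate functions for a few problem sizes, which makes the differences in growth plain.

    // Illustrative sketch (not Figure 9-3): print approximate values of
    // common growth-rate functions for a few problem sizes.
    #include <cmath>
    #include <cstdio>

    int main() {
        std::printf("%8s %10s %12s %14s %14s\n",
                    "n", "log2(n)", "n*log2(n)", "n^2", "2^n");
        for (double n : {10.0, 100.0, 1000.0}) {
            std::printf("%8.0f %10.1f %12.0f %14.0f %14.3e\n",
                        n, std::log2(n), n * std::log2(n), n * n,
                        std::pow(2.0, n));
        }
        return 0;
    }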

Order-of-Magnitude Analysis and Big O Notation
- Order of growth of some common functions
  - O(C) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(3^n) < O(n!) < O(n^n)
- Properties of growth-rate functions
  - O(n^3 + 3n) is O(n^3): ignore low-order terms
  - O(5 f(n)) = O(f(n)): ignore the multiplicative constant in the high-order term
  - O(f(n)) + O(g(n)) = O(f(n) + g(n))

Keeping Your Perspective
- Only significant differences in efficiency are interesting
- Frequency of operations
  - how frequently particular ADT operations occur in a given application
  - however, some seldom-used but critical operations must be efficient

Keeping Your Perspective - 2
- If the problem size is always small, you can probably ignore an algorithm's efficiency
  - order-of-magnitude analysis focuses on large problems
- Weigh the trade-offs between an algorithm's time requirements and its memory requirements
- Compare algorithms for both style and efficiency

Sequential Search
- Sequential search
  - look at each item in the data collection in turn
  - stop when the desired item is found, or the end of the data is reached

    int search(const int a[], int number_used, int target) {
        int index = 0;
        bool found = false;
        while ((!found) && (index < number_used)) {
            if (target == a[index])
                found = true;
            else
                index++;
        }
        if (found)
            return index;
        else
            return -1;
    }
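A minimal usage sketch for the sequential search shown above; the search body is the one from the slide, while the test array and target values are illustrative and not part of the original slides.

    // Usage sketch for the slide's sequential search; test data are made up.
    #include <iostream>

    int search(const int a[], int number_used, int target) {
        int index = 0;
        bool found = false;
        while (!found && index < number_used) {
            if (target == a[index])
                found = true;
            else
                index++;
        }
        return found ? index : -1;
    }

    int main() {
        const int data[] = {14, 3, 27, 8, 42};
        std::cout << search(data, 5, 27) << "\n";  // prints 2: found at index 2
        std::cout << search(data, 5, 99) << "\n";  // prints -1: not found
        return 0;
    }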
