CS 270 Algorithms
Oliver Kullmann

Week 2

1 Growth of Functions
2 Divide-and-Conquer
3 Min-Max-Problem; Tutorial
General remarks

First we consider an important tool for the analysis of algorithms: Big-Oh.
Then we introduce an important algorithmic paradigm: Divide-and-Conquer.
We conclude by presenting and analysing two examples.

Reading from CLRS for week 2:
Chapter 2.3
Chapter 3
Growth of Functions

A way to describe behaviour of functions in the limit. We are studying asymptotic efficiency.
Describe growth of functions.
Focus on what's important by abstracting away low-order terms and constant factors.
How we indicate running times of algorithms.
A way to compare "sizes" of functions:
  O corresponds to ≤
  Ω corresponds to ≥
  Θ corresponds to =
We consider only functions f, g : N → R≥0.
O-Notation

O(g(n)) is the set of all functions f(n) for which there are positive constants c and n0 such that
  f(n) ≤ cg(n) for all n ≥ n0.

[Figure: cg(n) lies above f(n) for all n ≥ n0]

g(n) is an asymptotic upper bound for f(n).
If f(n) ∈ O(g(n)), we write f(n) = O(g(n)) (we will precisely explain this soon).
O-Notation Examples

2n^2 = O(n^3), with c = 1 and n0 = 2.

Examples of functions in O(n^2):
  n^2
  n^2 + n
  n^2 + 1000n
  1000n^2 + 1000n
Also:
  n
  n/1000
  n^1.999999
  n^2 / lg lg lg n
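To make the definition concrete, here is a small Python sketch (ours, not from the slides; the helper name holds_up_to is illustrative) that numerically checks the witnesses c = 1 and n0 = 2 for the claim 2n^2 = O(n^3). A finite check like this illustrates the definition but is of course not a proof.

# Illustrative sanity check: f(n) <= c*g(n) for all n with n0 <= n <= limit.
def holds_up_to(f, g, c, n0, limit=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

# Witnesses c = 1, n0 = 2 for 2n^2 = O(n^3):
print(holds_up_to(lambda n: 2 * n * n, lambda n: n ** 3, c=1, n0=2))  # True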
Ω-Notation

Ω(g(n)) is the set of all functions f(n) for which there are positive constants c and n0 such that
  f(n) ≥ cg(n) for all n ≥ n0.

[Figure: f(n) lies above cg(n) for all n ≥ n0]

g(n) is an asymptotic lower bound for f(n).
Ω-Notation Examples

√n = Ω(lg n), with c = 1 and n0 = 16.

Examples of functions in Ω(n^2):
  n^2
  n^2 + n
  n^2 − n
  1000n^2 + 1000n
  1000n^2 − 1000n
Also:
  n^3
  n^2.0000001
  n^2 lg lg lg n
  2^(2^n)
Θ-Notation

Θ(g(n)) is the set of all functions f(n) for which there are positive constants c1, c2 and n0 such that
  c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0.

[Figure: f(n) lies between c1 g(n) and c2 g(n) for all n ≥ n0]

g(n) is an asymptotically tight bound for f(n).
Θ-Notation (cont'd)

Example
  n^2/2 − 2n = Θ(n^2), with c1 = 1/4, c2 = 1/2, and n0 = 8.

Theorem
  f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Leading constants and lower order terms do not matter.
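As an illustrative sketch (ours, not from the slides), the witnesses c1 = 1/4, c2 = 1/2, n0 = 8 for n^2/2 − 2n = Θ(n^2) can be checked numerically over a finite range; by the theorem above, this amounts to checking an O-bound and an Ω-bound at the same time.

# Illustrative sanity check of c1*g(n) <= f(n) <= c2*g(n) over a finite range
# (not a proof): here f(n) = n^2/2 - 2n, g(n) = n^2, c1 = 1/4, c2 = 1/2, n0 = 8.
f = lambda n: n * n / 2 - 2 * n
g = lambda n: n * n
print(all(0.25 * g(n) <= f(n) <= 0.5 * g(n) for n in range(8, 10_000)))  # True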
Asymptotic notation in equations

When on the right-hand side:
Θ(n^2) stands for some anonymous function in the set Θ(n^2).
2n^2 + 3n + 1 = 2n^2 + Θ(n) means 2n^2 + 3n + 1 = 2n^2 + f(n) for some f(n) ∈ Θ(n). In particular, f(n) = 3n + 1.

When on the left-hand side:
No matter how the anonymous functions are chosen on the left-hand side, there is a way to choose the anonymous functions on the right-hand side to make the equation valid.
Interpret 2n^2 + Θ(n) = Θ(n^2) as meaning: for all functions f(n) ∈ Θ(n), there exists a function g(n) ∈ Θ(n^2) such that 2n^2 + f(n) = g(n).
Asymptotic notation chained together

2n^2 + 3n + 1 = 2n^2 + Θ(n) = Θ(n^2)

Interpretation:
First equation: There exists f(n) ∈ Θ(n) such that 2n^2 + 3n + 1 = 2n^2 + f(n).
Second equation: For all g(n) ∈ Θ(n) (such as the f(n) used to make the first equation hold), there exists h(n) ∈ Θ(n^2) such that 2n^2 + g(n) = h(n).

Note: What has been said of "Θ" on this and the previous slide also applies to "O" and "Ω".
Example Analysis

Insertion-Sort(A)
1  for j = 2 to A.length
2      key = A[j]
3      // Insert A[j] into sorted sequence A[1..j−1].
4      i = j − 1
5      while i > 0 and A[i] > key
6          A[i+1] = A[i]
7          i = i − 1
8      A[i+1] = key

The for-loop on line 1 is executed O(n) times; and each statement costs constant time, except for the while-loop on lines 5-7, which costs O(n).
Thus the overall runtime is: O(n) × O(n) = O(n^2).

Note: In fact, as seen last week, the worst-case runtime is Θ(n^2).
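For experimentation, here is a direct Python transcription of the pseudocode above (our sketch; it uses 0-based indexing, so the loop of pseudocode line 1 starts at index 1 of the Python list).

def insertion_sort(A):
    for j in range(1, len(A)):          # pseudocode line 1 (0-based here)
        key = A[j]                      # insert A[j] into the sorted prefix
        i = j - 1
        while i >= 0 and A[i] > key:    # inner loop: O(n) in the worst case
            A[i + 1] = A[i]             # shift larger elements to the right
            i -= 1
        A[i + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]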
Divide-and-Conquer Approach

There are many ways to design algorithms.
For example, insertion sort is incremental: having sorted A[1..j−1], place A[j] correctly, so that A[1..j] is sorted.

Divide-and-Conquer is another common approach:
Divide the problem into a number of subproblems that are smaller instances of the same problem.
Conquer the subproblems by solving them recursively. Base case: if the subproblems are small enough, just solve them by brute force.
Combine the subproblem solutions to give a solution to the original problem.
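As a minimal illustration of the three steps (our sketch; merge sort is the worked divide-and-conquer example of the CLRS Chapter 2.3 reading, not an algorithm analysed on these slides):

def merge_sort(A):
    if len(A) <= 1:                     # base case: solve by brute force
        return A
    mid = len(A) // 2                   # divide: split at the halfway point
    left = merge_sort(A[:mid])          # conquer: recurse on each half
    right = merge_sort(A[mid:])
    return merge(left, right)           # combine: merge the sorted halves

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                # append whatever remains
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]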
Naive Min-Max

Find minimum and maximum of a list A of n > 0 numbers.

Naive-Min-Max(A)
1  least = A[1]
2  for i = 2 to A.length
3      if A[i] < least
4          least = A[i]
5  greatest = A[1]
6  for i = 2 to A.length
7      if A[i] > greatest
8          greatest = A[i]
9  return (least, greatest)

The for-loop on line 2 makes n − 1 comparisons, as does the for-loop on line 6, making a total of 2n − 2 comparisons.
Can we do better? Yes!
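Here is a Python transcription of Naive-Min-Max (our sketch) that also counts element comparisons, so the 2n − 2 count can be checked experimentally.

def naive_min_max(A):
    comparisons = 0
    least = A[0]
    for x in A[1:]:                     # n - 1 comparisons for the minimum
        comparisons += 1
        if x < least:
            least = x
    greatest = A[0]
    for x in A[1:]:                     # n - 1 comparisons for the maximum
        comparisons += 1
        if x > greatest:
            greatest = x
    return (least, greatest), comparisons

print(naive_min_max([3, 1, 4, 1, 5, 9, 2, 6]))  # ((1, 9), 14), and 2*8 - 2 = 14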
Divide-and-Conquer Min-Max

As we are dealing with subproblems, we state each subproblem as computing the minimum and maximum of a subarray A[p..q]. Initially, p = 1 and q = A.length, but these values change as we recurse through subproblems.

To compute the minimum and maximum of A[p..q]:
Divide by splitting into two subarrays A[p..r] and A[r+1..q], where r is the halfway point of A[p..q].
Conquer by recursively computing the minimum and maximum of the two subarrays A[p..r] and A[r+1..q].
Combine by computing the overall minimum as the min of the two recursively computed minima, and similarly for the overall maximum.
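A Python sketch of the algorithm just described (our implementation; the slides give only the outline). It also counts element comparisons: for n a power of 2 this recursion makes 3n/2 − 2 comparisons, fewer than the 2n − 2 of Naive-Min-Max.

def dc_min_max(A, p, q):
    # Return ((min, max), comparisons) of the subarray A[p..q] (inclusive).
    if p == q:                                   # one element: no comparison needed
        return (A[p], A[p]), 0
    if q == p + 1:                               # two elements: one comparison
        return ((A[p], A[q]), 1) if A[p] <= A[q] else ((A[q], A[p]), 1)
    r = (p + q) // 2                             # divide at the halfway point
    (min1, max1), c1 = dc_min_max(A, p, r)       # conquer the left half
    (min2, max2), c2 = dc_min_max(A, r + 1, q)   # conquer the right half
    # combine: one comparison for the minima, one for the maxima
    return (min(min1, min2), max(max1, max2)), c1 + c2 + 2

A = [3, 1, 4, 1, 5, 9, 2, 6]
print(dc_min_max(A, 0, len(A) - 1))              # ((1, 9), 10), and 3*8/2 - 2 = 10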