CSE101: Algorithm Design and Analysis
Russell Impagliazzo, Sanjoy Dasgupta, Ragesh Jaiswal (Thanks for slides: Miles Jones)
Week 06, Lecture 22: Divide and Conquer (Master Theorem)
Master Theorem
• How do you solve a recurrence of the form T(n) = a·T(n/b) + O(n^d)? We will use the master theorem.
Summation Lemma
Consider the summation Σ_{i=0}^{n} r^i. It behaves differently for different values of r.
Summation Lemma
Consider the summation Σ_{i=0}^{n} r^i. It behaves differently for different values of r.
If r < 1 then this sum converges. This means that the sum is bounded above by some constant c. Therefore Σ_{i=0}^{n} r^i < c for all n, so if r < 1 then Σ_{i=0}^{n} r^i ∈ O(1).
Summation Lemma
Consider the summation Σ_{i=0}^{n} r^i. It behaves differently for different values of r.
If r = 1 then this sum is just summing 1 over and over, n + 1 times. Therefore if r = 1 then Σ_{i=0}^{n} r^i = Σ_{i=0}^{n} 1 = n + 1 ∈ O(n).
Summation Lemma
Consider the summation Σ_{i=0}^{n} r^i. It behaves differently for different values of r.
If r > 1 then this sum is exponential with base r: Σ_{i=0}^{n} r^i < c·r^n for all n (taking c = r/(r−1)), so if r > 1 then Σ_{i=0}^{n} r^i ∈ O(r^n).
Summation Lemma
Consider the summation Σ_{i=0}^{n} r^i. It behaves differently for different values of r:
Σ_{i=0}^{n} r^i ∈ O(1) if r < 1, O(n) if r = 1, O(r^n) if r > 1.
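To see the three regimes concretely, here is a minimal Python sketch (the function name partial_sums is just for this illustration):

```python
def partial_sums(r, terms=20):
    """Partial sums 1 + r + r^2 + ... + r^(terms-1), to see how they grow."""
    total, sums = 0.0, []
    for i in range(terms):
        total += r ** i
        sums.append(total)
    return sums

# r = 0.5: the sums approach the constant 2            -> O(1)
# r = 1.0: after i + 1 terms the sum is exactly i + 1  -> O(n)
# r = 2.0: the sums are dominated by the last term     -> O(r^n)
```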
Master Theorem
Master Theorem: If T(n) = a·T(n/b) + O(n^d) for some constants a > 0, b > 1, d ≥ 0, then
T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d.
Master Theorem: Solving the recurrence T(n) = a·T(n/b) + O(n^d)
Recursion tree: size n, 1 subproblem; size n/b, a subproblems; size n/b^2, a^2 subproblems; …; size 1, a^{log_b n} subproblems. Depth: log_b n.
Master Theorem: Solving the recurrence
After k levels, there are a^k subproblems, each of size n/b^k. So, during the k-th level of recursion, the time complexity is a^k · O((n/b^k)^d) = O(a^k (n/b^k)^d) = O(n^d (a/b^d)^k).
Master Theorem: Solving the recurrence
After k levels, there are a^k subproblems, each of size n/b^k. So, during the k-th level, the time complexity is a^k · O((n/b^k)^d) = O(n^d (a/b^d)^k).
After log_b n levels, the subproblem size is reduced to 1, which usually is the size of the base case. So the running time of the entire algorithm is the sum over all levels:
T(n) = O( n^d · Σ_{k=0}^{log_b n} (a/b^d)^k )
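As a sanity check on this unrolling, here is a small sketch (assuming n is a power of b and a base case T(1) = 1) that compares the recurrence to the level-by-level sum:

```python
import math

def T(n, a, b, d):
    # The recurrence itself: T(n) = a*T(n/b) + n^d, with T(1) = 1
    if n == 1:
        return 1
    return a * T(n // b, a, b, d) + n ** d

def level_sum(n, a, b, d):
    # Sum over levels k = 0..log_b(n) of a^k subproblems, each costing (n/b^k)^d
    levels = round(math.log(n, b))
    return sum(a ** k * (n // b ** k) ** d for k in range(levels + 1))

# Example with hypothetical parameters a=3, b=2, d=1, n=8: both print 65.
print(T(8, 3, 2, 1), level_sum(8, 3, 2, 1))
```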
Master Theorem: Proof
T(n) = O( n^d · Σ_{k=0}^{log_b n} (a/b^d)^k )
Case 1: a < b^d. Then a/b^d < 1 and the series converges to a constant, so we have that T(n) = O(n^d).
Master Theorem: Proof
T(n) = O( n^d · Σ_{k=0}^{log_b n} (a/b^d)^k )
Case 2: a = b^d. Then a/b^d = 1, so each term equals 1 and there are about log_b n of them. Then we have that T(n) = O(n^d log_b n) = O(n^d log n).
Master Theorem: Proof
T(n) = O( n^d · Σ_{k=0}^{log_b n} (a/b^d)^k )
Case 3: a > b^d. Then the summation is exponential and grows proportionally to its last term, (a/b^d)^{log_b n}. Since (b^d)^{log_b n} = n^d, we get
T(n) = O( n^d · (a/b^d)^{log_b n} ) = O( n^d · a^{log_b n} / n^d ) = O( a^{log_b n} ) = O( n^{log_b a} ).
Master Theorem
Theorem: If T(n) = a·T(n/b) + O(n^d) for some constants a > 0, b > 1, d ≥ 0, then
T(n) ∈ O(n^d) if a < b^d (top-heavy), O(n^d log n) if a = b^d (steady-state), O(n^{log_b a}) if a > b^d (bottom-heavy).
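The three cases can be packaged as a tiny classifier; this is only an illustrative sketch (the bottom-heavy exponent is printed numerically rather than symbolically):

```python
import math

def master_theorem(a, b, d):
    """Asymptotic class of T(n) = a*T(n/b) + O(n^d), per the three cases above."""
    if a < b ** d:
        return f"O(n^{d})"                       # top-heavy: the root level dominates
    if a == b ** d:
        return f"O(n^{d} log n)"                 # steady-state: every level contributes equally
    return f"O(n^{math.log(a, b):.2f})"          # bottom-heavy: the leaves dominate, exponent log_b(a)

print(master_theorem(4, 2, 1))   # O(n^2.00)
print(master_theorem(3, 2, 1))   # O(n^1.58)
print(master_theorem(2, 2, 1))   # O(n^1 log n)
```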
Master Theorem Applied to Multiply
T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d.
The recursion for the runtime of Multiply is T(n) = 4T(n/2) + cn. So we have that a = 4, b = 2, and d = 1. In this case, a > b^d, so T(n) ∈ O(n^{log_2 4}) = O(n^2). No improvement over the grade-school method.
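For reference, a sketch of a four-multiplication divide-and-conquer Multiply on nonnegative integers, consistent with the recurrence T(n) = 4T(n/2) + cn (the bit-length bookkeeping is simplified and n is assumed to be a power of 2; details may differ from the course's pseudocode):

```python
def multiply(x, y, n):
    # Split each n-bit number into high and low halves and recurse four times.
    if n == 1:
        return x * y
    half = n // 2
    xH, xL = x >> half, x & ((1 << half) - 1)
    yH, yL = y >> half, y & ((1 << half) - 1)
    HH = multiply(xH, yH, half)   # four recursive calls on half-size inputs...
    HL = multiply(xH, yL, half)
    LH = multiply(xL, yH, half)
    LL = multiply(xL, yL, half)
    # ...combined with shifts and additions: x*y = 2^n * HH + 2^(n/2) * (HL + LH) + LL
    return (HH << n) + ((HL + LH) << half) + LL
```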
Master Theorem Applied to MultiplyKS
T(n) ∈ O(n^d) if a < b^d, O(n^d log n) if a = b^d, O(n^{log_b a}) if a > b^d.
The recursion for the runtime of MultiplyKS is T(n) = 3T(n/2) + cn. So we have that a = 3, b = 2, and d = 1. In this case, a > b^d, so T(n) ∈ O(n^{log_2 3}) = O(n^{1.58}). An improvement on the grade-school method!
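And a Karatsuba-style sketch with three recursive calls, consistent with T(n) = 3T(n/2) + cn: the product (xH + xL)(yH + yL) − HH − LL recovers the middle term. (A careful implementation would also account for the possible extra carry bit in xH + xL and switch to built-in multiplication below a cutoff; this is only an illustration.)

```python
def multiply_ks(x, y, n):
    # Same split as before, but only three recursive multiplications.
    if n == 1:
        return x * y
    half = n // 2
    xH, xL = x >> half, x & ((1 << half) - 1)
    yH, yL = y >> half, y & ((1 << half) - 1)
    HH = multiply_ks(xH, yH, half)
    LL = multiply_ks(xL, yL, half)
    # (xH + xL)(yH + yL) - HH - LL == xH*yL + xL*yH
    mid = multiply_ks(xH + xL, yH + yL, half) - HH - LL
    return (HH << n) + (mid << half) + LL
```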
Poll: What is the fastest known integer multiplication time?
• O(n^{log_2 3})
• O(n log n log log n)
• O(n log n · 2^{log* n})
• O(n log n)
• O(n)
Poll: What is the fastest known integer multiplication time?
All of these have been (or may yet be) correct:
• O(n^{log_2 3}) — Karatsuba
• O(n log n log log n) — Schönhage–Strassen, 1971
• O(n log n · 2^{c·log* n}) — Fürer, 2007
• O(n log n) — Harvey and van der Hoeven, 2019
• O(n) — you, tomorrow?
Can we do better than n^{1.58}?
• Could any multiplication algorithm have a faster asymptotic runtime than O(n^{1.58})?
• Any ideas?
Can we do better than n^{1.58}?
• What if instead of splitting the number in half, we split it into thirds?
• x = x_L x_M x_R
• y = y_L y_M y_R
Can we do better than n^{1.58}?
• What if instead of splitting the number in half, we split it into thirds?
• x = 2^{2n/3}·x_L + 2^{n/3}·x_M + x_R
• y = 2^{2n/3}·y_L + 2^{n/3}·y_M + y_R
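Concretely, the three-way split of an n-bit integer might look like this (a sketch assuming 3 divides n; the function name split3 is just for illustration):

```python
def split3(x, n):
    # Split an n-bit nonnegative integer into top, middle, and bottom thirds.
    third = n // 3
    xR = x & ((1 << third) - 1)
    xM = (x >> third) & ((1 << third) - 1)
    xL = x >> (2 * third)
    # Recombining checks the identity x = 2^(2n/3)*xL + 2^(n/3)*xM + xR
    assert x == (xL << (2 * third)) + (xM << third) + xR
    return xL, xM, xR
```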
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f)
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f) = ad·x^4 + (ae + bd)·x^3 + (af + be + cd)·x^2 + (bf + ce)·x + cf
9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f) = ad·x^4 + (ae + bd)·x^3 + (af + be + cd)·x^2 + (bf + ce)·x + cf
9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.
T(n) = 9T(n/3) + O(n)
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f) = ad·x^4 + (ae + bd)·x^3 + (af + be + cd)·x^2 + (bf + ce)·x + cf
T(n) = 9T(n/3) + O(n), with a = 9, b = 3, d = 1 and 9 > 3^1, so T(n) = O(n^{log_3 9}) = O(n^2).
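A direct sketch of the nine-product expansion above, returning coefficients from the x^4 term down to the constant term:

```python
def trinomial_product(p, q):
    # p = (a, b, c) meaning a*x^2 + b*x + c; q = (d, e, f) meaning d*x^2 + e*x + f
    a, b, c = p
    d, e, f = q
    # Nine coefficient products, grouped by the power of x they contribute to.
    return (a * d,                      # x^4
            a * e + b * d,              # x^3
            a * f + b * e + c * d,      # x^2
            b * f + c * e,              # x^1
            c * f)                      # x^0
```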
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f) = ad·x^4 + (ae + bd)·x^3 + (af + be + cd)·x^2 + (bf + ce)·x + cf
• There is a way to reduce from 9 multiplications down to just 5!
• Then the recursion becomes
• T(n) = 5T(n/3) + O(n)
• So by the master theorem…
Multiplying trinomials
• (a·x^2 + b·x + c)(d·x^2 + e·x + f) = ad·x^4 + (ae + bd)·x^3 + (af + be + cd)·x^2 + (bf + ce)·x + cf
• There is a way to reduce from 9 multiplications down to just 5!
• Then the recursion becomes
• T(n) = 5T(n/3) + O(n)
• So by the master theorem, T(n) = O(n^{log_3 5}) ≈ O(n^{1.46})
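The slides don't spell out which five products are used; one standard choice (a Toom-3-style evaluation/interpolation, shown here only as an illustrative sketch) evaluates both trinomials at five points, multiplies pointwise, and interpolates the degree-4 product. All divisions below are exact for integer inputs.

```python
def trinomial_product_5(p, q):
    # p = (a, b, c) meaning a*x^2 + b*x + c; likewise q. The product has degree 4,
    # so 5 point evaluations determine it: x = 0, 1, -1, 2 and "infinity" (leading coeffs).
    ev = lambda t, x: t[0] * x * x + t[1] * x + t[2]
    r0   = ev(p, 0) * ev(q, 0)           # five multiplications in total
    r_1  = ev(p, 1) * ev(q, 1)
    r_m1 = ev(p, -1) * ev(q, -1)
    r_2  = ev(p, 2) * ev(q, 2)
    rinf = p[0] * q[0]
    # Interpolate r(x) = c4*x^4 + c3*x^3 + c2*x^2 + c1*x + c0 from the five values.
    c0, c4 = r0, rinf
    A = r_1 - c0 - c4                    # = c3 + c2 + c1
    B = r_m1 - c0 - c4                   # = -c3 + c2 - c1
    c2 = (A + B) // 2
    S = (A - B) // 2                     # = c3 + c1
    C = r_2 - c0 - 16 * c4 - 4 * c2      # = 8*c3 + 2*c1
    c3 = (C - 2 * S) // 6
    c1 = S - c3
    return (c4, c3, c2, c1, c0)

# e.g. trinomial_product_5((1, 0, 1), (1, 0, 1)) == (1, 0, 2, 0, 1), i.e. (x^2 + 1)^2
```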
Dividing into k subproblems
• What happens if we divide into k subproblems, each of size n/k?
• (a_{k−1}·x^{k−1} + a_{k−2}·x^{k−2} + ⋯ + a_1·x + a_0)(b_{k−1}·x^{k−1} + b_{k−2}·x^{k−2} + ⋯ + b_1·x + b_0)
• How many terms are there? (multiplications)
Dividing into k subproblems
• What happens if we divide into k subproblems, each of size n/k?
• (a_{k−1}·x^{k−1} + a_{k−2}·x^{k−2} + ⋯ + a_1·x + a_0)(b_{k−1}·x^{k−1} + b_{k−2}·x^{k−2} + ⋯ + b_1·x + b_0)
• How many terms are there? (multiplications)
• There are k^2 multiplications. The recursion is T(n) = k^2·T(n/k) + O(n), i.e., a = k^2, b = k, d = 1, so T(n) = O(n^{log_k k^2}) = O(n^2).
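Checking that exponent numerically (a quick sketch): log_k(k^2) is 2 for every k, so naive k-way splitting never beats the grade-school bound.

```python
import math

# a = k^2 recursive calls, b = k, d = 1: bottom-heavy case, exponent log_k(k^2) = 2 for every k
for k in (2, 3, 4, 10):
    print(k, round(math.log(k * k, k), 3))   # always 2.0: no better than grade school
```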
Cook–Toom algorithm
• In fact, if you split up your number into k equally sized parts, then you can combine them with 2k − 1 multiplications instead of the k^2 individual multiplications.
• This means that you can get an algorithm that runs in
• T(n) = (2k − 1)·T(n/k) + O(n)
Cook–Toom algorithm
• In fact, if you split up your number into k equally sized parts, then you can combine them with 2k − 1 multiplications instead of the k^2 individual multiplications.
• This means that you can get an algorithm that runs in
• T(n) = (2k − 1)·T(n/k) + O(n)
• T(n) = O(n^{log(2k−1)/log k}) = O(n^{log_k(2k−1)}) time!
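Plugging a few values of k into that exponent (a quick sketch) shows why larger k helps: the exponent creeps toward 1, though the constant factors hidden in the O(n) combine step grow with k.

```python
import math

# Exponent of the Cook-Toom recurrence T(n) = (2k-1)*T(n/k) + O(n)
for k in (2, 3, 4, 5, 10):
    print(k, round(math.log(2 * k - 1, k), 3))
# 2 -> 1.585 (Karatsuba), 3 -> 1.465, 4 -> 1.404, 5 -> 1.365, 10 -> 1.279
```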