Dynamic Programming
CSE 417: Algorithms and Computational Complexity
W. L. Ruzzo, Winter 2009


Dynamic Programming, I: Fibonacci & Stamps

Outline
- General principles
- Easy examples: Fibonacci, licking stamps
- Meatier examples: RNA structure prediction, weighted interval scheduling, maybe others

Some Algorithm Design Techniques, I
- General overall idea: reduce solving a problem to a smaller problem or problems of the same type
- Greedy algorithms: used when one needs to build something a piece at a time
  - Repeatedly make the greedy choice, the one that looks the best right away (e.g., closest pair in TSP search)
  - Usually fast if they work (but often don't)

Some Algorithm Design Techniques, II
- Divide & Conquer: reduce the problem to one or more sub-problems of the same type
  - Typically, each sub-problem is at most a constant fraction of the size of the original problem
  - e.g., Mergesort, Binary Search (sketched below), Strassen's Algorithm, Quicksort (kind of)
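As a small, standard illustration of the divide & conquer pattern just described (not from the slides, which only name the algorithms), here is a binary search sketch in Python; each step reduces the problem to a sub-problem of half the size:

    def binary_search(a, x):
        # Search sorted list a for x; each iteration halves the remaining range.
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == x:
                return mid
            elif a[mid] < x:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1  # not found

    print(binary_search([1, 3, 5, 8, 13, 21], 8))  # 3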

Some Algorithm Design Techniques, III
- Dynamic Programming: give a solution of a problem using smaller sub-problems, e.g., a recursive solution
- Useful when the same sub-problems show up again and again in the solution

"Dynamic Programming"
- Program: "a plan or procedure for dealing with some matter" (Webster's New World Dictionary)

Dynamic Programming History
- Bellman pioneered the systematic study of dynamic programming in the 1950s.
- Etymology: dynamic programming = planning over time.
- The Secretary of Defense was hostile to mathematical research, so Bellman sought an impressive name to avoid confrontation:
  - "it's impossible to use dynamic in a pejorative sense"
  - "something not even a Congressman could object to"
- Reference: Bellman, R. E., Eye of the Hurricane: An Autobiography.

A Very Simple Case: Computing Fibonacci Numbers
- Recall F(n) = F(n-1) + F(n-2), with F(0) = 0 and F(1) = 1.
- Recursive algorithm:

    Fibo(n):
        if n = 0 then return 0
        else if n = 1 then return 1
        else return Fibo(n-1) + Fibo(n-2)
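As a concrete sketch (in Python; the function name fibo is just an illustration), the recursion above translates directly, and its running time is exponential because the same subproblems are recomputed again and again:

    def fibo(n):
        # Direct translation of F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2).
        if n == 0:
            return 0
        if n == 1:
            return 1
        return fibo(n - 1) + fibo(n - 2)

    print(fibo(10))  # 55; fibo(40) is already painfully slow because the call tree blows up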

Call Tree (Start); Full Call Tree
[Figures: the recursion tree for F(6), first partially expanded, then in full. Each F(k) calls F(k-1) and F(k-2); the leaves are F(1) = 1 and F(0) = 0. In the full tree the same subtrees, e.g. F(4), F(3), F(2), appear over and over.]

Memo-ization (Caching)
- Save all answers from earlier recursive calls.
- Before a recursive call, test to see if the value has already been computed.
- Dynamic programming: NOT memoized; instead, convert the memoized algorithm from a recursive one to an iterative one (top-down to bottom-up).

Fibonacci - Memoized Version

    initialize: F[i] ← undefined for all i
    F[0] ← 0
    F[1] ← 1

    FiboMemo(n):
        if F[n] undefined then
            F[n] ← FiboMemo(n-2) + FiboMemo(n-1)
        return F[n]
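A runnable Python sketch of the memoized version; a dictionary plays the role of the array F[] that the slide initializes to "undefined":

    memo = {0: 0, 1: 1}  # base cases F(0) = 0, F(1) = 1

    def fibo_memo(n):
        # Compute F(n) only if it is not already cached.
        if n not in memo:
            memo[n] = fibo_memo(n - 2) + fibo_memo(n - 1)
        return memo[n]

    print(fibo_memo(50))  # 12586269025, with each F(i) computed only once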

Fibonacci - Dynamic Programming Version

    FiboDP(n):
        F[0] ← 0
        F[1] ← 1
        for i = 2 to n do
            F[i] ← F[i-1] + F[i-2]
        end
        return F[n]

- For this problem, keeping only the last 2 entries instead of the full array suffices, but it is about the same speed. (A runnable sketch follows these slides.)

Dynamic Programming
- Useful when:
  - The same recursive sub-problems occur repeatedly.
  - The parameters of these recursive calls can be anticipated.
  - The solution to the whole problem can be found without knowing the internal details of how the sub-problems are solved ("principle of optimality").

Making Change
- Given: a large supply of 1¢, 5¢, 10¢, 25¢, and 50¢ coins, and an amount N.
- Problem: choose the fewest coins totaling N.
- The cashier's (greedy) algorithm works: give as many as possible of the next biggest denomination. (A greedy sketch also follows below.)

Licking Stamps
- Given: a large supply of 5¢, 4¢, and 1¢ stamps, and an amount N.
- Problem: choose the fewest stamps totaling N.
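The promised Python sketch of FiboDP, the bottom-up version: fill the table in increasing order with a simple loop, no recursion.

    def fibo_dp(n):
        # Fill F[0..n] bottom-up; F[i] depends only on the two previous entries.
        if n == 0:
            return 0
        F = [0] * (n + 1)
        F[1] = 1
        for i in range(2, n + 1):
            F[i] = F[i - 1] + F[i - 2]
        return F[n]

    # As the slide notes, keeping only the last two entries also works:
    #   a, b = 0, 1
    #   for _ in range(n): a, b = b, a + b
    #   return a
    print(fibo_dp(50))  # 12586269025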

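A sketch (again in Python, names mine) of the cashier's greedy algorithm: it is optimal for the US coin denominations above, but the next slides show it failing for the 5¢/4¢/1¢ stamps.

    def greedy_count(n, denominations=(50, 25, 10, 5, 1)):
        # Take as many as possible of each denomination, biggest first.
        picked = []
        for d in denominations:
            while n >= d:
                picked.append(d)
                n -= d
        return picked

    print(greedy_count(67))            # [50, 10, 5, 1, 1]: 5 coins, optimal here
    print(greedy_count(8, (5, 4, 1)))  # [5, 1, 1, 1]: 4 stamps, but 4 + 4 needs only 2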
How to Lick 27¢

    5¢ stamps   4¢ stamps   1¢ stamps   total number
        5           0           2            7
        4           1           3            8
        3           3           0            6

- Morals: greed doesn't pay; the success of the "cashier's algorithm" depends on the coin denominations.

A Simple Algorithm
- At most N stamps of each kind are needed, so try all combinations:

    for a = 0, ..., N:
        for b = 0, ..., N:
            for c = 0, ..., N:
                if 5a + 4b + c == N and a + b + c is a new min:
                    retain (a, b, c)
    output the retained triple

- Time: O(N^3)
- (Not too hard to see some optimizations, but we're after bigger fish. A direct Python rendering follows these slides.)

Better Idea
- Theorem: if the last stamp in an optimal solution has value v, then the previous stamps are an optimal solution for N - v.
- Proof: if not, we could improve the solution for N by using an optimal solution for N - v.
- So, where M(i) = minimum number of stamps totaling i¢:

    M(i) = 0                           if i = 0
    M(i) = min of:  1 + M(i-5)         if i ≥ 5
                    1 + M(i-4)         if i ≥ 4
                    1 + M(i-1)         if i ≥ 1

New Idea: Recursion
- Algorithm: compute M(N) directly from the recurrence, as a recursive procedure.
- [Figure: the recursion tree for N = 27; the root 27 calls 22, 23, and 26; 22 calls 17, 18, 21; 23 calls 18, 19, 22; 26 calls 21, 22, 25; and so on.]
- Time: > 3^(N/5)
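A direct Python rendering of the O(N^3) brute-force triple loop (function name mine):

    def fewest_stamps_brute(n):
        # Try every (a, b, c) with a 5-cent, b 4-cent, c 1-cent stamps; O(N^3) time.
        best = None
        for a in range(n + 1):
            for b in range(n + 1):
                for c in range(n + 1):
                    if 5 * a + 4 * b + c == n and (best is None or a + b + c < sum(best)):
                        best = (a, b, c)
        return best

    print(fewest_stamps_brute(27))  # (3, 3, 0): three 5-cent and three 4-cent stamps, 6 total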

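And the recurrence M(i), translated naively into a recursive Python function; it is correct, but the call tree grows exponentially, which is exactly what the tabulation on the next slides avoids:

    def m_recursive(i):
        # M(i) = fewest stamps totaling i cents, computed by brute recursion.
        if i == 0:
            return 0
        return min(1 + m_recursive(i - d) for d in (5, 4, 1) if i >= d)

    print(m_recursive(27))  # 6, but with roughly 3^(N/5) recursive calls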
Another New Idea: Avoid Recomputation
- Tabulate the values of solved subproblems.
- Top-down: "memoization".
- Bottom-up:

    for i = 0, ..., N do
        M[i] ← min of:  0             if i = 0
                        1 + M[i-5]    if i ≥ 5
                        1 + M[i-4]    if i ≥ 4
                        1 + M[i-1]    if i ≥ 1

- Time: O(N). (A runnable sketch follows these slides.)

Finding How Many Stamps

    i      0  1  2  3  4  5  6  7  8  ...  14
    M(i)   0  1  2  3  1  1  2  3  2  ...

- e.g., M(8) = 1 + min(M(3), M(4), M(7)) = 1 + min(3, 1, 3) = 2

Finding Which Stamps: Trace-Back
- Way 1: tabulate all; add a data structure storing back-pointers indicating which predecessor gave the min (more space, maybe less time).
- Way 2: re-compute just what's needed.
- e.g., at i = 8 the minimum 1 + min(3, 1, 3) = 2 came via M(4), so the last stamp added is a 4¢ stamp.

Trace-Back

    TraceBack(i):
        if i == 0 then return
        for d in {1, 4, 5} do
            if M[i] == 1 + M[i-d] then break
        print d
        TraceBack(i - d)
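The promised bottom-up sketch in Python: fill M[0..N] in increasing order of i, exactly as in the loop above (names mine):

    def stamp_table(n, denoms=(5, 4, 1)):
        # M[i] = fewest stamps totaling i cents; each entry looks back at most 3 entries,
        # so the whole table costs O(N) time.
        M = [0] * (n + 1)
        for i in range(1, n + 1):
            M[i] = min(1 + M[i - d] for d in denoms if i >= d)
        return M

    M = stamp_table(27)
    print(M[:9])  # [0, 1, 2, 3, 1, 1, 2, 3, 2], matching the table on the slide
    print(M[27])  # 6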

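A runnable sketch of the trace-back, following the pseudocode above; it rebuilds the table M for N = 27 (the same loop as the previous sketch) and then walks back through it:

    def trace_back(M, i, denoms=(1, 4, 5)):
        # Find a denomination d whose subproblem gave the minimum, report it,
        # then continue from i - d, as in the slide's TraceBack.
        # (The i >= d check guards against negative indices, which the
        # slide's pseudocode leaves implicit.)
        if i == 0:
            return
        for d in denoms:
            if i >= d and M[i] == 1 + M[i - d]:
                break
        print(d)
        trace_back(M, i - d)

    N = 27
    M = [0] * (N + 1)
    for i in range(1, N + 1):
        M[i] = min(1 + M[i - d] for d in (5, 4, 1) if i >= d)

    trace_back(M, N)  # prints 4, 4, 4, 5, 5, 5: six stamps totaling 27 cents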
Elements of Dynamic Programming
- What feature did we use? What should we look for to use it again?
- "Optimal substructure": the optimal solution contains optimal solutions to subproblems.
  (A non-example: minimizing the number of stamps mod 2.)
- "Repeated subproblems": the same subproblems arise in various ways.

Complexity Note
- O(N) is better than O(N^3) or O(3^(N/5)).
- But it is still exponential in the input size, which is log N bits. (E.g., miserable if N is 64 bits: roughly c * 2^64 steps and 2^64 memory.)
- Note: this problem can be done in O(1) for 5¢, 4¢, and 1¢ stamps, but not in general; see "NP-Completeness" later.
