
CS 312: Algorithm Analysis



  1. This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License. CS 312: Algorithm Analysis Lecture #22: Intro. to Dynamic Programming Slides by: Eric Ringger, with contributions from Mike Jones, Eric Mercer, Sean Warnick

  2. Objectives • Motivate Dynamic Programming (DP) • Understand the tabular approach to DP • Use DP to solve some example problems • Extract the composition of a solution from a DP table

  3. Families of Algorithms • Introductory • Importance of algorithm analysis, seen through RSA • Elegant Design Principles • Divide-and-Conquer • Graph Exploration • Greedy • Drawback: only usable on very specific types of problems • Need: “Sledgehammers of the algorithms craft” • Dynamic Programming • Linear Programming

  4. Divide & Conquer A B C B E F G E G H E F G

  5. A B C B E F G E G H E F G Dynamic Programming start solving sub-problems at the bottom.

  6. Dynamic Programming A E solutionE F solutionF G solutionG B C B E F G E G H E F G Build a table of results as you go

  7. Dynamic Programming A E solutionE F solutionF G solutionG B solutionB B C B E F G E G H E F G Avoid recomputing intermediate results;Consult the table

  8. Dynamic Programming A E solutionE F solutionF G solutionG B solutionB H solutionH B C B E F G E G H E F G Avoid recomputing intermediate results;Consult the table

  9. A B C B E F G E G H E F G Dynamic Programming A C B H E F G

  10. DP Key Idea #1 Not every sub-problem is new. Save time: retain prior results.

  11. DP Key Idea #2 • Devise a minimal description for any problem instance and sub-problem • Use this description as key to a table

  12. Dynamic Programming • Identify sub-problems • Define their minimal description • Perspective #1: Divide & Conquer with Memory Table • Solve the sub-problems one by one, smallest first • Store the solutions to sub-problems in a table and reuse the solutions to solve larger sub-problems • Until the top (original) problem instance is solved • Perspective #2: DAG • Nodes are sub-problems; edges are dependencies between the sub-problems • Linearize! • Solve the sub-problems one by one in the linearized order

  13. Example: Binomial Coefficients How many ways are there to choose k items from a set of n items?
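
The count in question is the binomial coefficient C(n, k), which satisfies C(n, k) = C(n - 1, k - 1) + C(n - 1, k) with base cases C(n, 0) = C(n, n) = 1: either the first item is in the chosen set or it is not. A direct recursive translation, sketched here in Python purely for illustration, recomputes the same sub-problems over and over, exactly as the call tree on the next slide shows:

    def C(n, k):
        # Naive recursion: correct, but exponentially wasteful,
        # because sub-problems such as C(3, 2) are recomputed.
        if k == 0 or k == n:
            return 1
        return C(n - 1, k - 1) + C(n - 1, k)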

  14. Example: C(5,3) [full recursion tree for C(n,k): C(5,3) calls C(4,2) and C(4,3), which call C(3,1), C(3,2), C(3,2), C(3,3), and so on down to leaves C(1,0), C(1,1), and the value 1; sub-problems such as C(3,2) and C(2,1) appear more than once]

  15. C(5,3): Be Wise [DAG with one node per distinct sub-problem: C(5,3), C(4,2), C(4,3), C(3,1), C(3,2), C(3,3), C(2,0), C(2,1), C(2,2), C(1,0), C(1,1), so each is solved exactly once]

  16. From DAG to Table • What is the minimal description? • Can you embed this DAG in a table? [same DAG of distinct sub-problems from C(1,0) and C(1,1) up to C(5,3)] • Linearize the DAG, embed it in a table, and evaluate in the same order as before

  17. C(5, 3) [table of C(n, k) values with rows indexed by n and columns by k] Does this look familiar?
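
As a concrete illustration of the embedding (not taken from the slides), the table can be filled row by row in Python, with rows indexed by n and columns by k:

    def binomial_table(n, k):
        # C[i][j] = number of ways to choose j items from a set of i items
        C = [[0] * (k + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            for j in range(min(i, k) + 1):
                if j == 0 or j == i:
                    C[i][j] = 1                              # base cases
                else:
                    C[i][j] = C[i - 1][j - 1] + C[i - 1][j]  # Pascal's rule
        return C[n][k]

    # binomial_table(5, 3) == 10

Each row depends only on the row above it, and the filled table is Pascal's triangle laid out rectangularly, which is the point of the next slide.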

  18. Pascal’s Triangle • Blaise Pascal (1623-1662) • Second person to invent the calculator • Religious philosopher • Mathematician and physicist

  19. “Dynamic Programming” • Very little to do with writing code • Developed in the area of Operations Research • Conceived to optimally plan multi-stage processes • Then, “programming” = “planning” • Solves an optimization problem • Described by recurrence relations among problems and sub-problems • Possibly time-variant • i.e. a dynamic system • In the CS setting, often time-invariant • Coined by Bellman in the 1950s

  20. DP: From Problem to Table to Algorithm • Start with a problem definition • Devise a minimal description for any problem instance and sub-problem • Define a recurrence to specify the relationship of problems to sub-problems • i.e., Define the conceptual DAG on sub-problems • Embed the DAG in a table • Use the index variables • 2-D case: index rows and columns • Two possible strategies • Fill in the sub-problem cells, proceeding from the smallest to the largest • Draw the DAG in the table from the top problem down to the smallest sub-problems; solve the relevant sub-problems in their table cells, from smallest to largest, i.e., solve the top problem recursively, using the table as a memory function

  21. Making Change using DP • Given coin denominations d and an amount N, what’s the smallest number of coins required to make change for N? • Example: • m = 4 different denominations of coins • Values d = [1, 2, 4, 7] – Look familiar? • In general, does the greedy algorithm always give the optimal answer? • Greedy can fail. • Example: • d = [1, 10, 18], N = 20 • Goal: Solve every case using some other strategy • How about Dynamic Programming?
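
Working the failure case through: with d = [1, 10, 18] and N = 20, greedy takes the largest coin first, 18, leaving 2, and then two 1s, for three coins in total, while the optimal answer is 10 + 10, only two coins.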

  22. Making Change using DP • Problem: Given coin denominations d and an amount N, what’s the smallest number of coins to make change for N? • Define sub-problem representation: • Let c[i, j] = min. number of coins needed to make change for amount j using coins of type 1 through i • Coins have denominations in the array d[1..n]; assume d[1] = 1 • Basic ideas for relationship among sub-problems as a recurrence relation: • If i = 1, then take j coins of denomination d[1] = 1 • If i > 1, either take a coin of value d[i] or do not take one, and solve the problem (make change) for the balance • Take out a piece of scratch paper and write a mathematical version of the recurrence relation

  23. Making Change using DP • If i = 1, then take j coins of denomination d[1] = 1: c[1, 0] = 0 and c[1, j] = 1 + c[1, j - d[1]] (or +infinity when 0 < j < d[1]) • If i > 1, either take a coin of value d[i] or do not: c[i, j] = c[i-1, j] when j < d[i], and c[i, j] = min(c[i-1, j], 1 + c[i, j - d[i]]) otherwise

  24. Making Change using DP • Design the table to contain the sub-problem results from the recurrence relation • Design the DP algorithm to walk the table properly • Solve an example by running the DP algorithm • Modify the resulting algorithm for space and time, if necessary

  25. Example • Given j, what’s the smallest number of “coins” to make j in change?

  26. Making Change [DP table c with rows indexed by i (coin types 1..n) and columns by j (amounts 0..N)] How much space? How much time?
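
One way to answer the slide's questions: the table has n rows and N + 1 columns, and each cell is filled in constant time from the recurrence, so both the space and the running time are in O(nN).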

  27. Algorithm in Pseudo-code

  function coins(d, N)
      Input: Array d[1..n] specifies the coinage; N is the number of units for which to make change
      Output: Minimum number of coins needed to make change for N units using coins from d
      array c[1..n, 0..N]
      for i = 1 to n do
          c[i, 0] = 0
      for i = 1 to n do
          for j = 1 to N do
              if i = 1 and j < d[i] then c[i, j] = +infinity
              else if i = 1 then c[i, j] = 1 + c[1, j - d[1]]
              else if j < d[i] then c[i, j] = c[i-1, j]
              else c[i, j] = min(c[i-1, j], 1 + c[i, j - d[i]])
      return c[n, N]

  Easy translation from table + recurrence to pseudo-code!
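
A direct Python translation of the slide's pseudo-code, offered as a sketch (zero-based indexing replaces the slide's 1-based arrays, and d[0] is assumed to be 1 so change can always be made):

    import math

    def coins(d, N):
        # c[i][j]: fewest coins for amount j using denominations d[0..i]
        n = len(d)
        c = [[0] * (N + 1) for _ in range(n)]
        for i in range(n):
            for j in range(1, N + 1):
                if i == 0:
                    # only the smallest denomination is available
                    c[i][j] = math.inf if j < d[0] else 1 + c[0][j - d[0]]
                elif j < d[i]:
                    c[i][j] = c[i - 1][j]                  # coin i does not fit
                else:
                    c[i][j] = min(c[i - 1][j],             # skip coin i
                                  1 + c[i][j - d[i]])      # use coin i
        return c[n - 1][N]

    # coins([1, 10, 18], 20) == 2, the case where greedy returned 3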

  28. Pre-computing vs. on-the-fly • Eager: Pre-computing • Fill the table bottom-up, then extract solution • Lazy: On-demand • Build the DAG (in the table) top-down • Solve only the necessary table entries (again from bottom-up)
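
A minimal sketch of the lazy, on-demand alternative for the same change-making problem, again in Python and again an illustration rather than the lecture's code: memoize the recurrence and let the recursion visit only the table entries the top problem actually needs.

    from functools import lru_cache

    def coins_lazy(d, N):
        @lru_cache(maxsize=None)
        def c(i, j):
            # fewest coins for amount j using denominations d[0..i]
            if j == 0:
                return 0
            if i == 0:
                return float('inf') if j < d[0] else 1 + c(0, j - d[0])
            if j < d[i]:
                return c(i - 1, j)
            return min(c(i - 1, j), 1 + c(i, j - d[i]))
        return c(len(d) - 1, N)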

  29. Comparison • How does DP algorithm compare to greedy? • speed • space • correctness • simplicity

  30. Assignment • HW #15, #16 • Greedy: Divisible Knapsack • Dynamic Programming: (coin problem) • Project #5 • Gene Sequence Alignment (DP)
