This lecture covers greedy algorithms, divide & conquer, and dynamic programming, with examples and proofs of correctness. It also discusses the closest points problem and memoizing/dynamic programming.
CSE 326: Data Structures, Lecture #24: The Algorhythmics. Steve Wolfman, Winter Quarter 2000
Today’s Outline • Greedy • Divide & Conquer • Dynamic Programming • Randomized • Backtracking
Greedy Algorithms Repeat until the problem is solved: • Measure the options according to their marginal value • Commit to the option with the maximum marginal value Greedy algorithms are normally fast and simple. They are sometimes appropriate as a heuristic or as an approximation to the optimal solution.
Hill-Climbing (figure: a hilly curve showing how greedy ascent can stop at a local maximum instead of the global maximum)
Greed in Action • Best First Search • A* Search • Huffman Encodings • Kruskal’s Algorithm • Dijkstra’s Algorithm • Prim’s Algorithm • Scheduling
Scheduling Problem • Given: • a group of tasks {T1, …, Tn} • each with a duration {d1, …, dn} • a single processor without interrupts • Select an order for the tasks that minimizes average completion time. Example durations: T1 = 1, T2 = 1.5, T3 = 0.5, T4 = 0.3, T5 = 2, T6 = 1.4. Average time to completion: (C1 + C2 + … + Cn) / n, where Ci is the time at which task Ti finishes.
Proof of Correctness A common technique for proving the correctness of a greedy algorithm is contradiction: • assume there is a non-greedy best way • show that making that way more like the greedy solution improves the supposedly best way: contradiction! Proof of correctness for greedy scheduling: suppose an "optimal" schedule runs a longer task immediately before a shorter one. Swapping the two lets the shorter task finish earlier while the longer task (and everything after it) finishes no later, so the average completion time strictly decreases and the schedule was not optimal after all.
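As an illustration (not from the original slides), here is a minimal C++ sketch of the greedy scheduling rule, shortest task first; the name scheduleSPT and the use of doubles for durations are just assumptions for the example.

  #include <algorithm>
  #include <vector>
  using namespace std;

  // Greedy scheduling: always run the shortest remaining task next.
  // Returns the average completion time of the resulting schedule.
  double scheduleSPT(vector<double> durations) {
    // The "marginal value" of a task here is how little it delays every
    // task scheduled after it, so the greedy choice is the shortest task.
    sort(durations.begin(), durations.end());
    double clock = 0, totalCompletion = 0;
    for (double d : durations) {
      clock += d;                  // this task finishes at the current time
      totalCompletion += clock;    // add its completion time to the running total
    }
    return totalCompletion / durations.size();
  }

For the example durations {1, 1.5, 0.5, 0.3, 2, 1.4}, the greedy order is T4, T3, T1, T6, T2, T5, giving an average completion time of 17.5 / 6 ≈ 2.92.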
Divide & Conquer • Divide problem into multiple smaller parts • Solve smaller parts • Solve base cases directly • Otherwise, solve subproblems recursively • Merge solutions together (Conquer!) Often leads to elegant and simple recursive implementations.
Divide & Conquer in Action • Mergesort • Quicksort • buildHeap • buildTree • Closest points
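Mergesort is the cleanest example of the pattern above. The sketch below (not from the original slides) divides the array in half, sorts each half recursively, and merges the results.

  #include <algorithm>
  #include <vector>
  using namespace std;

  // Merge the sorted halves a[lo..mid) and a[mid..hi) back into a.
  void mergeHalves(vector<int>& a, size_t lo, size_t mid, size_t hi) {
    vector<int> tmp;
    tmp.reserve(hi - lo);
    size_t i = lo, j = mid;
    while (i < mid && j < hi) tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i < mid) tmp.push_back(a[i++]);
    while (j < hi)  tmp.push_back(a[j++]);
    copy(tmp.begin(), tmp.end(), a.begin() + lo);
  }

  // Divide in half, conquer each half recursively, then merge.
  void mergesort(vector<int>& a, size_t lo, size_t hi) {
    if (hi - lo <= 1) return;            // base case: 0 or 1 elements, already sorted
    size_t mid = lo + (hi - lo) / 2;
    mergesort(a, lo, mid);
    mergesort(a, mid, hi);
    mergeHalves(a, lo, mid, hi);
  }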
Closest Points Problem • Given: • a group of points {(x1, y1), …, (xn, yn)} • Return the distance between the closest pair of points (example figure: a scatter of points with the closest pair at distance 0.75)
Closest Points Algorithm Closest pair is: • closest pair on the left, or • closest pair on the right, or • closest pair spanning the middle runtime: checking every left point against every right point makes the combine step O(n²), so T(n) = 2 T(n/2) + O(n²) = O(n²)
Closest Points Algorithm Refined Closest pair is: • closest pair on the left, or • closest pair on the right, or • closest pair in the middle strip, comparing only points within δ of each other vertically, where δ is the smaller of the left and right closest distances runtime: with the points kept sorted by y, each strip point needs only a constant number of comparisons, so T(n) = 2 T(n/2) + O(n) = O(n log n)
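A C++ sketch of the refined algorithm (again not from the original slides; Point, closestRec, and closestPair are made-up names). To keep the code short it re-sorts the middle strip by y at every level, which gives O(n log² n) rather than the fully refined O(n log n).

  #include <algorithm>
  #include <cmath>
  #include <limits>
  #include <vector>
  using namespace std;

  struct Point { double x, y; };

  double dist(const Point& a, const Point& b) {
    return hypot(a.x - b.x, a.y - b.y);
  }

  // pts[lo..hi) is sorted by x-coordinate.
  double closestRec(const vector<Point>& pts, size_t lo, size_t hi) {
    size_t n = hi - lo;
    if (n <= 3) {                        // base case: brute force
      double best = numeric_limits<double>::infinity();
      for (size_t i = lo; i < hi; ++i)
        for (size_t j = i + 1; j < hi; ++j)
          best = min(best, dist(pts[i], pts[j]));
      return best;
    }
    size_t mid = lo + n / 2;
    double midX = pts[mid].x;
    // Closest pair on the left or closest pair on the right...
    double d = min(closestRec(pts, lo, mid), closestRec(pts, mid, hi));

    // ...or a pair spanning the middle: only points within d of the
    // dividing line can possibly beat d.
    vector<Point> strip;
    for (size_t i = lo; i < hi; ++i)
      if (fabs(pts[i].x - midX) < d) strip.push_back(pts[i]);
    sort(strip.begin(), strip.end(),
         [](const Point& a, const Point& b) { return a.y < b.y; });

    // Each strip point is checked only against strip points within d of it vertically.
    for (size_t i = 0; i < strip.size(); ++i)
      for (size_t j = i + 1;
           j < strip.size() && strip[j].y - strip[i].y < d; ++j)
        d = min(d, dist(strip[i], strip[j]));
    return d;
  }

  double closestPair(vector<Point> pts) {
    sort(pts.begin(), pts.end(),
         [](const Point& a, const Point& b) { return a.x < b.x; });
    return closestRec(pts, 0, pts.size());
  }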
Memoizing/Dynamic Programming • Define problem in terms of smaller subproblems • Solve and record solution for base cases • Build solutions for subproblems up from solutions to smaller subproblems Can improve runtime of divide & conquer algorithms that have shared subproblems with optimal substructure. Usually involves a table of subproblem solutions.
Dynamic Programming in Action • Sequence Alignment • Fibonacci numbers • All pairs shortest path • Optimal Binary Search Tree
Memoized “Divide” & Conquer Fibonacci Numbers: F(n) = F(n - 1) + F(n - 2), F(0) = 1, F(1) = 1

Plain divide & conquer (recomputes shared subproblems, so it takes exponential time):

  int fib(int n) {
    if (n <= 1) return 1;
    else return fib(n - 1) + fib(n - 2);
  }

Memoized version (records each answer in a table the first time it is computed; needs #include <vector>):

  int fib(int n) {
    static vector<int> fibs;                         // fibs[k] == 0 means "not computed yet"
    if (n <= 1) return 1;
    if (n >= (int)fibs.size()) fibs.resize(n + 1);   // grow the table as needed
    if (fibs[n] == 0) fibs[n] = fib(n - 1) + fib(n - 2);
    return fibs[n];
  }
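The same table can also be built bottom-up, with no recursion at all; a small sketch (the name fibDP is an assumption, not from the slides):

  #include <vector>
  using std::vector;

  // Build solutions up from the base cases, filling the table in order.
  int fibDP(int n) {
    if (n <= 1) return 1;
    vector<int> table(n + 1, 1);         // table[0] = table[1] = 1
    for (int i = 2; i <= n; ++i)
      table[i] = table[i - 1] + table[i - 2];
    return table[n];
  }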
Optimal Binary Search Tree Problem • Given: • a set of words {w1, …, wn} • probabilities of each word’s occurrence {p1, …, pn} • Produce a Binary Search Tree which includes all the words and has the lowest expected cost: Expected cost = p1·d1 + p2·d2 + … + pn·dn (where di is the depth of word i in the tree, counting the root as depth 1)
The Optimal BST: Shared Subproblems and Optimal Substructure (figures: example search trees built from words such as And, Ever, Falcon, First, Meet, Millenium, to, Zaphod) Can an optimal solution possibly have suboptimal subtrees?
Optimal BST Cost Let C(Left, Right) be the cost of the optimal subtree over the words wLeft through wRight. Then, trying each word wi in the range as the root: C(Left, Right) = min over Left ≤ i ≤ Right of [ C(Left, i - 1) + C(i + 1, Right) ] + (pLeft + … + pRight), with C(Left, Right) = 0 when Left > Right. Let’s maintain a 2-D table to store values of C(i, j)
Optimal BST Algorithm (figure: the 2-D table of subtree costs, with rows labeled a…, am…, and…, egg…, if…, the…, two… and columns labeled …a, …am, …and, …egg, …if, …the, …two; entries are filled in from short word ranges up to the full range)
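A C++ sketch of the table-filling algorithm (not from the original slides; optimalBST is a made-up name, and the words are assumed to already be in sorted order so only their probabilities are passed in):

  #include <algorithm>
  #include <limits>
  #include <vector>
  using namespace std;

  // cost[L][R] = expected cost of the optimal BST over words L..R (inclusive).
  // Fills the 2-D table diagonal by diagonal, from short word ranges up to
  // the whole set, and returns cost[0][n-1].
  double optimalBST(const vector<double>& p) {
    int n = p.size();
    if (n == 0) return 0;
    vector<vector<double>> cost(n, vector<double>(n, 0.0));
    for (int len = 1; len <= n; ++len) {
      for (int L = 0; L + len - 1 < n; ++L) {
        int R = L + len - 1;
        double probSum = 0;                         // pL + ... + pR
        for (int i = L; i <= R; ++i) probSum += p[i];
        double best = numeric_limits<double>::infinity();
        for (int root = L; root <= R; ++root) {     // try each word as the root
          double left  = (root > L) ? cost[L][root - 1] : 0;   // empty range costs 0
          double right = (root < R) ? cost[root + 1][R] : 0;
          best = min(best, left + right);
        }
        cost[L][R] = best + probSum;   // every word in the range sits one level deeper
      }
    }
    return cost[0][n - 1];
  }

The triple loop makes this O(n³) time with an O(n²) table of subproblem solutions.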
To Do • Finish Project IV!!!!!! • Read Chapter 10 (algorithmic techniques) • Work on final review (remember you can and should work with others)
Coming Up • Course Wrap-up • Project IV due (March 7th; that’s tomorrow!) • Final Exam (March 13th)