CS 155, Programming Paradigms, Fall 2014, SJSU: Design techniques
Jeff Smith
Algorithm design techniques
• Several classes of algorithms exist, based on the technique used for their design:
  • divide-and-conquer algorithms
  • dynamic programming algorithms
  • greedy algorithms
  • randomized algorithms
  • branch-and-bound (backtracking) algorithms
Divide & conquer algorithms
• Relatively easy to design
• Usually possible
  • especially when processing recursive structures
• Relatively easy to prove correct
• Analysis is often straightforward
  • e.g., by using the "Master Theorem" (CLRS, Sec. 4.1); see the worked recurrence below
• Often rather efficient
  • especially for processing recursive structures
  • but sometimes subproblems recur frequently
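As a worked instance of the Master Theorem: mergesort splits an n-element list into two halves and merges the sorted halves in linear time, so its running time satisfies T(n) = 2T(n/2) + Θ(n). Here a = 2, b = 2, and f(n) = Θ(n) = Θ(n^(log_b a)), so case 2 of the theorem applies and T(n) = Θ(n log n).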
Divide & conquer algorithms – examples
• Good examples
  • binary search
  • mergesort (sketched below)
  • Strassen's matrix multiplication (CLRS, Sec. 4.2)
• Other examples
  • selection sort, insertion sort
  • integer multiplication (Karatsuba's algorithm, an analog of Strassen's)
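A minimal mergesort sketch in Python, showing the divide, conquer, and combine steps (the code is illustrative, not taken from the lecture):

```python
def merge_sort(a):
    """Divide: split the list in half. Conquer: sort each half
    recursively. Combine: merge the two sorted halves."""
    if len(a) <= 1:
        return a                      # base case: already sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])          # append whatever remains
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

Each level of the recursion does Θ(n) work merging, which matches the recurrence worked out above.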
Dynamic programming algorithms
• Often easy to design
  • same intuitions as divide and conquer
  • but usually implemented bottom up
  • solutions to subproblems are saved in a table (sketched below)
• Usually possible
  • although not always better than divide & conquer
• Analysis is often straightforward
  • time complexity is often that of constructing the table
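A minimal bottom-up sketch for Fibonacci numbers, assuming nothing beyond standard Python; the table plays the role described above:

```python
def fib(n):
    """Bottom-up dynamic programming: fill a table of subproblem
    solutions in order, so each fib(i) is computed exactly once."""
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib(10))  # 55
```

The analysis is exactly the cost of building the table: n entries, constant work per entry, so O(n) time.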
Dynamic programming issues
• Not always highly efficient
  • except in the case of repeated subproblems
  • when they can be more efficient than divide-and-conquer versions
  • e.g., Fibonacci numbers & binomial coefficients (sketched below)
• Good examples
  • all-pairs shortest paths (CLRS, Sec. 25.2)
  • optimal binary search trees (CLRS, Sec. 15.5)
  • matrix chains (CLRS, Sec. 15.2)
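For the binomial coefficients mentioned above, the direct recursion C(n, k) = C(n-1, k-1) + C(n-1, k) solves the same subproblems over and over; the table-based sketch below (an illustrative implementation, not from the lecture) computes each entry once:

```python
def binomial(n, k):
    """C(n, k) via Pascal's triangle: c[i][j] = C(i, j).
    Each table entry is computed once, whereas the direct
    recursion would recompute entries exponentially often."""
    c = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        c[i][0] = 1                       # C(i, 0) = 1
        for j in range(1, min(i, k) + 1):
            c[i][j] = c[i - 1][j - 1] + c[i - 1][j]
    return c[n][k]

print(binomial(10, 3))  # 120
```

The divide-and-conquer version takes time proportional to C(n, k) itself; the table costs only O(nk).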
Optimal substructure
• A dynamic programming solution to a problem requires that the problem exhibit optimal substructure
• A problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems
• For such problems, we need only solve subproblems recursively
  • without worrying about how their solutions interact
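The matrix-chain problem (CLRS, Sec. 15.2) listed earlier illustrates optimal substructure directly: an optimal parenthesization of a chain splits at some position k, and the parenthesizations on each side of the split must themselves be optimal. A sketch, under the illustrative convention that matrix i has dimensions dims[i] × dims[i+1]:

```python
def matrix_chain_cost(dims):
    """Minimum scalar multiplications to compute a matrix chain,
    where matrix i is dims[i] x dims[i+1].
    m[i][j] holds the optimal cost for the subchain i..j."""
    n = len(dims) - 1
    m = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):        # subchain length
        for i in range(n - length + 1):
            j = i + length - 1
            # optimal substructure: try every split point k,
            # reusing the already-optimal costs of each side
            m[i][j] = min(m[i][k] + m[k + 1][j]
                          + dims[i] * dims[k + 1] * dims[j + 1]
                          for k in range(i, j))
    return m[0][n - 1]

print(matrix_chain_cost([10, 30, 5, 60]))  # 4500, i.e. (A*B)*C
```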
Greedy algorithms
• Often easy to design
• Often not available (at least not one that is also correct)
• Often difficult to prove correct
• Usually easy to analyze
• Usually efficient
• Good examples
  • Dijkstra's algorithm (single-source shortest paths; sketched below)
  • Prim's and Kruskal's algorithms (minimum spanning trees)
  • Huffman's algorithm (minimum-length encoding)
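A compact sketch of Dijkstra's algorithm with a binary heap, assuming non-negative edge weights; the adjacency-list dictionary is an illustrative representation choice:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths. graph maps each vertex to a
    list of (neighbor, weight) pairs with non-negative weights.
    Greedy choice: always settle the closest unsettled vertex."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry; skip
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 1, 'c': 3}
```

The greedy choice is easy to state but its correctness proof (no later path can improve a settled vertex) depends on the non-negative weights, which is typical of greedy algorithms being harder to prove correct than to design.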
Randomized algorithms
• Not always easy to design
• Not always helpful
• Not always correct
  • and the probability of correctness is often hard to determine
• Average-case analysis is often awkward
  • yet this is generally the relevant metric
• Often faster than competing algorithms
• Good example
  • quicksort (with a randomly chosen pivot; sketched below)
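A sketch of quicksort with a random pivot; choosing the pivot at random makes the expected running time O(n log n) on every input, since no fixed input can reliably trigger the worst case:

```python
import random

def quicksort(a):
    """Randomized quicksort: partition around a randomly chosen
    pivot, then sort the two sides recursively."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```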
Backtracking, best-first, and branch-and-bound algorithms
• Advantages
  • often easy to design
  • usually available
  • easy to prove correct
  • usually easy to analyze for worst-case behavior
• Why we'll cover them only if time permits
  • often the average-case behavior is what matters, but it is very difficult to analyze
  • often inefficient
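As an illustrative backtracking example (subset sum is my choice here, not one named in the lecture), with a simple branch-and-bound style pruning test; the sketch assumes non-negative inputs:

```python
def subset_sum(nums, target):
    """Find a subset of nums (assumed non-negative) summing to target.
    Backtracking: extend a partial solution one decision at a time,
    abandoning (pruning) a branch as soon as it cannot succeed."""
    suffix = [0] * (len(nums) + 1)
    for i in range(len(nums) - 1, -1, -1):
        suffix[i] = suffix[i + 1] + nums[i]    # sum of nums[i:]

    def extend(i, remaining, chosen):
        if remaining == 0:
            return chosen                      # success
        # bound: prune on overshoot, or if even taking every
        # remaining number cannot reach the target
        if i == len(nums) or remaining < 0 or remaining > suffix[i]:
            return None
        return (extend(i + 1, remaining - nums[i], chosen + [nums[i]])
                or extend(i + 1, remaining, chosen))

    return extend(0, target, [])

print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # [3, 4, 2]
```

The worst case is still exponential, which is why these algorithms are often inefficient even though the bound can make typical cases fast.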