CS 155, Programming Paradigms, Fall 2014, SJSU
Divide-and-conquer
Jeff Smith
Divide-and-conquer and the Master Theorem • Recall that the Master Theorem gives the asymptotic time complexity of many divide-and-conquer algorithms • once a recurrence is available • this theorem is Theorem 4.1, p. 94, of CLRS • Several of our divide-and-conquer examples will need to go beyond the Master Theorem • but it will still help to see what the Master Theorem has to say
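For reference, here is the shape of the result (paraphrasing CLRS Theorem 4.1) for a recurrence T(n) = a T(n/b) + f(n) with constants a ≥ 1 and b > 1:

\[
\begin{aligned}
&\text{Case 1: } f(n) = O\!\left(n^{\log_b a-\varepsilon}\right) \text{ for some } \varepsilon>0
  &&\Longrightarrow\ T(n) = \Theta\!\left(n^{\log_b a}\right)\\
&\text{Case 2: } f(n) = \Theta\!\left(n^{\log_b a}\right)
  &&\Longrightarrow\ T(n) = \Theta\!\left(n^{\log_b a}\lg n\right)\\
&\text{Case 3: } f(n) = \Omega\!\left(n^{\log_b a+\varepsilon}\right) \text{ for some } \varepsilon>0, \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c<1
  &&\Longrightarrow\ T(n) = \Theta(f(n))
\end{aligned}
\]

The examples on the following slides simply identify a, b, and f(n) and check which case applies.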
Searching and sorting examples • For binary search (worst case), • the recurrence is T(n) = T(n/2) + c • here a=1 and b=2, so case 2 applies • and T(n) is Θ(log n) • For mergesort, • the recurrence is T(n) = 2T(n/2) + cn • here a=b=2, so case 2 applies • and T(n) is Θ(n log n)
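A quick check of both examples against case 2 (just a sketch of the bookkeeping):

\[
\text{binary search: } a=1,\ b=2,\ n^{\log_2 1}=1,\ f(n)=\Theta(1)
  \;\Longrightarrow\; T(n)=\Theta(\lg n)
\]
\[
\text{mergesort: } a=2,\ b=2,\ n^{\log_2 2}=n,\ f(n)=\Theta(n)
  \;\Longrightarrow\; T(n)=\Theta(n\lg n)
\]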
Matrix multiplication • For the matrix multiplication algorithm of pp. 76ff, Section 4.2, CLRS, • the recurrence is T(n) = 8T(n/2) + cn^2 • here a=8 and b=2, so case 1 applies • and T(n) is Θ(n^(log_2 8)), or Θ(n^3) • For Strassen’s version of the algorithm, 8 is replaced by 7, so • a=7 and T(n) is Θ(n^(log_2 7)) • so Strassen’s version is asymptotically faster
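For concreteness, a minimal Python/NumPy sketch of the 8-multiplication scheme (function and variable names are mine, not CLRS's; it assumes n is a power of 2; Strassen's version would replace the 8 recursive products with 7 products of sums):

import numpy as np

def matmul_dc(A, B):
    # Divide-and-conquer square-matrix multiply (8 recursive products),
    # in the spirit of CLRS Sec. 4.2.  Sketch only: assumes A and B are
    # n x n NumPy arrays with n a power of 2.
    n = A.shape[0]
    if n == 1:
        return A * B                      # 1x1 base case
    m = n // 2
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    C = np.empty_like(A)
    # 8 recursive multiplications plus Θ(n^2) additions:
    # T(n) = 8T(n/2) + cn^2, which is Θ(n^3) by case 1.
    C[:m, :m] = matmul_dc(A11, B11) + matmul_dc(A12, B21)
    C[:m, m:] = matmul_dc(A11, B12) + matmul_dc(A12, B22)
    C[m:, :m] = matmul_dc(A21, B11) + matmul_dc(A22, B21)
    C[m:, m:] = matmul_dc(A21, B12) + matmul_dc(A22, B22)
    return C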
Iterative deepening • Iterative deepening is a search technique commonly used in AI. • It involves repeated traversals of a binary tree to a maximum depth that's one level deeper at each iteration. • Although some tree nodes are visited multiple times, the number of visits is linear in the number of nodes.
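A minimal Python sketch of the idea (names are mine, not from any particular AI text); each iteration repeats a depth-limited preorder traversal, one level deeper than before:

from collections import namedtuple

Node = namedtuple("Node", ["value", "left", "right"])   # left/right may be None

def depth_limited_visit(node, limit, visit):
    # Preorder traversal that goes no deeper than the given depth limit.
    if node is None or limit < 0:
        return
    visit(node.value)
    depth_limited_visit(node.left, limit - 1, visit)
    depth_limited_visit(node.right, limit - 1, visit)

def iterative_deepening(root, max_depth, visit):
    # Traverse to depth 0, then depth 1, ..., up to max_depth.
    # Nodes near the root are revisited on every iteration.
    for limit in range(max_depth + 1):
        depth_limited_visit(root, limit, visit)

# Example: iterative_deepening(Node(1, Node(2, None, None), Node(3, None, None)), 1, print)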
Analysis of iterative deepening • To find T(n) in a simple case, assume • the tree has n nodes, • each nonleaf has 2 children, and • T(n) is the total number of visits. • Then T(n) = T(n/2) + n, since • about n/2 nodes lie above the lowest level, • n nodes are visited in the last iteration, and • T(n/2) visits are made in the earlier iterations • By the Master Theorem (case 3), T(n) is Θ(n).
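Filling in the Master Theorem bookkeeping (a sketch; here f(n) = n dominates n^(log_b a) = 1, so case 3 is the relevant one):

\[
a=1,\ b=2,\ n^{\log_2 1}=1,\qquad f(n)=n=\Omega\!\left(n^{0+\varepsilon}\right),
\]
\[
a\,f(n/b)=\tfrac{n}{2}=\tfrac12\,f(n)\ \text{(regularity condition holds)}
  \;\Longrightarrow\; T(n)=\Theta(f(n))=\Theta(n).
\]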
Beyond the Master Theorem • Useful strategies and CS 146 applications • recursion trees (CLRS, Sec. 4.4; Smith pp. 182-3) • mergesort’s behavior is described by a merge tree • its height is Θ(log n) and each level takes time Θ(n) • see my text, pp. 182-3, for an example • or just repeated substitution, e.g. for heapsort • T(n) ≤ T(n-1) + c lg n ≤ T(n-2) + c [lg n + lg (n-1)] • eventually T(n) ≤ c Σ_{k≤n} lg k, which is O(n lg n) by the integral test (sketched below) • guessing & verifying a solution (my Sec 2.16) • CLRS, Sec 4.3, calls this substitution • e.g., for quicksort, guess that it’s O(n log n) on average
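The last step of the heapsort bound, sketched via the integral test (since lg is increasing, each term lg k is at most the integral of lg x over [k, k+1]):

\[
T(n) \;\le\; c\sum_{k=2}^{n}\lg k \;\le\; c\int_{2}^{\,n+1}\lg x\,dx
  \;=\; c\left[\,x\lg x - \frac{x}{\ln 2}\,\right]_{2}^{\,n+1} \;=\; O(n\lg n).
\]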
Beyond the Master Theorem, pt. 2 • CS 155 applications in need of strategies • simultaneous max & min • AVL tree height • internal path length of a binary tree • helpful for analyzing heap operations • exact solutions (vs. asymptotic solutions) • The second and third of these are described on a later slide • We begin discussion of the others below • and continue on the later slide
Exact vs. asymptotic solutions • Consider the problem of finding both the maximum and minimum of a set of numbers. • A naïve algorithm for finding the maximum uses n-1 comparisons • Similarly, n-1 comparisons suffice to find the minimum, so 2n-2 comparisons are enough in total • Any algorithm for the problem takes Ω(n) time, since it must look at all the numbers. • So the best algorithm takes time Θ(n) • but it may not need the full 2n-2 comparisons
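As a baseline, a minimal Python sketch of the naive approach (my own naming), which makes exactly 2(n-1) comparisons on a list of n numbers:

def naive_max_min(xs):
    # Scan once, comparing every later element to the current max and
    # then to the current min: 2(n-1) comparisons in all.
    largest = smallest = xs[0]
    for x in xs[1:]:
        if x > largest:
            largest = x
        if x < smallest:
            smallest = x
    return largest, smallest

# Example: naive_max_min([3, 1, 4, 1, 5]) returns (5, 1)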
Faster algorithms for simultaneous max/min • Divide-and-conquer algorithms exist that use fewer than 2n-2 comparisons. • It’s tempting to repeatedly divide the n items into two subsets of equal size • but n may not be a power of 2 • and since we want an exact solution, we can’t round n up to a power of 2 by using dummy items • So we use subsets of size 2 and n-2 • and then the corresponding recurrence relation is T(n) = T(2) + T(n-2) + 2; T(1) = 0; T(2) = 1 • where the extra 2 comparisons combine the two subsets’ maxima and minima
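A sketch of what this recurrence gives for even n (substituting T(2) = 1 and unwinding):

\[
T(n) = T(2) + T(n-2) + 2 = T(n-2) + 3,\qquad T(2)=1,
\]
\[
T(n) = 1 + 3\cdot\frac{n-2}{2} = \frac{3n}{2} - 2 \quad\text{for even } n,
\]

which is fewer than the 2n-2 comparisons of the naive method.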