
A little bit of recurrences here.. From CLRS. Dr. M. Sakalli, Marmara University


Presentation Transcript


  1. A little bit of recurrences here.. From CLRS. Dr. M. Sakalli, Marmara University Picture May 5th 2006, RPI, NY

  2. Plan for Analysis of Recursive Algorithms • Decide on a parameter indicating an input's size. • Identify the algorithm's basic operation. • Check whether the number of times the basic op. is executed may vary on different inputs of the same size. (If it may, the worst, average, and best cases must be investigated separately.) • Set up a recurrence relation with an appropriate initial condition expressing the number of times the basic op. is executed. • Solve the recurrence (or, at the very least, establish its solution's order of growth) by backward substitutions or another method.

  3. Three methods for solving recurrences -- obtaining Θ or big-O bounds on T(n): • The substitution method: make a guess and then prove the guess is correct by induction. • The recursion-tree method: convert the recurrence into a tree of costs & sum them. • The "master method": gives bounds for recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is a given positive function.

  4. Substitution method, two steps: - Make a guess for the solution. - Use induction to prove that the guess holds, and check the valid initial conditions. To determine an upper bound: 1- for a choice of c > 0, prove that 2- f(n) ≤ c g(n) for every n ≥ n0 (here, T(n) ≤ c n lg(n)). (2) Then f(n) = O(g(n)): we say "f(n) is big-O of g(n)." As n increases, f(n) grows no faster than g(n). In other words, g(n) is an asymptotic upper bound on f(n) for c > 0 and n ≥ n0.

  5. Big- notation To Determine lower bound 1- for a choice of c>0, prove that 2- c g(n) ≤ f(n), for all n≥n0, c n ln(n) ≤ T(n) (2) We say that “f(n) is omega of g(n).” As n increases, f(n) grows no slower than g(n). In other words, g(n) is an asymptotic lower bound on f(n).

  6. Prove n^2 + n = O(n^3); or the question could have been: prove that f(n) = n^2 + n is big-O of g(n) = n^3, i.e. f(n) ≤ c g(n) for all n > n0. For n0 = 1 and c ≥ 2 this gives n^2 + n ≤ 2n^3 for all n > n0 -- end of proof. Or we could have chosen n0 = 2 and c ≥ 1.

  7. The same example: determine an upper bound for T(n) = 2T(n/2) + n. (1) 1- For a choice of c, prove that 2- f(n) ≤ c g(n) for every n ≥ n0; the guess made here is g(n) = n lg(n), so we want T(n) ≤ c n lg(n). (2) Assume the bound holds for floor(n/2), i.e. T(n/2) ≤ c (n/2) lg(n/2), and substitute into the equation above: T(n) ≤ 2[c (n/2) lg(n/2)] + n = c n lg(n/2) + n = c n lg(n) - c n lg(2) + n = c n lg(n) - c n + n ≤ c n lg(n), which holds if c ≥ 1. (2) -To complete the inductive proof, determine c and n0 and show the inequality holds for the boundary conditions. -T(1) > 0, but lg(1) = 0, so the base case n = 1 does not satisfy the bound. -Then work between equations (1) and (2) with n = 2 and n = 3 as base cases: T(2) = 2T(1) + 2 ≤ c·2 lg(2) and T(3) = 2T(1) + 3 ≤ c·3 lg(3) must hold. -Determining c requires establishing the base cases in terms of T(1): (2T(1) + 2)/(2 lg(2)) = T(1) + 1 ≤ c and (2T(1) + 3)/(3 lg(3)) ≤ c. -Choose the maximum of {T(1) + 1, (2T(1) + 3)/(3 lg(3))}, which is T(1) + 1; therefore for T(1) = 1, c = 2.
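As a sanity check (outside the proof itself), the recurrence and the guessed bound can be compared numerically; a minimal Python sketch, assuming T(1) = 1 and exact powers of two, with the slide's constant c = 2:

```python
def T(n):
    # T(n) = 2*T(n/2) + n with base case T(1) = 1, for n an exact power of 2
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2
for k in range(1, 16):
    n = 2 ** k
    # bound is c * n * lg(n); lg(2**k) = k, so no floating point is needed
    assert T(n) <= c * n * k, (n, T(n), c * n * k)
```

Unrolling the recurrence gives T(n) = n lg(n) + n for these inputs, so the bound 2 n lg(n) holds for every n ≥ 2, as the proof predicts.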

  8. Sometimes manipulating variables can make an unknown recurrence look like a familiar equation. For example, consider the recurrence: T(n) = 2T(√n) + lg n. Let m = lg(n), so n = 2^m: T(2^m) = 2T(2^(m/2)) + m. And now, let S(m) = T(2^m): S(m) = 2S(m/2) + m. Solution!! S(m) = O(m lg m). Changing back to T(n): T(n) = T(2^m) = S(m) = O(lg n lg lg n).
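The change of variables can be exercised numerically on n of the form 2^(2^k), where √n is again such a number; a sketch assuming the base case T(2) = 1 (my choice -- the slide leaves it unstated):

```python
import math

def T(n):
    # T(n) = 2*T(sqrt(n)) + lg n with T(2) = 1, for n = 2**(2**k)
    if n == 2:
        return 1
    return 2 * T(math.isqrt(n)) + int(math.log2(n))

for k in range(1, 6):                              # n = 4, 16, 256, 65536, 2**32
    n = 2 ** (2 ** k)
    g = math.log2(n) * math.log2(math.log2(n))     # lg n * lg lg n
    assert T(n) <= 2 * g, (n, T(n), g)
```

The check stays on "doubly exact" powers so that the square roots are exact, mirroring the exact-powers assumption used in the proofs later in the deck.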

  9. Example 1: Substitution method: recursive evaluation of n!. Definition: n! = 1 · 2 · … · (n-1) · n for n ≥ 1, and 0! = 1. Recursive definition of n!: F(n) = F(n-1) · n for n ≥ 1, and F(0) = 1. Size: n. Basic operation: multiplication. Recurrence relation: M(n) = M(n-1) + 1 for n > 0, M(0) = 0. Backward substitution: M(n) = M(n-1) + 1 = M(n-2) + 1 + 1 = … = M(n-i) + i; at i = n, M(n) = n.
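The count M(n) = n can be confirmed by instrumenting the recursive factorial; a small Python sketch (the mutable counter argument is my own device, not part of the slide):

```python
import math

def fact(n, count):
    # F(n) = F(n-1) * n, F(0) = 1; count[0] tallies multiplications
    if n == 0:
        return 1
    count[0] += 1
    return fact(n - 1, count) * n

for n in range(20):
    count = [0]
    assert fact(n, count) == math.factorial(n)
    assert count[0] == n        # M(n) = n, as the telescoping shows
```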

  10. Example 3: Counting #bits. A(2^k) = A(2^(k-1)) + 1, A(2^0) = 0. Backward substitution: A(2^k) = A(2^(k-1)) + 1 = [A(2^(k-2)) + 1] + 1 = … = A(2^(k-k)) + k = k. With n = 2^k, k = log2(n), so A(n) ∈ Θ(log n).
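The closed form also holds for general n with the recurrence A(n) = A(⌊n/2⌋) + 1, A(1) = 0 (an assumption: the slide states it only for exact powers of two); a quick Python check against the bit length:

```python
def A(n):
    # number of halvings until reaching 1: A(1) = 0, A(n) = A(n // 2) + 1
    return 0 if n == 1 else A(n // 2) + 1

for n in range(1, 10_000):
    assert A(n) == n.bit_length() - 1    # == floor(log2 n)
```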

  11. The recursion-tree method. In a recursion tree, each node represents the cost of a single recursive call somewhere in the algorithm's recursive invocation, ignoring the cost of further recursive calls from within it, which are accounted for by the "child" costs of that node. The node costs are summed to get the total cost at each level, and finally the per-level costs are summed. Recursion trees are especially useful for the T(n) of a divide-and-conquer algorithm; they are used to generate good guesses, which can then be verified by the substitution method. Hence, a bit of "sloppiness" is to be tolerated. Find an upper bound for T(n) = 3T(n/4) + Θ(n^2). Ignore the floor function and non-integer values of n/4, so that T(n) = 3T(n/4) + cn^2 for some c > 0. cn^2 / | \ T(n/4) T(n/4) T(n/4) Continue this process until reaching the boundary, a subproblem of size 1 at depth i: n/4^i = 1, so i = log4(n) = (lg n)/2. Thus the tree has log4(n) + 1 levels: 0, 1, 2, ..., log4(n). The non-recursive cost per node decreases by a factor of 16 from level to level, while there are three times as many nodes as in the previous level: at level i the cost is c(n/4^i)^2 per node over 3^i branches, for 3^i · c(n/4^i)^2 = (3/16)^i cn^2 in total.
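The level-by-level estimate can be cross-checked by evaluating the recurrence exactly on powers of 4; a sketch assuming T(1) = 1 and c = 1 (both my choices for the check):

```python
def T(n):
    # T(n) = 3*T(n/4) + n**2 with T(1) = 1, for n a power of 4
    return 1 if n == 1 else 3 * T(n // 4) + n * n

for k in range(1, 12):
    n = 4 ** k
    # the geometric sum gives ~(16/13)*n^2 plus lower-order terms,
    # so 2*n^2 is a comfortable ceiling
    assert T(n) <= 2 * n * n, (n, T(n))
```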

  12. At the last level, of depth log4(n), there are 3^(log4 n) = n^(log4 3) nodes, each of cost Θ(1), for a total cost of Θ(n^(log4 3)). Thus the total cost of all levels is: T(n) = Σ_{i=0}^{log4(n)-1} (3/16)^i cn^2 + Θ(n^(log4 3)) < Σ_{i=0}^{∞} (3/16)^i cn^2 + Θ(n^(log4 3)) = (1/(1 - 3/16)) cn^2 + Θ(n^(log4 3)) = (16/13) cn^2 + Θ(n^(log4 3)) = O(n^2), which gives the guess that would have been chosen at first hand. Now use the substitution method to verify the guess T(n) = O(n^2) as an upper bound for T(n) = 3T(n/4) + Θ(n^2). Need to show T(n) ≤ dn^2 for some d > 0. Letting c be as before, we have: T(n) ≤ 3T(n/4) + cn^2 ≤ 3d(n/4)^2 + cn^2 (by the inductive hypothesis) = (3/16)dn^2 + cn^2 ≤ dn^2, choosing d ≥ (16/13)c.

  13. T(n) = Θ(n^(logb a)) + Σ_{j=0}^{logb(n)-1} a^j f(n/b^j)

  14. The master theorem. Let a ≥ 1 and b > 1, let f(n) be asymptotically positive!!!, and let T(n) be defined by T(n) = aT(n/b) + f(n), where n/b can be interpreted to be rounded either up or down. Then T(n) can be bounded asymptotically: • If f(n) = O(n^(logb(a)-ε)) for some ε > 0, then T(n) = Θ(n^(logb a)). • If f(n) = Θ(n^(logb a)), then T(n) = Θ(n^(logb a) lg(n)). • If f(n) = Ω(n^(logb(a)+ε)) for some ε > 0, and if the regularity condition a f(n/b) ≤ c f(n) holds for some c < 1 and all n > n0 for some n0 > 0, then T(n) = Θ(f(n)). When the driving cost f(n) is not polynomially comparable to n^(logb a), a limited 4th condition allows considering polylog factors; to cover the gaps between cases 1-2 and 2-3, the corollary below. Corollary: if f(n) = Θ(n^(logb a) lg^k(n)) with k ≥ 0, then T(n) = Θ(n^(logb a) lg^(k+1)(n)). For example, T(n) = 2T(n/2) + n lg n: n^(lg 2) = n vs. n lg n. f(n) = n lg n is asymptotically larger than n but smaller than n^(1+ε) for any ε > 0, so none of the three cases applies. Here a = 2, b = 2, f(n) is non-polynomial, and f(n) = Θ(n lg n), k = 1, therefore T(n) = Θ(n lg^2 n)!!

  15. If f(n) = n^d there is no need to check regularity; just compare the two sides. Master theorem: if T(n) = aT(n/b) + O(n^d) for some constants a ≥ 1, b > 1, d ≥ 0, then T(n) = Θ(n^(logb a)) if d < logb(a) (a > b^d); T(n) = Θ(n^d lg n) if d = logb(a) (a = b^d); T(n) = Θ(n^d) if d > logb(a) (a < b^d). Why? The proof uses a recursion-tree argument. case 1: f(n) is "polynomially smaller" than Θ(n^(logb a)); case 2: f(n) is "asymptotically equal" to Θ(n^(logb a)); case 3: f(n) is "polynomially larger" than Θ(n^(logb a)). Corollary (case 4): if f(n) = Θ(n^(logb a) lg^k n), then T(n) = Θ(n^(logb a) lg^(k+1) n). (as exercise)
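The three-way comparison in this simplified form is mechanical enough to code; a helper sketch (the function name and output strings are my own) classifying T(n) = aT(n/b) + Θ(n^d):

```python
import math

def master(a, b, d):
    # simplified master theorem for T(n) = a*T(n/b) + Theta(n**d)
    crit = math.log(a) / math.log(b)          # critical exponent log_b(a)
    if abs(d - crit) < 1e-9:                  # a == b**d: balanced, case 2
        return "Theta(n^%g lg n)" % d
    if d < crit:                              # a > b**d: leaves dominate, case 1
        return "Theta(n^%.3f)" % crit
    return "Theta(n^%g)" % d                  # a < b**d: root dominates, case 3

print(master(9, 3, 1))    # case 1
print(master(2, 2, 1))    # case 2, i.e. Theta(n lg n)
print(master(1, 2, 2))    # case 3
```

Equality is tested with a tolerance because log_b(a) is computed in floating point; the three branches mirror cases 1-3 above.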

  16. Exp1: T(n) = 9T(n/3) + n • f(n) = n, n^(logb a) = n^2 • compare f(n) = n with the cost of the recursion: n = O(n^(2-ε)) (ε = 1, so f(n) is polynomially smaller than n^(logb a)); case 1 applies here: T(n) = Θ(n^2). Exp2: T(n) = T(n/2) + 1 • f(n) = 1, n^(logb a) = n^0 = 1, and 1 = Θ(n^0) (the driving cost f(n) is polynomially equal to the cost of the recursion, n^(logb a)) • case 2 applies: T(n) = Θ(n^0 lg n) = Θ(lg n). Exp3: T(n) = T(n/2) + n^2 • f(n) = n^2, n^(logb a) = n^(log2 1) = n^0 = 1, and n^2 = Ω(n^(0+ε)) (0 < ε ≤ 2, so f(n) is polynomially larger) • since f(n) is a polynomial in n, case 3 applies: T(n) = Θ(n^2). Exp4: T(n) = 4T(n/2) + n^2 • f(n) = n^2, n^(logb a) = n^(lg 4) = n^2 • comparing both sides, f(n) and n^2 are polynomially equal; case 2 holds: T(n) = Θ(n^2 lg n).

  17. Exp5: T(n) = 7T(n/3) + n^2 • f(n) = n^2, n^(logb a) = n^(log3 7) = n^(1+ε) • compare f(n) = n^2 with n^(1+ε): n^2 = Ω(n^(1+ε)), so f(n) is polynomially larger • since f(n) is a polynomial in n, case 3 holds and T(n) = Θ(n^2). Exp6: T(n) = 7T(n/2) + n^2 • n^(logb a) = n^(log2 7) = n^(2+ε) • compare f(n) = n^2 with n^(2+ε): f(n) is polynomially smaller • case 1 holds and T(n) = Θ(n^(log2 7)). Exp7: T(n) = 3T(n/4) + n lg n • f(n) = n lg n, recursive side n^(log4 3) = O(n^(1-ε)) • comparing both sides, case 3 applies; since f(n) = n lg n is not a plain polynomial, we need to check the regularity condition for sufficiently large n: • a f(n/b) = 3(n/4) lg(n/4) ≤ c n lg n = c f(n) for c = 3/4 • T(n) = Θ(n lg n). • Use a recursion tree for a good guess and give a proof of your result by using the substitution method for • T(n) = 3T(n/4) + cn^2 • T(n) = T(n/3) + T(2n/3) + cn

  18. Proof of the Master Theorem, CLRS; better to read from Chapter 4 of CLRS. The proof is for exact powers, n = b^k for k ≥ 1. Lemma 4.2: let T(n) = Θ(1) if n = 1, and T(n) = aT(n/b) + f(n) if n = b^k for k ≥ 1, where a ≥ 1, b > 1, and f(n) is a nonnegative function. Then T(n) = Θ(n^(logb a)) + Σ_{j=0}^{logb(n)-1} a^j f(n/b^j). Proof: by iterating the recurrence; by recursion tree.

  19. Recursion tree for T(n) = aT(n/b) + f(n). (figure)

  20. Proof of the Master Theorem (cont.) Lemma 4.3: let a ≥ 1, b > 1, and let f(n) be a nonnegative function defined on exact powers of b. Then g(n) = Σ_{j=0}^{logb(n)-1} a^j f(n/b^j) can be bounded for exact powers of b as: If f(n) = O(n^(logb(a)-ε)) for some ε > 0, then g(n) = O(n^(logb a)). If f(n) = Θ(n^(logb a)), then g(n) = Θ(n^(logb a) lg n). If f(n) = Ω(n^(logb(a)+ε)) for some ε > 0, and if a f(n/b) ≤ c f(n) for some c < 1 and all sufficiently large n ≥ b, then g(n) = Θ(f(n)).

  21. Proof of Lemma 4.3. For case 1: f(n) = O(n^(logb(a)-ε)) implies f(n/b^j) = O((n/b^j)^(logb(a)-ε)), so g(n) = Σ_{j=0}^{logb(n)-1} a^j f(n/b^j) = O( Σ_{j=0}^{logb(n)-1} a^j (n/b^j)^(logb(a)-ε) ) = O( n^(logb(a)-ε) Σ_{j=0}^{logb(n)-1} a^j/(b^(logb(a)-ε))^j ) = O( n^(logb(a)-ε) Σ_{j=0}^{logb(n)-1} a^j/(a^j b^(-εj)) ) = O( n^(logb(a)-ε) Σ_{j=0}^{logb(n)-1} (b^ε)^j ) = O( n^(logb(a)-ε) · ((b^ε)^(logb n) - 1)/(b^ε - 1) ) = O( n^(logb(a)-ε) · (n^ε - 1)/(b^ε - 1) ) = O(n^(logb a)).

  22. Proof of Lemma 4.3 (cont.) For case 2: f(n) = Θ(n^(logb a)) implies f(n/b^j) = Θ((n/b^j)^(logb a)), so g(n) = Σ_{j=0}^{logb(n)-1} a^j f(n/b^j) = Θ( Σ_{j=0}^{logb(n)-1} a^j (n/b^j)^(logb a) ) = Θ( n^(logb a) Σ_{j=0}^{logb(n)-1} a^j/(b^(logb a))^j ) = Θ( n^(logb a) Σ_{j=0}^{logb(n)-1} 1 ) = Θ(n^(logb a) logb n) = Θ(n^(logb a) lg n).

  23. Proof of Lemma 4.3 (cont.) For case 3: since g(n) contains the term f(n) (the j = 0 term), g(n) = Ω(f(n)). Since a f(n/b) ≤ c f(n), applying the condition j times gives a^j f(n/b^j) ≤ c^j f(n) -- that is the answer to "why???". Hence g(n) = Σ_{j=0}^{logb(n)-1} a^j f(n/b^j) ≤ Σ_{j=0}^{logb(n)-1} c^j f(n) ≤ f(n) Σ_{j=0}^{∞} c^j = f(n)·(1/(1-c)) = O(f(n)). Thus, g(n) = Θ(f(n)).

  24. Proof of the Master Theorem (cont.) Lemma 4.4: let T(n) = Θ(1) if n = 1, and T(n) = aT(n/b) + f(n) if n = b^k for k ≥ 1, where a ≥ 1, b > 1, and f(n) is a nonnegative function. If f(n) = O(n^(logb(a)-ε)) for some ε > 0, then T(n) = Θ(n^(logb a)). If f(n) = Θ(n^(logb a)), then T(n) = Θ(n^(logb a) lg n). If f(n) = Ω(n^(logb(a)+ε)) for some ε > 0, and if a f(n/b) ≤ c f(n) for some c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

  25. Proof of Lemma 4.4: combine Lemmas 4.2 and 4.3. For case 1: T(n) = Θ(n^(logb a)) + O(n^(logb a)) = Θ(n^(logb a)). For case 2: T(n) = Θ(n^(logb a)) + Θ(n^(logb a) lg n) = Θ(n^(logb a) lg n). For case 3: T(n) = Θ(n^(logb a)) + Θ(f(n)) = Θ(f(n)), because f(n) = Ω(n^(logb(a)+ε)).

  26. Floors and Ceilings: T(n) = aT(⌊n/b⌋) + f(n) and T(n) = aT(⌈n/b⌉) + f(n). We want to prove both behave like T(n) = aT(n/b) + f(n). Two results: the master theorem applies to all integers n; floors and ceilings do not change the result. (Note: we proved this by domain transformation too.) Since ⌊n/b⌋ ≤ n/b and n/b ≤ ⌈n/b⌉, the upper bound for floors and the lower bound for ceilings hold immediately. So prove the upper bound for ceilings (the lower bound for floors is similar).

  27. Upper bound proof for T(n) = aT(⌈n/b⌉) + f(n): consider the sequence n, ⌈n/b⌉, ⌈⌈n/b⌉/b⌉, ⌈⌈⌈n/b⌉/b⌉/b⌉, … Let us define n_j as follows: n_j = n if j = 0, and n_j = ⌈n_{j-1}/b⌉ if j > 0. The sequence will be n_0, n_1, …, n_⌊logb n⌋. Draw the recursion tree:

  28. The proof of the upper bound for ceilings: T(n) = Θ(n^(logb a)) + Σ_{j=0}^{⌊logb n⌋-1} a^j f(n_j). Thus, similarly to Lemmas 4.3 and 4.4, the upper bound is proven.

  29. Where Are the Gaps? f(n) in case 3: at least polynomially larger than n^(logb a). -- gap between cases 3 and 2 -- f(n) in case 2: within constant factors of n^(logb a) (between c1 n^(logb a) and c2 n^(logb a)). -- gap between cases 1 and 2 -- f(n) in case 1: at least polynomially smaller than n^(logb a). Note: 1. for case 3, the regularity condition also must hold. 2. if f(n) is a lg n factor smaller, it falls in the gap between cases 1 and 2. 3. if f(n) is a lg n factor larger, it falls in the gap between cases 3 and 2. 4. if f(n) = Θ(n^(logb a) lg^k n), then T(n) = Θ(n^(logb a) lg^(k+1) n). (as exercise)

  30. A recursion tree for T(n) = T(n/3) + T(2n/3) + cn • # of levels * cost of each level = O(cn log3/2(n)) = O(n lg n) • Complication • if the tree were a complete binary tree, # of leaves = 2^(log3/2 n) = n^(log3/2 2) • but the tree is not complete • going down from the root, more and more internal nodes are absent • verify that O(n lg n) is an upper bound by the substitution method • Exmp1: T(n) = T(2n/3) + 1 (Case 2) • a = 1, b = 3/2, f(n) = 1 • n^(logb a) = n^(log3/2 1) = n^0 = 1, f(n) = 1 = Θ(1) • T(n) = Θ(lg n)

  31. Running time of the binary search algorithm given below. Algorithm Binary-Search-Rec(A[1…n], k, l, r). Input: a sorted array A of n comparable items, a search key k, and the leftmost and rightmost index positions in A. Output: the index of the array element equal to k, or -1 if k is not found.

def bsr(A, k, l, r):                    # l = start, r = end
    if l > r:
        return -1                       # k not in A
    middle = (l + r) // 2
    if A[middle] == k:
        return middle
    elif k < A[middle]:
        return bsr(A, k, l, middle - 1)
    else:                               # k > A[middle]
        return bsr(A, k, middle + 1, r)

You can also write binary search non-recursively -- simple. T(n) = T(n/2) + 1. Worst case: O(n) for sequential search on an unordered input and O(lg n) for binary search on an ordered one. Best case: O(1) for both. Recursive or iterative shouldn't make any difference.

  32. Provided extra file..

def stoogesort(L, i=0, j=None):
    if j is None:
        j = len(L) - 1
    if L[j] < L[i]:
        L[i], L[j] = L[j], L[i]         # swap endpoints if out of order
    if j - i > 1:
        t = (j - i + 1) // 3
        stoogesort(L, i, j - t)         # sort the first two-thirds
        stoogesort(L, i + t, j)         # sort the last two-thirds
        stoogesort(L, i, j - t)         # sort the first two-thirds again
    return L

  33. Recurrence Relations. In Liber Abaci, Fibonacci, the scholar from Pisa, studied simplified population-growth functions. At some stage, start with a population of 1 pair of rabbits. After a month, fertile rabbits produce a newborn pair. Happy they are: they never die and reproduce each month. f(n) = f(n-1) + f(n-2), starting from f(0) = f(1) = 1. Since f(n) ≥ 2·f(n-2), the sequence grows at least as fast as 2^(n/2) -- exponential growth. The Fibonacci recurrence is an example of a linear homogeneous recurrence relation: a_n = C1 a_{n-1} + C2 a_{n-2} + … + Cr a_{n-r}. A general solution of this involves a linear combination of solutions of the form a_n = α^n, and there is no driving term f(n). *** Substitute it: α^n = C1 α^(n-1) + C2 α^(n-2) + … + Cr α^(n-r). The characteristic equation is α^r = C1 α^(r-1) + C2 α^(r-2) + … + Cr. Assume no complex roots, with solutions α_1, α_2, α_3, …, α_r; then any linear combination is also a general solution: a_n = A1 α_1^n + A2 α_2^n + … + Ar α_r^n.

  34. Fibonacci: f_n = f_{n-1} + f_{n-2}, f_0 = f_1 = 1. *** Substitute α^n: α^n = α^(n-1) + α^(n-2), so α^2 - α - 1 = 0. The roots of this characteristic equation are α_{1,2} = (1 ± sqrt(5))/2. A_n = b((1+sqrt(5))/2)^n + d((1-sqrt(5))/2)^n. Initial values, 1 and 1: for n = 0, b r1^0 + d r2^0 = b + d = 1; for n = 1, A_1 = b((1+sqrt(5))/2) + d((1-sqrt(5))/2) = 1. Solving: b = (1/sqrt(5))((1+sqrt(5))/2), d = -(1/sqrt(5))((1-sqrt(5))/2). Substitute b, d back into the equation to get the specific solution: f_n = (1/sqrt(5))[((1+sqrt(5))/2)^(n+1) - ((1-sqrt(5))/2)^(n+1)].
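The closed form can be checked against the recurrence directly; a Python sketch using the slide's indexing f_0 = f_1 = 1 (rounding absorbs the floating-point error for moderate n):

```python
from math import sqrt

def fib_closed(n):
    # f_n = (1/sqrt(5)) * (phi^(n+1) - psi^(n+1)) with f_0 = f_1 = 1
    phi = (1 + sqrt(5)) / 2
    psi = (1 - sqrt(5)) / 2
    return round((phi ** (n + 1) - psi ** (n + 1)) / sqrt(5))

f = [1, 1]
for _ in range(40):
    f.append(f[-1] + f[-2])
for n in range(40):
    assert fib_closed(n) == f[n]
```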

  35. Homogeneous recurrences. First-Order Linear Homogeneous Recurrence Relations: a_n = C1 a_{n-1}; rewriting with the roots, the characteristic root is α = C1, giving A_n = A_0 (C1)^n. Example (compound interest): A_n = A_{n-1} + 0.3 A_{n-1}, so A_n = A_0 (1.3)^n. Example: A_n = 7A_{n-1}, A_0 = 5; hence α = 7 and a_n = 5·7^n. Second-Order Linear Homogeneous Recurrence Relations: a_n = C1 a_{n-1} + C2 a_{n-2}; the characteristic equation is quadratic -- solve for the two roots r1, r2 as usual, giving A_n = b r1^n + d r2^n. Initial values are needed to determine the constants b and d.


  37. Inhomogeneous Recurrence: a_n = c·a_{n-1} + f(n). The solution for the homogeneous side in this case is a_0 c^n. Suppose a particular solution for the inhomogeneous form is a_n* (that is, a_n* = c·a_{n-1}* + f(n)); then a_n = b·c^n + a_n* is a general solution, with the constant b fixed by the initial value.

  38. Quiz: evaluate T(n) for the functions given below. • T(n) = T(n/3) + T(2n/3) + cn, the deterministic select algorithm using the median. Solve the recurrence for the running time of the algorithm by drawing a recursion tree. Answer: the height of the tree is at least log3(n) and at most log3/2(n), and the sum of the costs in each level is n; hence T(n) = Θ(n lg n). • T(n) = 2T(n/2) + O(1). Answer: case 1 with ε = 1: Θ(n). • T(n) = 4T(n/2) + n^2. Answer: case 2: T(n) = Θ(n^2 log n). • What is the upper complexity of T(n) = k^n + log8(n·2^n) + (10n)^k? Answer: if k ≤ 1, O(n log8(n)); otherwise O(k^n). • T(n) = 3T(2⌈n/3⌉) + 1000: stooge sort. log3/2(3) = lg(3)/lg(3/2) = lg(3)/(lg(3) - 1) ≈ 2.71; equivalently, log3/2(3) = log3/2(2·1.5) = log3/2(2) + 1 ≈ 1.71 + 1. Case 1: T(n) = Θ(n^(log3/2 3)). What is the meaning of this? • M(n) = M(n-1) + 1, n > 0, M(0) = 0: the substitution method sums this to Θ(n). • A(2^k) = A(2^(k-1)) + 1: case 2, yields Θ(lg n). • Recursion on a d-dimensional mesh, d > 0: T(n) = T(n/2^d) + 47d^2 n^(1/d). The homogeneous side is n^(logb 1) = 1, so f(n) is polynomially larger: case 3, T(n) = Θ(47d^2 n^(1/d)) = Θ(d^2 n^(1/d)); no need to check regularity, since f(n) is a polynomial in n. • T(n) = T(n/2) + log n (PRAM mergesort). The homogeneous side is n^(log2 1) = 1, and for sufficiently large n, log n > 1, which suggests case 3; but the regularity check a f(n/b) ≤ c f(n) requires log(n/2) ≤ c log(n), i.e. (1 - c) lg(n) ≤ 1, which is not always possible to satisfy for large n with 0 < c < 1 (the regularity condition requires 0 < c < 1). Therefore one needs to use other methods: simply evaluate the general equation (the final equation obtained at the bottom of the iterative tree method), where the first part is Θ(1) and the second part sums to log^2(n): T(n) = Θ(n^(logb a)) + Σ_{j=0}^{logb(n)-1} a^j f(n/b^j) = Θ(1) + Σ_{j=0}^{lg(n)-1} lg(n/2^j) = Σ_{j=0}^{lg(n)-1} (lg(n) - j) = lg^2(n) - (lg(n) - 1)lg(n)/2 = lg(n)(lg(n) + 1)/2 = Θ(log^2 n).
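The PRAM-mergesort item admits an exact check on powers of two: with T(1) = 0 (my assumed base case), iterating T(n) = T(n/2) + lg n gives T(2^k) = k(k+1)/2, i.e. Θ(lg^2 n):

```python
def T(n):
    # T(n) = T(n/2) + lg n with T(1) = 0, for n a power of 2
    return 0 if n == 1 else T(n // 2) + (n.bit_length() - 1)

for k in range(1, 30):
    assert T(2 ** k) == k * (k + 1) // 2    # sum of lg n, lg n - 1, ..., 1
```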

  39. Wikipedia: inadmissible equations. The following equations cannot be solved using the master theorem:[2] • a is not a constant: e.g. a = 2^n gives n^(log2 a) = n^n -- the number of children grows exponentially with n • a non-polynomial difference between f(n) and n^(logb a), e.g. f(n) = n/lg(n) against n^(logb a) = n • a < 1: cannot have less than one subproblem • f(n) is not positive • case 3, but the regularity condition is violated.
