Algorithms and Data Structures Lecture 2
Agenda: • Algorithm Evaluation: continued • Asymptotic Efficiency • Recursive Functions, Recurrence Equations • Home Assignments
Algorithms Evaluation • T(n) = c, constant function • T(n) = cn, linear function • T(n) = c log(n), logarithmic function • T(n) = cn log(n) • T(n) = cn², quadratic function • T(n) = cn³, cubic function • T(n) = c·2ⁿ, exponential function
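The running-time families above can be sketched in Python (the lecture itself gives no code; the constant c is set to 1 purely for illustration):

```python
import math

# The common running-time families from the slide, with c = 1.
growth = {
    "constant":    lambda n: 1,
    "logarithmic": lambda n: math.log2(n),
    "linear":      lambda n: n,
    "n log n":     lambda n: n * math.log2(n),
    "quadratic":   lambda n: n ** 2,
    "cubic":       lambda n: n ** 3,
    "exponential": lambda n: 2 ** n,
}

# Print each T(n) at two input sizes to see how quickly they diverge.
for name, T in growth.items():
    print(f"{name:12s} T(10) = {T(10):10.1f}   T(20) = {T(20):12.1f}")
```

Even between n = 10 and n = 20 the exponential function pulls far ahead of the polynomial ones.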
Algorithms Evaluation • Example: how the running time grows as the input size grows, e.g. from n = k to n = k² • T(n) = c, constant function: does not depend on n • T(k²) = T(k) = c
Algorithms Evaluation • T(n) = c log(n), logarithmic function • T(k²) = c log(k²) = 2c log(k) • T(k) = c log(k) • T(k²)/T(k) = 2, always; does not depend on n • n=5: T(25)/T(5) = 2 • n=10: T(100)/T(10) = 2
Algorithms Evaluation • T(n) = cn, linear function • T(k²) = ck² • T(k) = ck • T(k²)/T(k) = k, depends on n • n=5: T(25)/T(5) = 5 • n=10: T(100)/T(10) = 10
Algorithms Evaluation • T(n) = cn log(n) • T(k²) = ck² log(k²) = 2ck² log(k) • T(k) = ck log(k) • T(k²)/T(k) = 2k, depends on n • n=5: T(25)/T(5) = 10 • n=10: T(100)/T(10) = 20
Algorithms Evaluation • T(n) = cn², quadratic function • T(k²) = c(k²)² • T(k) = ck² • T(k²)/T(k) = k², depends on n • n=5: T(25)/T(5) = 25 • n=10: T(100)/T(10) = 100
Algorithms Evaluation • T(n) = cn³, cubic function • T(k²) = c(k²)³ • T(k) = ck³ • T(k²)/T(k) = k³, depends on n • n=5: T(25)/T(5) = 125 • n=10: T(100)/T(10) = 1000
Algorithms Evaluation • T(n) = c·2ⁿ, exponential function • T(k²) = c·2^(k²) • T(k) = c·2ᵏ • T(k²)/T(k) = 2^(k²−k) = 2^(k(k−1)), depends on n • n=5: T(25)/T(5) = 2²⁰ = 1048576
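The ratios T(k²)/T(k) computed on the preceding slides can be verified numerically. A small Python sketch (not from the lecture; again c = 1):

```python
import math

# Ratio by which the running time grows when the input goes from k to k^2.
def ratio(T, k):
    return T(k * k) / T(k)

k = 5
print("logarithmic:", ratio(lambda n: math.log2(n), k))       # 2
print("linear:     ", ratio(lambda n: n, k))                  # k = 5
print("n log n:    ", ratio(lambda n: n * math.log2(n), k))   # 2k = 10
print("quadratic:  ", ratio(lambda n: n ** 2, k))             # k^2 = 25
print("cubic:      ", ratio(lambda n: n ** 3, k))             # k^3 = 125
print("exponential:", ratio(lambda n: 2 ** n, k))             # 2^(k(k-1)) = 2^20
```

The output matches the slide values: only the logarithmic ratio is independent of k, while the exponential ratio (2²⁰ = 1048576) explodes.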
Algorithms Evaluation • Example: two algorithms perform the same task • T1(n) = n, T2(n) = n² • T2(n)/T1(n) = n • n=5: T2(5)/T1(5) = 5 • n=100: T2(100)/T1(100) = 100
Asymptotic Efficiency • A more general notion than best-, worst-, or average-case running time (BCRT, WCRT, ACRT) • Asymptotic efficiency describes the growth of a function's running time as the input size tends to infinity (∞) • When an algorithm is evaluated in terms of asymptotic efficiency, it is always assumed that n→∞ • Asymptotic efficiency is denoted by the symbol Θ, pronounced [theta]
Asymptotic Efficiency • Each T(n) can be associated with a Θ function • Several T(n) functions belonging to the same family of functions share the same Θ function • E.g. T1(n) = a1n + b, T2(n) = a2n and T3(n) = n all belong to the family of linear functions and have the same asymptotic efficiency: T1(n) = Θ(n), T2(n) = Θ(n), T3(n) = Θ(n) • This does not mean that T1(n) = T2(n) = T3(n)
Asymptotic Efficiency • The exact mathematical meaning of f(n) = Θ(g(n)) is the following: there exist constants c1 > 0, c2 > 0 and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0 • Both f(n) and g(n) are assumed to be asymptotically nonnegative, i.e. nonnegative as n→∞ • If f(n) and g(n) are positive everywhere then n0 can be dropped from the definition • If f(n) = Θ(g(n)) we say that g(n) is an asymptotically tight bound for f(n) • If f(n) = Θ(g(n)) then g(n) = Θ(f(n))
Asymptotic Efficiency • If two algorithms solving the same task have asymptotic efficiencies Θ1 and Θ2 with Θ1 < Θ2, then the first algorithm is generally more efficient (for n→∞) • Example: T(n) = 0.5n² − 3n; let's verify that T(n) = Θ(n²), i.e. g(n) = n² • According to the definition of Θ we must find constants c1, c2 > 0 and a number n0 such that c1n² ≤ 0.5n² − 3n ≤ c2n² holds for all n ≥ n0
Asymptotic Efficiency • c1n² ≤ 0.5n² − 3n ≤ c2n² | ÷ n² • c1 ≤ 0.5 − 3/n ≤ c2 • From c1, c2 > 0 it follows that 0.5 − 3/n > 0 • Therefore n > 6, i.e. n ∈ [7, +∞) • Let's take n0 = 7 • Taking the minimal n = 7: c1 ≤ 0.5 − 3/7 = 1/14 • Let's take c1 = 1/14
Asymptotic Efficiency • As n→∞, lim(0.5 − 3/n) = 1/2, and 0.5 − 3/n < 1/2 for all n, so c2 = 1/2 suffices; let's take c2 = 1/2 • The constants c1, c2 and the value of n0 were successfully found, therefore T(n) = Θ(n²) • To prove that some Θ is not the asymptotic efficiency of T(n), it is enough to show that one of the conditions cannot be satisfied
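The constants found in this derivation can be sanity-checked numerically; a minimal Python sketch (checking a finite range of n as a stand-in for "all n ≥ n0"):

```python
# Verify c1*n^2 <= 0.5*n^2 - 3n <= c2*n^2 for the constants derived above:
# c1 = 1/14, c2 = 1/2, n0 = 7.
c1, c2, n0 = 1 / 14, 1 / 2, 7

ok = all(
    c1 * n ** 2 <= 0.5 * n ** 2 - 3 * n <= c2 * n ** 2
    for n in range(n0, 10_000)
)
print(ok)  # True
```

At n = 7 the lower bound is tight: (1/14)·49 = 3.5 = 0.5·49 − 21, which is why a smaller n0 would fail.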
Asymptotic Efficiency • When determining Θ, terms of lower order than the leading term are not taken into account • The constant of the leading term is not very important either; it only matters when choosing the constants c1 and c2 • For T(n) = an² + bn + c, where a, b and c are constants and a > 0, omitting the terms bn and c and the constant a gives g(n) = n², and therefore T(n) = Θ(n²)
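Why the lower-order terms can be dropped is easy to see numerically: the ratio (an² + bn + c)/n² approaches the leading constant a as n grows. A small sketch with arbitrary illustrative constants:

```python
# Arbitrary illustrative constants (not from the lecture).
a, b, c = 3, 100, 500

# The ratio T(n)/n^2 approaches a: the terms b*n and c become negligible.
for n in (10, 1_000, 100_000):
    print(n, (a * n ** 2 + b * n + c) / n ** 2)
```

For large n the printed ratio is essentially a = 3, so n² is the only part of T(n) that matters asymptotically.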
Asymptotic Efficiency • Θ(1) denotes the asymptotic efficiency of a function that remains bounded as its argument tends to infinity • T(n) = Θ(g(n)) expresses both an upper and a lower bound on growth • T(n) = O(g(n)) is an upper bound on growth (big-O) • T(n) = Ω(g(n)) is a lower bound on growth (big-omega)
Asymptotic Efficiency • We say that f(n) = O(g(n)) if there exist a constant c > 0 and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 • Both f(n) and g(n) are assumed to be asymptotically nonnegative, i.e. nonnegative as n→∞ • If f(n) and g(n) are positive everywhere then n0 can be dropped from the definition
Asymptotic Efficiency • We say that f(n) = Ω(g(n)) if there exist a constant c > 0 and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 • Both f(n) and g(n) are assumed to be asymptotically nonnegative, i.e. nonnegative as n→∞ • If f(n) and g(n) are positive everywhere then n0 can be dropped from the definition
Asymptotic Efficiency • Theorem 1: For any functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)) • Theorem 2: For any functions f(n) and g(n), the statements f(n) = O(g(n)) and g(n) = Ω(f(n)) are equivalent
Asymptotic Efficiency • If f(n) = Θ(n) => f(n) = O(n), f(n) = Ω(n), f(n) = O(n²), f(n) = O(n³) … • But f(n) ≠ Ω(n²), f(n) ≠ Ω(n³) … • If f(n) = Θ(n³) => f(n) = O(n³), f(n) = Ω(n³), f(n) = Ω(n²), f(n) = Ω(n) • But f(n) ≠ O(n²), f(n) ≠ O(n) …
Asymptotic Efficiency • f(n) = O(g(n)) means that the ratio f(n)/g(n) remains bounded as n→∞ • If lim(f(n)/g(n)) = 0 as n→∞ then f(n) = o(g(n)) (little-o) • f(n) = Ω(g(n)) means that the ratio g(n)/f(n) remains bounded as n→∞ • If lim(g(n)/f(n)) = 0 as n→∞ then f(n) = ω(g(n)) (little-omega)
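These limit characterizations can be probed numerically, using a large finite n as a stand-in for n→∞ (a sketch, not a proof):

```python
# Approximate lim f(n)/g(n) by evaluating the ratio at a large n.
def limit_ratio(f, g, n=10 ** 6):
    return f(n) / g(n)

# Ratio tends to 0  =>  n = o(n^2), hence also n = O(n^2).
print(limit_ratio(lambda n: n, lambda n: n ** 2))

# Ratio tends to the constant 3  =>  3n + 7 = Theta(n).
print(limit_ratio(lambda n: 3 * n + 7, lambda n: n))
```

A single large n cannot replace a limit in general, but it gives a quick heuristic for classifying simple functions.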
Asymptotic Efficiency: properties • Transitivity: • f(n) = Θ(g(n)) & g(n) = Θ(h(n)) => f(n) = Θ(h(n)) • f(n) = Ω(g(n)) & g(n) = Ω(h(n)) => f(n) = Ω(h(n)) • f(n) = O(g(n)) & g(n) = O(h(n)) => f(n) = O(h(n)) • f(n) = o(g(n)) & g(n) = o(h(n)) => f(n) = o(h(n)) • f(n) = ω(g(n)) & g(n) = ω(h(n)) => f(n) = ω(h(n))
Asymptotic Efficiency: properties • Reflexivity: f(n) = Θ(f(n)) => f(n) = Ω(f(n)), f(n) = O(f(n)) • Symmetry: f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n)) • Inversion: • f(n) = O(g(n)) ⇔ g(n) = Ω(f(n)) • f(n) = o(g(n)) ⇔ g(n) = ω(f(n))
Asymptotic Efficiency: properties • If c > 0 is a constant and f = O(g) => cf = O(g): constant factors do not play any role • If f = O(g) and h = O(g) => (f + h) = O(g) • nʳ = O(nˢ) for 0 ≤ r ≤ s • If p(n) is a polynomial of degree d, then p(n) = Θ(nᵈ) • f = O(g) & h = O(q) => f·h = O(g·q)
Asymptotic Efficiency: properties • nᵏ = O(bⁿ), if b > 1 and k ≥ 0 • log(n) = O(n) • log_b(n) = O(nᵏ), if b > 1 and k > 0 • log_b(n) = Θ(log_d(n)), if b, d > 1 • 1ʳ + 2ʳ + … + nʳ = Θ(n^(r+1))
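The last property can be checked empirically: the ratio (1ʳ + 2ʳ + … + nʳ) / n^(r+1) should approach the constant 1/(r+1), which is exactly what Θ(n^(r+1)) predicts. A small Python sketch:

```python
# Ratio of the power sum 1^r + 2^r + ... + n^r to n^(r+1).
# If the sum is Theta(n^(r+1)), this ratio approaches 1/(r+1).
def power_sum_ratio(r, n):
    return sum(i ** r for i in range(1, n + 1)) / n ** (r + 1)

for r in (1, 2, 3):
    print(r, power_sum_ratio(r, 10_000))  # ~1/2, ~1/3, ~1/4
```

For r = 1 this is just the familiar identity 1 + 2 + … + n = n(n+1)/2 ≈ n²/2.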