Complexity Analysis: Asymptotic Analysis
Nattee Niparnan
Recall • What is the measure of an algorithm? • How do we compare two algorithms? • The definition of asymptotic notation
Today's Topic • Finding the asymptotic upper bound of an algorithm
Interesting Topics of Upper Bounds • Rule of thumb! • We neglect • Lower-order terms in a sum • E.g., n^3 + n^2 = O(n^3) • Constant factors • E.g., 3n^3 = O(n^3) • Remember that we write = even though ∈ would be more correct
Why Discard Constants? • From the definition • We can use any constant c • E.g., 3n = O(n) • Because when we let c ≥ 3, the condition is satisfied (see the worked definition below)
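For reference, the definition being invoked, instantiated for 3n = O(n) (this is the standard definition from the earlier lecture; the particular choice c = 3, n_0 = 1 is ours):
\[
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0:\ 0 \le f(n) \le c \cdot g(n)\ \text{for all } n \ge n_0
\]
\[
f(n) = 3n,\ g(n) = n:\quad 3n \le 3 \cdot n\ \text{for all } n \ge 1,\ \text{so } 3n = O(n).
\]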
Why Discard Lower-Order Terms? • Consider • f(n) = n^3 + n^2 • g(n) = n^3 • To show f(n) = O(g(n)) • we need some c and n_0 such that • c·g(n) - f(n) ≥ 0 for all n ≥ n_0 • Any c > 1 will do
Why Discard Lower-Order Terms? • Try c = 1.1 • 1.1·g(n) - f(n) = 0.1n^3 - n^2 • Is 0.1n^3 - n^2 > 0? • 0.1n^3 - n^2 > 0 ⟺ 0.1n^3 > n^2 ⟺ 0.1n^3 / n^2 > 1 ⟺ 0.1n > 1 • So it holds whenever 0.1n > 1, e.g., for n > 10
Only Lower-Order Terms? • In fact, it is only the dominant term that counts • Which is the dominant term? • The one that grows faster • Why? • Eventually everything comes down to the ratio of the dominant term to the non-dominant term • If g(n) grows faster than f(n), the ratio g(n)/f(n) eventually exceeds any constant • E.g., lim g(n)/f(n) = ∞
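The same idea in limit form (a standard sufficient condition, added here as a side note rather than taken from the slide):
\[
\lim_{n\to\infty}\frac{f(n)}{g(n)} = L < \infty \;\Longrightarrow\; f(n) = O(g(n)),
\qquad\text{e.g. }\ \lim_{n\to\infty}\frac{n^3+n^2}{n^3} = 1,\ \text{so } n^3+n^2 = O(n^3).
\]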
Which Dominates Which? • In each pair of functions compared on the slide, the left-hand side dominates
Putting into Practice • What is the asymptotic class of • 0.5n^3 + n^4 - 5(n-3)(n-5) + n^3 log_8 n + 25 + n^1.5 • (n-5)(n^2 + 3) + log(n^20) • 20n^5 + 58n^4 + 15n^3.2 · 3n^2
Putting into Practice • What is the asymptotic class of • 0.5n^3 + n^4 - 5(n-3)(n-5) + n^3 log_8 n + 25 + n^1.5 → O(n^4) • (n-5)(n^2 + 3) + log(n^20) → O(n^3) • 20n^5 + 58n^4 + 15n^3.2 · 3n^2 → O(n^5.2), since 15n^3.2 · 3n^2 = 45n^5.2 dominates 20n^5
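As a worked check of the first expression (routine algebra, not shown on the slide):
\[
-5(n-3)(n-5) = \Theta(n^2), \qquad n^3\log_8 n = o(n^4), \qquad n^{1.5},\ 25 = o(n^4)
\]
\[
\Rightarrow\ 0.5n^3 + n^4 - 5(n-3)(n-5) + n^3\log_8 n + 25 + n^{1.5} = O(n^4).
\]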
Asymptotic Notation from Program Flow • Sequence • Conditions • Loops • Recursive Call
Sequence • Block A takes f(n), followed by Block B which takes g(n) • Total: f(n) + g(n) = O(max(f(n), g(n)))
Example • Block A: O(n), Block B: O(n^2) • The sequence is O(n^2)
Example • Block A: Θ(n), Block B: O(n^2) • The sequence is O(n^2)
Example • Block A: Θ(n), Block B: Θ(n^2) • The sequence is Θ(n^2)
Example • Block A: O(n^2), Block B: Θ(n^2) • The sequence is Θ(n^2)
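To make the sequence rule concrete, here is a minimal C sketch (illustrative code, not from the slides; the loops and variable names are made up): a Θ(n) block followed by a Θ(n^2) block costs Θ(n^2) overall.

#include <stdio.h>

int main(void) {
    int n = 1000;
    long sum = 0;

    /* Block A: single loop, Theta(n) */
    for (int i = 1; i <= n; i++)
        sum += i;

    /* Block B: nested loop, Theta(n^2) */
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            sum += j;

    /* Total: Theta(n) + Theta(n^2) = Theta(n^2), the max of the two */
    printf("%ld\n", sum);
    return 0;
}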
Conditions • The program takes Block A, which costs f(n), or Block B, which costs g(n) • Worst case over both branches: O(max(f(n), g(n)))
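A small C sketch of the condition rule (again illustrative; branch_cost and flag are hypothetical names): whichever branch is taken, the cost is bounded by the more expensive one.

long branch_cost(int flag, int n) {
    long sum = 0;
    if (flag) {
        /* Block A: Theta(n) */
        for (int i = 1; i <= n; i++)
            sum += i;
    } else {
        /* Block B: Theta(n^2) */
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++)
                sum += j;
    }
    /* Worst case over both branches: O(max(n, n^2)) = O(n^2) */
    return sum;
}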
Loops • for (i = 1; i <= n; i++) { P(i) } • Let P(i) take time t_i • The whole loop takes t_1 + t_2 + ... + t_n = Σ t_i
Example • for (i = 1; i <= n; i++) { sum += i; } • The body sum += i is Θ(1), so the loop is Θ(n)
Why don't we use max(t_i)? • Because the number of terms is not constant • for (i = 1; i <= n; i++) { sum += i; } is Θ(n) • for (i = 1; i <= 100000; i++) { sum += i; } is Θ(1), just with a big constant
Example • for (j = 1; j <= n; j++) { for (i = 1; i <= n; i++) { sum += i; } } • The body sum += i is Θ(1); the inner loop runs n times and the outer loop runs n times, so the total is Θ(n^2)
Example • for (j = 1; j <= n; j++) { for (i = 1; i <= j; i++) { sum += i; } } • The body sum += i is Θ(1); the inner loop runs j times, so the total is 1 + 2 + ... + n = n(n+1)/2 = Θ(n^2)
Example: Another Way • for (j = 1; j <= n; j++) { for (i = 1; i <= j; i++) { sum += i; } } • Upper bound: the inner loop never runs more than n times, so the total is O(n^2) • Lower bound: for the last n/2 values of j the inner loop runs at least n/2 times, so the total is Ω(n^2) • Together: Θ(n^2)
Example • for (j = 2; j <= n-1; j++) { for (i = 3; i <= j; i++) { sum += i; } } • The body sum += i is Θ(1); shifting the loop bounds by a constant does not change the order, so this is still Θ(n^2)
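For completeness, the sums behind the two triangular loops above (standard arithmetic):
\[
\sum_{j=1}^{n} j = \frac{n(n+1)}{2} = \Theta(n^2),
\qquad
\sum_{j=2}^{n-1}\ \sum_{i=3}^{j} 1 = \sum_{j=3}^{n-1}(j-2) = \frac{(n-3)(n-2)}{2} = \Theta(n^2).
\]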
Example: While Loops • while (n > 0) { n = n - 1; } • Θ(n)
Example: While Loops • while (n > 0) { n = n - 10; } • Θ(n/10) = Θ(n)
Example: While Loops • while (n > 0) { n = n / 2; } • n is halved on every iteration, so the loop runs about log_2 n times: Θ(log n)
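Why log n: after k iterations the value is roughly n / 2^k, and the loop stops once that drops below 1 (a one-line check, not spelled out on the slide):
\[
\frac{n}{2^k} < 1 \iff k > \log_2 n,
\qquad\text{so the number of iterations is } \lceil \log_2 n \rceil + O(1) = \Theta(\log n).
\]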
Example: Euclid's GCD
function gcd(a, b) {
  while (b > 0) {
    tmp = b
    b = a mod b
    a = tmp
  }
  return a
}
Example: Euclid's GCD • How many iterations? • Each iteration computes a mod b and swaps • The loop runs until the value we mod by (b) becomes zero
function gcd(a, b) {
  while (b > 0) {
    tmp = b
    b = a mod b
    a = tmp
  }
  return a
}
Example: Euclid's GCD • Key fact: if a > b, then a mod b < a / 2
Case 1: b > a / 2 • Then a mod b = a - b < a / 2
Case 2: b ≤ a / 2 • The remainder is always smaller than what we mod by (otherwise we could take out another b), so a mod b < b ≤ a / 2
Example: Euclid's GCD
function gcd(a, b) {
  while (b > 0) {
    tmp = b
    b = a mod b
    a = tmp
  }
  return a
}
By the key fact, b at least halves every two iterations, so the loop runs O(log n) times
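The same algorithm as a runnable C function (a direct rendering of the pseudocode above; using unsigned long so that % matches the mathematical mod is our own choice):

#include <stdio.h>

/* Iterative Euclid's GCD; O(log min(a, b)) iterations,
   since the second argument at least halves every two steps. */
unsigned long gcd(unsigned long a, unsigned long b) {
    while (b > 0) {
        unsigned long tmp = b;   /* remember b                  */
        b = a % b;               /* compute a mod b             */
        a = tmp;                 /* swap: new pair (b, a mod b) */
    }
    return a;
}

int main(void) {
    printf("%lu\n", gcd(252, 105));   /* prints 21 */
    return 0;
}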
Theorem • If Σ a_i < 1, then the recurrence T(n) = Σ T(a_i·n) + O(n) has solution T(n) = O(n) • Example: T(n) = T(0.7n) + T(0.2n) + T(0.01n) + 3n = O(n), since 0.7 + 0.2 + 0.01 < 1
Recursion
try(n) {
  if (n <= 0) return 0;
  for (j = 1; j <= n; j++)
    sum += j;
  try(n * 0.7);
  try(n * 0.2);
}
Recursion • The terminating test is Θ(1) • The for loop is Θ(n) of local work • The recursive calls contribute T(0.7n) + T(0.2n) • So T(n) = T(0.7n) + T(0.2n) + O(n)
try(n) {
  if (n <= 0) return 0;
  for (j = 1; j <= n; j++)
    sum += j;
  try(n * 0.7);
  try(n * 0.2);
}
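A runnable C rendering of try (hypothetical: the slide never says what try returns or how the recursive results are used, so combining them by addition is an assumption made only to keep the example self-contained):

long try_fn(long n) {
    if (n <= 0) return 0;             /* terminating case: Theta(1) */
    long sum = 0;
    for (long j = 1; j <= n; j++)     /* local work: Theta(n) */
        sum += j;
    /* two recursive calls: T(0.7n) + T(0.2n); summing the results is our assumption */
    return sum + try_fn((long)(n * 0.7)) + try_fn((long)(n * 0.2));
}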
Guessing and Proof by Induction • T(n) = T(0.7n) + T(0.2n) + O(n), say the O(n) term is at most d·n • Guess: T(n) = O(n), i.e., T(n) ≤ c·n for some constant c • Proof: • Basis: obvious (small n) • Induction: assume T(m) ≤ c·m for all m < n • Then T(n) ≤ 0.7cn + 0.2cn + dn = 0.9cn + dn • Choose c ≥ 10d; then 0.9cn + dn ≤ cn, so T(n) ≤ cn and T(n) = O(n)
Using a Recursion Tree • T(n) = 2T(n/2) + n • Level 0: one node costing n; level 1: two nodes costing n/2; level 2: four nodes costing n/4; and so on • Every level sums to n, and there are lg n levels • T(n) = O(n lg n)
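Summing the recursion tree level by level (the calculation behind the picture; floors and the base case are ignored as usual):
\[
T(n) = \sum_{i=0}^{\lg n} 2^i \cdot \frac{n}{2^i} = \sum_{i=0}^{\lg n} n = n(\lg n + 1) = O(n \lg n).
\]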
Master Method • T(n) = a·T(n/b) + f(n), with a ≥ 1, b > 1 • Let c = log_b a • Case 1: f(n) = O(n^(c-ε)) for some ε > 0 → T(n) = Θ(n^c) • Case 2: f(n) = Θ(n^c) → T(n) = Θ(n^c log n) • Case 3: f(n) = Ω(n^(c+ε)) for some ε > 0, and a·f(n/b) ≤ d·f(n) for some constant d < 1 (regularity condition) → T(n) = Θ(f(n))
Master Method: Example • T(n) = 9T(n/3) + n • a = 9, b = 3, c = log_3 9 = 2, n^c = n^2 • f(n) = n = O(n^(2 - 0.1)), so Case 1 applies • T(n) = Θ(n^c) = Θ(n^2)
Master Method: Example • T(n) = T(n/3) + 1 • a = 1, b = 3, c = log_3 1 = 0, n^c = 1 • f(n) = 1 = Θ(n^c) = Θ(1), so Case 2 applies • T(n) = Θ(n^c log n) = Θ(log n)
Master Method: Example • T(n) = 3T(n/4) + n log n • a = 3, b = 4, c = log_4 3 ≈ 0.79, n^c ≈ n^0.79 • f(n) = n log n = Ω(n^(c + ε)) for small ε > 0, so Case 3 may apply • Regularity: a·f(n/b) = 3·(n/4)·log(n/4) ≤ (3/4)·n log n = d·f(n) with d = 3/4 < 1 • T(n) = Θ(f(n)) = Θ(n log n)
Conclusion • Asymptotic bounds are, in fact, very simple • Use the rules of thumb • Discard non-dominant terms • Discard constant factors • For recursive algorithms • Set up the recurrence relation • Solve it with the master method, by guessing and proving by induction, or with a recursion tree