
Big-Oh Notation and Asymptotic Analysis in Algorithm Analysis

Learn about the Big-Oh notation and asymptotic analysis in algorithm analysis, including the major notations and their definitions, properties, and examples. Understand how to classify algorithms based on their growth rates and justify the use of Big-Oh notation.


Presentation Transcript


1. Department of Computer and Information Science, School of Science, IUPUI
   CSCI 240 Analysis of Algorithms: Big-Oh
   Dale Roberts, Lecturer, Computer Science, IUPUI
   E-mail: droberts@cs.iupui.edu

2. Asymptotic Analysis
   • Ignoring constants in T(n)
   • Analyzing T(n) as n "gets large"
   • Example: the big-Oh (O) notation

3. 3 Major Notations
   • O(g(n)), Big-Oh of g of n: the Asymptotic Upper Bound.
   • Ω(g(n)), Big-Omega of g of n: the Asymptotic Lower Bound.
   • Θ(g(n)), Big-Theta of g of n: the Asymptotic Tight Bound.
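For reference, these are the standard formal definitions behind the three notations (the slide only names them; the LaTeX below is the conventional statement, not taken from the deck):

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c, n_0 > 0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c, n_0 > 0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
\end{align*}
```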

4. Big-Oh Defined
   • The O symbol was introduced in 1927 to indicate the relative growth of two functions based on their asymptotic behavior; it is now used to classify functions and families of functions.
   • T(n) = O(f(n)) if there are constants c and n₀ such that T(n) ≤ c·f(n) when n ≥ n₀.
   • c·f(n) is an upper bound for T(n).
   [Figure: c·f(n) lying above T(n) for all n ≥ n₀.]

5. Big-Oh
   • Describes an upper bound for the running time of an algorithm.
   • Upper bounds for Insertion Sort running times:
     • worst case: O(n²), with T(n) = c₁n² + c₂n + c₃
     • best case: O(n), with T(n) = c₁n + c₂
   [Figure: time-complexity plot.]
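Since the slide only quotes the cost expressions, here is a minimal Insertion Sort sketch in Python (an illustration, not the course's reference implementation) showing where the two cases come from:

```python
def insertion_sort(a):
    """Sort list a in place.

    Worst case (reverse-sorted input): the while loop runs i times for
    each i, giving T(n) = c1*n^2 + c2*n + c3, i.e. O(n^2).
    Best case (already-sorted input): the while loop exits immediately,
    giving T(n) = c1*n + c2, i.e. O(n).
    """
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key


data = [5, 2, 4, 6, 1, 3]
insertion_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]
```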

6. Big-O Notation
   • We say Insertion Sort's run time is O(n²).
   • Properly, we should say its run time is in O(n²).
   • Read O as "Big-Oh" (you'll also hear it called "order").
   • In general, a function f(n) is O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
   • E.g., if f(n) = 1000n and g(n) = n², take n₀ = 1000 and c = 1; then f(n) ≤ 1·g(n) for all n ≥ n₀, and we say that f(n) = O(g(n)).
   • The O notation indicates "bounded above by a constant multiple of."
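The witness constants from the bullet above can be sanity-checked numerically; a quick sketch (the range limit 100,000 is an arbitrary choice for the demo):

```python
# Claim: f(n) = 1000n is O(n^2), witnessed by c = 1 and n0 = 1000.
f = lambda n: 1000 * n
g = lambda n: n * n

c, n0 = 1, 1000
# f(n) <= c * g(n) holds for every n we test at or beyond n0.
assert all(f(n) <= c * g(n) for n in range(n0, 100_000))
print("bound holds on the tested range")
```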

7. Big-Oh Properties
   • The fastest-growing function dominates a sum: O(f(n) + g(n)) is O(max{f(n), g(n)}).
   • The product of upper bounds is an upper bound for the product: if f is O(g) and h is O(r), then fh is O(gr).
   • "f is O(g)" is transitive: if f is O(g) and g is O(h), then f is O(h).
   • Hierarchy of functions: O(1), O(log n), O(n^(1/2)), O(n log n), O(n²), O(2ⁿ), O(n!) (tabulated in the sketch below).
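A small sketch that tabulates the hierarchy at a few values of n, making the ordering visible (the function list is taken from the bullet above; the sample sizes are arbitrary):

```python
import math

growth = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n^(1/2)", lambda n: math.sqrt(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
    ("n!",      lambda n: math.factorial(n)),
]

for n in (10, 20, 30):
    print(f"n = {n}")
    for name, fn in growth:
        # Each function eventually grows faster than the one before it.
        print(f"  {name:8s} -> {fn(n):,.0f}")
```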

8. Some Big-Oh's Are Not Reasonable
   • Polynomial-time algorithms:
     • An algorithm is said to be polynomial if it is O(nᶜ), c > 1.
     • Polynomial algorithms are said to be reasonable: they solve problems in reasonable times!
     • Coefficients, constants, and low-order terms are ignored; e.g., if f(n) = 2n², then f(n) = O(n²).
   • Exponential-time algorithms:
     • An algorithm is said to be exponential if it is O(rⁿ), r > 1.
     • Exponential algorithms are said to be unreasonable (see the sketch below).
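To put numbers on "reasonable" vs. "unreasonable", the sketch below counts abstract steps at n = 100 and converts them to time under an assumed rate of 10⁹ steps per second (both the input size and the rate are illustrative assumptions, not from the slides):

```python
n = 100
poly_steps = n ** 2   # polynomial O(n^2): 10,000 steps
expo_steps = 2 ** n   # exponential O(2^n): about 1.27e30 steps

RATE = 10 ** 9                # assumed: one billion steps per second
SECONDS_PER_YEAR = 3.15e7

print(poly_steps / RATE, "seconds")                   # ~1e-5 seconds
print(expo_steps / RATE / SECONDS_PER_YEAR, "years")  # ~4e13 years
```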

9. Can We Justify Big-O Notation?
   • Big-O notation is a huge simplification; can we justify it?
   • It only makes sense for large problem sizes.
   • For sufficiently large problem sizes, the highest-order term swamps all the rest!
   • Consider R = x² + 3x + 5 as x varies:

     x          x²              3x       5   R
     0          0               0        5   5
     10         100             30       5   135
     100        10,000          300      5   10,305
     1,000      1,000,000       3,000    5   1,003,005
     10,000     100,000,000     30,000   5   100,030,005
     100,000    10,000,000,000  300,000  5   10,000,300,005
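The table is easy to regenerate, and printing the ratio R/x² shows the low-order terms fading toward irrelevance (a quick sketch):

```python
for x in (10, 100, 1_000, 10_000, 100_000):
    R = x**2 + 3*x + 5
    # As x grows, R/x^2 approaches 1: the x^2 term swamps the rest.
    print(f"x = {x:>7,}  R = {R:>14,}  R/x^2 = {R / x**2:.5f}")
```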

10. Classifying Algorithms Based on Big-Oh
   • A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n).
   • A function f(n) is said to be of at most quadratic growth if f(n) = O(n²).
   • A function f(n) is said to be of at most polynomial growth if f(n) = O(nᵏ) for some natural number k > 1.
   • A function f(n) is said to be of at most exponential growth if there is a constant c > 1 such that f(n) = O(cⁿ).
   • A function f(n) is said to be of at most factorial growth if f(n) = O(n!).
   • A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable); the equation for this algorithm is f(n) = c.
   • Other logarithmic classifications: f(n) = O(n log n) and f(n) = O(log log n).

11. Rules for Calculating Big-Oh
   • Base of logs ignored: log_a n = O(log_b n) (derivation below).
   • Powers inside logs ignored: log(n²) = O(log n).
   • Bases and powers in exponents are not ignored: 3ⁿ is not O(2ⁿ), and a^(n²) is not O(aⁿ).
   • If T(x) is a polynomial of degree n, then T(x) = O(xⁿ).
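The first rule follows from the change-of-base identity; a one-line derivation (standard algebra, not spelled out on the slide):

```latex
\log_a n \;=\; \frac{\log_b n}{\log_b a}
        \;=\; \underbrace{\frac{1}{\log_b a}}_{\text{a constant}} \cdot \log_b n
\quad\Longrightarrow\quad \log_a n = O(\log_b n).
```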

12. Big-Oh Examples
   1. 2n³ + 3n² + n = 2n³ + 3n² + O(n) = 2n³ + O(n² + n) = 2n³ + O(n²) = O(n³) = O(n⁴)
      (the final step to O(n⁴) is still valid, just a looser bound)
   2. 2n³ + 3n² + n = 2n³ + 3n² + O(n) = 2n³ + O(n² + n) = 2n³ + O(n²) = O(n³)

13. Big-Oh Examples (cont.)
   3. Suppose a program P is O(n³) and a program Q is O(3ⁿ), and that currently both can solve problems of size 50 in 1 hour. If the programs are run on another system that executes exactly 729 times as fast as the original system, what size problems will they be able to solve?

14. Big-Oh Examples (cont.)
   • For P: n³ = 50³ · 729 = 50³ · 9³, so n = 50 · 9 = 450.
   • For Q: 3ⁿ = 3⁵⁰ · 729, so n = log₃(729 · 3⁵⁰) = log₃ 729 + 50 = 6 + 50 = 56.
   • Improvement: the problem size increased 9-fold for the n³ algorithm, but only slightly (+6) for the exponential algorithm.
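Both answers can be verified numerically; a sketch (floating-point cube roots are close enough here that rounding recovers the exact values):

```python
import math

speedup = 729

# Cubic program P: the new size n satisfies n^3 = 50^3 * speedup.
n_cubic = 50 * speedup ** (1 / 3)    # 50 * 9 = 450

# Exponential program Q: the new size n satisfies 3^n = 3^50 * speedup.
n_expo = 50 + math.log(speedup, 3)   # 50 + 6 = 56

print(round(n_cubic), round(n_expo))  # 450 56
```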

15. Acknowledgements
   • Philadelphia University, Jordan
   • Nilagupta, Pradondet
