
Algorithm Complexity & Big-O Notation: From the Basics


Presentation Transcript


  1. Algorithm Complexity & Big-O Notation: From the Basics. CompSci Club, 29 May 2014

  2. History -- Number Theory • ←Edmund Landau (1877-1938) • German mathematician • Doctoral supervisor: Frobenius • Worked on Dirichlet series • Number theory – over 250 papers, a simplified proof of the Prime Number Theorem, and developments in the theory of algebraic number fields • *Asymptotic behavior of functions*; the O stands for "Order" Image source: http://www.ma.huji.ac.il/~landau/landau.jpg

  3. History -- Application to CS • Big-O notation is used to study the performance and complexity of algorithms in computer science • Execution time T(n) • Memory usage (plus hard drive, network use, etc.) • Performance – what are the actual values of these quantities for a given input? • Complexity – how does execution time change as the amount of data grows? • Amortized analysis – bounding the total cost of a worst-case sequence of operations and averaging it over the operations, expressed with big-O notation

  4. Definition & Notations, I • If there exist a number N and a constant c such that f(x) ≤ c*g(x) for all x > N, then we can write f(x) ∈ O(g(x)) • In algorithm analysis, x is usually the problem size (input size, list size), often itself written N • Different functions give different classes, for example: O(N³), O(a^N), O(log(N))
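
  A worked instance of the definition (my own example, not from the slide): take f(N) = 3N² + 5N and g(N) = N². For all N ≥ 1, 3N² + 5N ≤ 3N² + 5N² = 8N², so the definition is satisfied with c = 8 and threshold N = 1, and therefore 3N² + 5N ∈ O(N²).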

  5. Definitions & Notations, II • f(x) ∈ O(g(x)): f(x) ≤ c*g(x) (upper bound) • f(x) ∈ Θ(g(x)): c₁*g(x) ≤ f(x) ≤ c₂*g(x) (bounded both above and below) • f(x) ∈ Ω(g(x)): f(x) ≥ c*g(x) (lower bound) *Note: many texts write these with an 'equals' sign, e.g. f(x) = O(g(x)), but many mathematicians (myself included) find that notation inadequate: 'equals' suggests a symmetric relation, and these relations are not symmetric
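
  A quick illustration of why the 'equals' notation misleads (example added here, not on the slide): N ≤ 1*N² for all N ≥ 1, so N ∈ O(N²); but there is no constant c with N² ≤ c*N for all large N, so N² ∉ O(N) and N ∉ Θ(N²). The relation only goes one way, which the set-membership notation makes explicit.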

  6. For Those of You in Calc Class… • If we know that lim (x→∞) f(x)/g(x) = 0, then f(x) ∈ o(g(x)) • However! This is little-o notation, a stricter condition than Big-O notation • Equivalently: for every positive constant c there exists a number N such that f(x) < c*g(x) for all x > N
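
  A worked limit example (added for illustration): lim (x→∞) x/x² = lim (x→∞) 1/x = 0, so x ∈ o(x²). By contrast, lim (x→∞) 2x²/x² = 2 ≠ 0, so 2x² is in O(x²) but not in o(x²): little-o requires f to eventually fall below c*g for every positive c, not just for one.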

  7. Example Problems • You may notice that, in the coming examples, constant factors don't end up mattering very much. • The CS department at the University of Wisconsin-Madison describes how to evaluate complexity by summing the time of each statement: public void testComplexity () { statement1; statement2; … } • A fixed sequence of statements takes the same time regardless of the input size, so it is O(1)
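
  As a concrete sketch of an O(1) method (a hypothetical example of mine, not the slide's code; the class and method names are made up): the work below is a fixed number of statements no matter how long the input array is.

    public class ConstantTimeDemo {
        // Returns the first element of the array.
        // One array access and one return: the work does not grow
        // with data.length, so this method is O(1).
        static int firstElement(int[] data) {
            return data[0];
        }

        public static void main(String[] args) {
            int[] small = {7, 3, 9};
            int[] large = new int[1_000_000];
            large[0] = 42;
            // Both calls execute the same fixed number of statements.
            System.out.println(firstElement(small));  // 7
            System.out.println(firstElement(large));  // 42
        }
    }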

  8. Some More Complicated Examples • for-loop complexity: a single loop's running time is proportional to its upper bound, so a loop from 0 to N is O(N) • Nested for-loops, each starting at index 0: the total work is proportional to the product of the bounds, giving O(N*M), which is O(N²) if N = M
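
  A small sketch of the nested-loop case (illustrative code of mine; N and M are arbitrary): the counter confirms the loop body runs exactly N*M times.

    public class NestedLoopDemo {
        public static void main(String[] args) {
            int N = 40, M = 25;       // arbitrary sizes for illustration
            long iterations = 0;
            for (int i = 0; i < N; i++) {
                for (int j = 0; j < M; j++) {
                    iterations++;     // stands in for the loop body's statements
                }
            }
            // Prints "1000 vs 1000": the body runs N*M times, i.e. O(N*M).
            System.out.println(iterations + " vs " + (long) N * M);
        }
    }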

  9. Some More Complicated Examples

  for (int k = 0; k < N; k++) {
      for (int j = k; j < N; j++) {
          statements;
      }
  }

  The inner loop runs N, N-1, …, 2, 1 times as k increases, so the total number of iterations is 1 + 2 + 3 + … + N = N(N+1)/2 = (1/2)(N² + N) = O(N²)
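
  A quick way to check the N(N+1)/2 count empirically (my sketch, not part of the slide; the constant 100 is arbitrary):

    public class TriangularLoopDemo {
        public static void main(String[] args) {
            int N = 100;              // arbitrary size for illustration
            long iterations = 0;
            for (int k = 0; k < N; k++) {
                for (int j = k; j < N; j++) {
                    iterations++;     // counts executions of the inner body
                }
            }
            // Both lines print 5050: the total matches N*(N+1)/2, which is O(N^2).
            System.out.println(iterations);
            System.out.println((long) N * (N + 1) / 2);
        }
    }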

  10. Some Practice Problems What is the worst-case complexity of each of the following code fragments? Two loops in a row:

  for (i = 0; i < N; i++) {
      sequence of statements
  }
  for (j = 0; j < M; j++) {
      sequence of statements
  }

  How would the complexity change if the second loop went to N instead of M? http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html
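
  One possible analysis (the answer is not given on the slide): the two loops run one after the other, so the total work is proportional to N + M, i.e. O(N + M). If the second loop also ran to N, the total would be N + N = 2N, and dropping the constant factor gives O(N).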

  11. Some Practice Problems A nested loop followed by a non-nested loop:

  for (i = 0; i < N; i++) {
      for (j = 0; j < N; j++) {
          sequence of statements
      }
  }
  for (k = 0; k < N; k++) {
      sequence of statements
  }

  A nested loop in which the number of times the inner loop executes depends on the value of the outer loop index:

  for (i = 0; i < N; i++) {
      for (j = N; j > i; j--) {
          sequence of statements
      }
  }

  http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html
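
  A possible worked answer (not given on the slide): the first fragment does N² iterations in the nested loops plus N in the final loop, and N² + N = O(N²). In the second fragment the inner loop runs N, N-1, …, 1 times as i increases, the same triangular sum as in slide 9, so it is again N(N+1)/2 = O(N²).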

  12. When Does a Constant Matter? • Time functions can be studied with greater specificity when the differences in complexity are small • T1(N) = k*N; T2(N) = a*N^b, with b > 1 • Even though T1 grows more slowly, a large constant k can make it slower for small inputs; T1 becomes more efficient than T2 only once N passes a crossover point
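
  A numeric illustration with made-up constants (not from the slide): let k = 100, a = 1, b = 2, so T1(N) = 100N and T2(N) = N². For N < 100 the quadratic T2 is actually faster (e.g. T2(10) = 100 while T1(10) = 1000), but the curves cross at N = 100, and for every larger input the linear T1 wins. In general the crossover sits at N = (k/a)^(1/(b-1)).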

  13. Some Well-known Algorithms & Their Complexities • From Wikipedia: • Constant time: size of array = O(1) • Logarithmic: the Binary Search algorithm = O(log(N)) • Quadratic: Bubble Sort & Insertion Sort = O(N²) • …And these can be verified by hand (binary search, list size, insertion sort, etc.)
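
  A minimal sketch of the binary search mentioned above (standard textbook version, not taken from the slides; the class name and test data are my own): each step halves the remaining range, so a sorted array of N elements needs about log₂(N) steps, i.e. O(log(N)).

    import java.util.Arrays;

    public class BinarySearchDemo {
        // Returns the index of target in the sorted array a, or -1 if absent.
        static int binarySearch(int[] a, int target) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;   // midpoint without integer overflow
                if (a[mid] == target) {
                    return mid;
                } else if (a[mid] < target) {
                    lo = mid + 1;               // discard the lower half
                } else {
                    hi = mid - 1;               // discard the upper half
                }
            }
            return -1;
        }

        public static void main(String[] args) {
            int[] sorted = {2, 5, 8, 13, 21, 34, 55};
            System.out.println(binarySearch(sorted, 21));        // 4
            System.out.println(binarySearch(sorted, 6));         // -1
            // The standard-library equivalent gives the same index:
            System.out.println(Arrays.binarySearch(sorted, 21)); // 4
        }
    }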

  14. Sources • Cited:
  http://web.mit.edu/16.070/www/lecture/big_o.pdf
  http://pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html
  http://www-history.mcs.st-andrews.ac.uk/Biographies/Landau.html
  http://en.wikipedia.org/wiki/Time_complexity#Table_of_common_time_complexities
  http://mthcompsci.wordpress.com/
