
Problem of the Day



  1. Problem of the Day • On the next slide I wrote today’s problem of the day. It has 3 possible answers. Can you guess which 1 of the following is the solution? • Answer 1 • Answers 1 or 2 • Answer 2 • Answers 2 or 3

  2. Problem of the Day • On the next slide I wrote today’s problem of the day. It has 3 possible answers. Can you guess which 1 of the following is the solution? • Answer 1 • Answers 1 or 2 • Answer 2 • Answers 2 or 3 • If answers 1 or 2 were correct, we would not be able to select exactly one solution. So, answer 3 (and selection D) must be right.

  3. CSC 212 – Data Structures Lecture 21: Big-Oh Complexity

  4. Analysis Techniques • Running time is critical, … • …but comparing running times is impossible in many cases • Single problem may have many ways to be solved • Many implementations possible for each solution

  5. Pseudo-Code • For human eyes only • Unimportant implementation details ignored • Serves a very real purpose, even if it is not real code • Useful for tasks like outlining, designing, & analyzing • Language-like but informal way to describe a system

  6. Pseudo-Code • Only needs to include details needed for tracing • Loops, assignments, calls to methods, etc. • Anything that would be helpful when analyzing the algorithm • Feel free to ignore punctuation & other formalisms • Understanding & analysis is the only goal of using this

  7. Pseudo-code Example
Algorithm factorial(int n)
  returnVariable = 1
  while (n > 0)
    returnVariable = returnVariable * n
    n = n - 1
  end while
  return returnVariable
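As a sanity check, the pseudocode above translates almost line-for-line into Java (the class name here is invented for illustration; variable names follow the pseudocode):

```java
public class FactorialExample {
    // Iterative factorial, mirroring the pseudocode line by line
    public static long factorial(int n) {
        long returnVariable = 1;
        while (n > 0) {
            returnVariable = returnVariable * n;
            n = n - 1;
        }
        return returnVariable;
    }
}
```

The loop runs n times with one multiplication per pass, which foreshadows the O(n) analysis later in the lecture.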

  8. “Anything that can go wrong…” • Expresses an algorithm’s complexity • Worst-case analysis of algorithm performance • Usually closely correlated with execution time • Not always right to consider only worst-case • May be situation where worst-case is very rare • Closely related approaches for other cases come later

  9. “Should I Even Bother?” • Compare algorithms using big-Oh notation • Could use to compare implementations, also • Saves time implementing all the algorithms • Biases like CPU, typing speed, cosmic rays ignored

  10. Algorithmic Analysis

  11. Algorithm Analysis • Execution time with n inputs on 4GHz machine:

  12. Big-Oh Notation • Want results for large data sets • Nobody cares about 2 minute-long program • Limit considerations to only major details • Ignore multipliers • So, O(⅛n) = O(5n) = O(50000n) = O(n) • Multipliers usually implementation-specific • How many 5ms can we fit into 4 minutes? • Ignore lesser terms • So, O(⅚n^5 + 23402n^2) = O(n^5 + n^2) = O(n^5) • Tolerate extra 17 minutes after waiting 3×10^13 years?

  13. What is n? • Big-Oh analysis always relative to input size • But determining input size is not always clear • Quick rules of thumb: • Need to consider what algorithm is processing • Analyze values below x: n = x • Analyze data in an array: n = size of array • Analyze linked list: n = size of linked list • Analyze 2 arrays: n = sum of array sizes

  14. Analyzing an Algorithm • Big-Oh counts primitive operations executed • Assignments • Calling a method • Performing arithmetic operation • Comparing two values • Getting entry from an array • Following a reference • Returning a value from a method • Accessing a field

  15. Primitive Statements • Basis of programming, take constant time: O(1) • Fastest possible big-Oh notation • Time to run sequence of primitive statements, too • But only if the input does not affect the sequence • Ignore constant multiplier: O(5) = O(5 * 1) = O(1)

  16. Simple Loops for (int i = 0; i < n; i++) { } -or- while (i < n) { i++; } • Each loop executes n times • Primitive statements only within body of loop • Big-Oh complexity of single loop iteration: O(1) • Either loop runs O(n) iterations • So loop has O(n) * O(1) = O(n) complexity total
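One way to convince yourself the body really executes n times is to count iterations directly (a throwaway sketch; the class and method names are invented for this example):

```java
public class LoopCount {
    // Returns how many times the O(n) loop body executes
    public static int countIterations(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            count++;  // one O(1) body per iteration
        }
        return count;
    }
}
```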

  17. Loops In a Row for (int i = 0; i < n; i++) { } int i = 0; while (i < n) { i++; } • Add complexities of sequences to compute total • For this example, total big-Oh complexity is: = O(n) + O(1) + O(n) = O(2 * n + 1) = O(n + 1)


  19. Loops In a Row for (int i = 0; i < n; i++) { } int i = 0; while (i < n) { i++; } • Add complexities of sequences to compute total • For this example, total big-Oh complexity is: = O(n) + O(1) + O(n) = O(2 * n + 1) = O(n)

  20. More Complicated Loops for (int i = 0; i < n; i += 2) { } i  0, 2, 4, 6, ..., n • In above example, loop executes n/2 iterations • Iterations take O(1) time, so total complexity: = O(n/2) * O(1) = O(n * ½ * 1) = O(n)
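Counting iterations again confirms the n/2 claim, and shows why the constant ½ drops out (names invented for this sketch):

```java
public class StepTwoLoop {
    // Counts iterations when the index advances by 2 each pass
    public static int countIterations(int n) {
        int count = 0;
        for (int i = 0; i < n; i += 2) {
            count++;
        }
        return count;  // roughly n/2; the 1/2 is a constant multiplier big-Oh ignores
    }
}
```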

  21. Really Complicated Loops for (int i = 1; i < n; i *= 2) { } i  1, 2, 4, 8, ..., n • In above code, loop executes log₂ n iterations • Iterations take O(1) time, so total complexity: = O(log₂ n) * O(1) = O(log₂ n * 1) = O(log₂ n)

  22. Really Complicated Loops for (int i = 1; i < n; i *= 3) { } i  1, 3, 9, 27, ..., n • In above code, loop executes log₃ n iterations • Iterations take O(1) time, so total complexity: = O(log₃ n) * O(1) = O(log₃ n * 1) = O(log₃ n)
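The two loops above differ only in the multiplier, so one parameterized sketch covers both (class and method names are invented here):

```java
public class MultiplyLoop {
    // Counts iterations when the index is multiplied by 'base' each pass;
    // runs about log_base(n) times, since i reaches n after that many doublings/triplings
    public static int countIterations(int n, int base) {
        int count = 0;
        for (int i = 1; i < n; i *= base) {
            count++;
        }
        return count;
    }
}
```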

  23. Math Moment • All logarithms are related, no matter the base • Changing base multiplies the answer by a constant • But ignore constant multiples using big-Oh notation • So can consider all O(log n) solutions identical

  24. Nested Loops for (int i = 0; i < n; i++) { for (int j = 0; j < n; j++) { } } • Program would execute outer loop n times • Inner loop runs n times each iteration of outer loop • O(n) iterations doing O(n) work each iteration • So loop has O(n) * O(n) = O(n^2) complexity total • Loops’ complexities multiply when nested
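Counting how often the innermost body runs makes the multiplication rule concrete (a sketch with invented names):

```java
public class NestedLoopCount {
    // Counts executions of the innermost body in the nested loops
    public static int countOperations(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                count++;  // runs n times per outer iteration
            }
        }
        return count;  // n * n total, i.e. O(n^2)
    }
}
```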

  25. Summary • Only care about approximations on huge data sets • Ignore constant multiples • Drop lesser terms (& n! > 2^n > n^5 > n^2 > n > log n > 1) • O(1) time for primitive statements to execute • Change by constant amount in loop: O(n) time • O(log n) time if multiply by constant in loop • Ignore constants: does not matter what constant is • When code is sequential, add their complexities • Complexities are multiplied when code is nested

  26. It’s About Time
Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for
• sneaky would take _____ time to execute • O(n) iterations for each loop in the method

  27. It’s About Time
Algorithm sneaky(int n)
  total = 0
  for i = 0 to n do
    for j = 0 to n do
      total += i * j
      return total
    end for
  end for
• sneaky would take O(1) time to execute • O(n) iterations for each loop in the method • But in first pass, method ends after return • Always executes same number of operations
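A Java version of sneaky makes the trap concrete: the return sits inside the inner loop, so it fires on the very first pass (a sketch assuming n ≥ 0; class name invented):

```java
public class SneakyExample {
    // The return inside the inner loop ends the method on its first pass,
    // so the nested loops contribute nothing: O(1), not O(n^2)
    public static int sneaky(int n) {
        int total = 0;
        for (int i = 0; i <= n; i++) {
            for (int j = 0; j <= n; j++) {
                total += i * j;
                return total;  // always reached with i = 0, j = 0
            }
        }
        return total;
    }
}
```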

  28. Big-Oh == Murphy’s Law
Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  end if
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp
• power takes O(n) time in most cases • Would only take O(1) if a & b are 0 • ____ algorithm overall

  29. Big-Oh == Murphy’s Law
Algorithm power(int a, int b ≥ 0)
  if a == 0 && b == 0 then
    return -1
  end if
  exp = 1
  repeat b times
    exp *= a
  end repeat
  return exp
• power takes O(n) time in most cases • Would only take O(1) if a & b are 0 • O(n) algorithm overall; big-Oh uses worst-case
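A direct Java rendering of power (class name invented; b assumed nonnegative as the pseudocode's header states):

```java
public class PowerExample {
    public static long power(int a, int b) {
        if (a == 0 && b == 0) {
            return -1;  // the one O(1) path: skips the loop entirely
        }
        long exp = 1;
        for (int k = 0; k < b; k++) {  // "repeat b times"
            exp *= a;
        }
        return exp;  // worst case runs the loop b times: O(n) with n = b
    }
}
```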

  30. How Big Am I?
Algorithm sum(int[][] a)
  total = 0
  for i = 0 to a.length do
    for j = 0 to a[i].length do
      total += a[i][j]
    end for
  end for
  return total
• Despite nested loops, this runs in O(n) time • Input is doubly-subscripted array for this method • For this method, n is the number of entries in the array
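In Java form (class name invented) the point stands out: each entry is visited exactly once, so with n defined as the total number of entries the method is O(n) even though the loops are nested:

```java
public class MatrixSum {
    // Visits each entry exactly once: O(n) where n = total entries,
    // despite the nested loop structure
    public static int sum(int[][] a) {
        int total = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a[i].length; j++) {
                total += a[i][j];
            }
        }
        return total;
    }
}
```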

  31. Handling Method Calls • Method call is O(1) operation, … • … but then also need to add time running the method • Big-Oh counts operations executed in total • Remember: there is no such thing as a free lunch • Borrowing $5 to pay does not make your lunch free • Similarly, need to include all operations executed • Which method they run in DOES NOT MATTER

  32. Methods Calling Methods
public static int sumOdds(int n) {
  int sum = 0;
  for (int i = 1; i <= n; i += 2) { sum += i; }
  return sum;
}
public static void oddSeries(int n) {
  for (int i = 1; i < n; i++) {
    System.out.println(i + " " + sumOdds(n));
  }
}
• oddSeries calls sumOdds n times • Each call does O(n) work, so takes O(n^2) total time!
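To see the O(n²) total, here is a hypothetical instrumented variant where each method reports how many loop iterations it performed instead of printing (all names invented for this sketch):

```java
public class WorkCount {
    // Iterations performed by sumOdds's loop: about n/2
    public static long sumOddsWork(int n) {
        long work = 0;
        for (int i = 1; i <= n; i += 2) {
            work++;
        }
        return work;
    }

    // oddSeries's loop runs n - 1 times, doing sumOdds's work each pass
    public static long oddSeriesWork(int n) {
        long work = 0;
        for (int i = 1; i < n; i++) {
            work += sumOddsWork(n);
        }
        return work;  // about (n - 1) * n/2, i.e. O(n^2)
    }
}
```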

  33. Justifying an Answer • Important to explain your answer • Saying O(n) not enough to make it O(n) • Methods using recursion especially hard to determine • Derive difficult answer using simple process


  35. Justifying an Answer • Important to explain your answer • Saying O(n) not enough to make it O(n) • Methods using recursion especially hard to determine • Derive difficult answer using simple process • May find that you can simplify big-Oh computation • Find smaller or larger big-Oh than imagined • Can be proof, but need not be that formal • Explaining your answer is critical for this • Helps you be able to convince others

  36. Big-Oh Notation
Algorithm factorial(int n)
  if n <= 1 then
    return 1
  else
    fact = factorial(n - 1)
    return n * fact
  end if
• Ignoring recursive calls’ cost, runs in O(1) time • At most n - 1 calls since n decreased by 1 each time • Method’s total complexity is O(n) • Runs O(n - 1) * O(1) = O(n - 1) = O(n) operations
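The recursive pseudocode maps directly to Java (class name invented for this sketch):

```java
public class RecursiveFactorial {
    // Each call does O(1) work; n decreases by 1 per call,
    // so there are at most n - 1 recursive calls: O(n) total
    public static long factorial(int n) {
        if (n <= 1) {
            return 1;
        }
        long fact = factorial(n - 1);
        return n * fact;
    }
}
```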

  37. Big-Oh Notation
Algorithm fib(int n)
  if n <= 1 then
    return n
  else
    return fib(n - 1) + fib(n - 2)
  end if
• O(1) time for each of O(2^n) calls = O(2^n) complexity • Calls fib(1), fib(0) when n = 2 • n = 3, total of 4 calls: 3 for fib(2) + 1 for fib(1) • n = 4, total of 8 calls: 5 for fib(3) + 3 for fib(2) • Number of calls roughly doubles when n incremented = O(2^n)
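Instrumenting fib with a call counter shows the roughly-doubling behavior (a sketch with invented names; unlike the slide's tallies, this counter also includes the top-level invocation):

```java
public class FibCalls {
    private static long calls;

    public static long fib(int n) {
        calls++;  // count every invocation
        if (n <= 1) {
            return n;
        }
        return fib(n - 1) + fib(n - 2);
    }

    // Total invocations needed to compute fib(n)
    public static long callCount(int n) {
        calls = 0;
        fib(n);
        return calls;
    }
}
```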

  38. Your Turn • Get into your groups and complete activity

  39. For Next Lecture • Read GT5.1 – 5.1.1, 5.1.4, 5.1.5 for Friday's class • What is an ADT and how are they defined? • How does a Stack work? • Also available is week #8 weekly assignment • Programming assignment #1 also on Angel • Pulls everything together and shows off your stuff • Better get moving on it, since due on Monday
