Question of the Day A friend tells the truth when saying: "A road near my house runs directly north-south; I get on the road facing north, drive for a mile, & end up south of where I started." How does he do it? Note: He (anyone) doesn't live near the North Pole. Answer: Your friend drives in reverse!
Analysis Techniques • Running time is critical, … • …but comparing actual running times is impossible in many cases • A single problem may have many possible solutions • Many implementations are possible for each solution
Pseudo-Code • Written only for human eyes • Unimportant & implementation details are ignored • Serves a very real purpose, even though it is not real code • Useful for tasks like outlining, designing, & analyzing • Describes the system in a language-like, though not formal, manner
Pseudo-Code • Only needs to include the details needed for tracing • Loops, assignments, calls to methods, etc. • Anything that would be helpful in analyzing the algorithm • Understanding the algorithm is the only goal • Feel free to ignore punctuation & other formalisms • Understanding & analysis are the only goals of using this
Pseudo-Code Example
Algorithm factorial(int n)
    returnVariable = 1
    while (n > 0)
        returnVariable = returnVariable * n
        n = n - 1
    endwhile
    return returnVariable
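For comparison, here is a minimal Java sketch of the same algorithm (the method signature and the use of long are illustrative choices, not part of the slides):

    // Iterative factorial, mirroring the pseudo-code above.
    // long delays overflow, but the result still overflows past n = 20.
    public static long factorial(int n) {
        long returnVariable = 1;                  // accumulator, as in the pseudo-code
        while (n > 0) {
            returnVariable = returnVariable * n;  // multiply in the current value of n
            n = n - 1;                            // count down toward 0
        }
        return returnVariable;
    }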
“Anything that can go wrong…” • Expresses an algorithm’s complexity • Worst-case analysis of algorithm performance • Usually reasonably correlated with execution time • Not always right to consider only the worst case • There may be situations where the worst case is very rare • Could solve for other cases similarly, but this is almost never done
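As one illustration (not from the slides), linear search shows why the cases differ: the target may sit in the very first slot or may not be present at all.

    // Linear search: best case O(1) (target at index 0),
    // worst case O(n) (target absent or in the last slot).
    // Worst-case analysis reports the O(n) bound.
    public static int linearSearch(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return i;   // best case: found right away
            }
        }
        return -1;          // worst case: every entry was examined
    }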
“Should I Even Bother?” • Compare algorithms using big-Oh notation • Could also use it to compare implementations • Saves the time of implementing all the algorithms • Biases like CPU speed, typing speed, & cosmic rays are ignored
Algorithm Analysis • Execution time with n inputs on a 4 GHz machine:
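The original slide's timing table is not reproduced here. As a rough, hedged sketch, similar estimates can be recomputed by assuming (simplistically) one primitive operation per clock cycle at 4 GHz (4 × 10⁹ operations per second); the class name and the choice n = 1000 are illustrative:

    // Rough running-time estimates for several growth rates,
    // assuming one primitive operation per cycle on a 4 GHz machine.
    public class GrowthRateDemo {
        public static void main(String[] args) {
            double opsPerSecond = 4e9;   // 4 GHz assumption
            int n = 1000;                // example input size
            System.out.printf("n       : %e s%n", n / opsPerSecond);
            System.out.printf("n log n : %e s%n", n * (Math.log(n) / Math.log(2)) / opsPerSecond);
            System.out.printf("n^2     : %e s%n", Math.pow(n, 2) / opsPerSecond);
            System.out.printf("2^n     : %e s%n", Math.pow(2, n) / opsPerSecond); // astronomically large
        }
    }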
Big-Oh Notation • Want results for large data sets • Nobody cares about 2 minutes; too small to matter • For this process, only the really major details are considered • Ignore multipliers • So O(⅛n) = O(5n) = O(50000n) = O(n) • Multipliers are usually implementation-specific • Going from the Sun to Mercury vs. going to Minneapolis • Ignore lesser terms • So O(⅚n⁵ + 23402n²) = O(n⁵ + n²) = O(n⁵) • A job 300x older than the universe vs. 17 minutes?
What is n? • Big-Oh analysis is always relative to input size • But determining input size is not always clear • Quick rules of thumb: consider what the algorithm is processing • Analyzing values below x: n = x • Analyzing data in an array: n = size of the array • Analyzing 2 arrays: n = sum of the array sizes
Analyzing an Algorithm • Big-Oh analysis counts the primitive operations executed • Assignments • Calling a method (but NOT executing the method's body) • Performing an arithmetic operation • Comparing two values • Getting an entry from an array • Following a reference • Returning a value from a method • Accessing a field
Primitive Statements • The basis of programming; each takes constant time: O(1) • Fastest possible big-Oh notation • Time to run a sequence of primitive statements is also O(1) • But only if the input does not affect the sequence • Ignore the constant multiplier: O(5) = O(5 * 1) = O(1)
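A hedged sketch annotating the operations from the last two slides (the method name and parameters are made up for illustration); every line is a primitive operation or a fixed handful of them, so the whole straight-line sequence is O(1):

    // Each statement costs constant time, so the sequence is O(1) overall.
    public static int primitiveDemo(int[] values, int x) {
        int a = 5;               // assignment
        int b = a + x;           // arithmetic operation + assignment
        boolean bigger = b > a;  // comparing two values
        int first = values[0];   // getting an entry from an array
        int len = values.length; // following a reference & accessing a field
        return first + len;      // arithmetic + returning a value from a method
    }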
Simple Loops for (int i = 0; i < n.length; i++) { } -or- while (i < n) { i++; } • Each loop executes n times • Primitive statements appear only within the body of the loop • Big-Oh complexity of a single loop iteration: O(1) • Either loop runs O(n) iterations • So each loop has O(n) * O(1) = O(n) complexity total
Loops In a Row for (int i = 0; i < n.length; i++) { } int i = 0; while (i < n) { i++; } • Add the complexities of sequential code to compute the total • For this example, the total big-Oh complexity is: O(n) + O(1) + O(n) = O(2 * n + 1) = O(n)
More Complicated Loops for (int i = 0; i < n; i += 2) { } • i takes the values 0, 2, 4, 6, ..., n • In the above example, the loop executes n/2 iterations • Each iteration takes O(1) time, so the total complexity is: O(n/2) * O(1) = O(n * ½ * 1) = O(n)
Really Complicated Loops for (int i = 1; i < n; i *= 2) { } • i takes the values 1, 2, 4, 8, ..., n • In the above code, the loop executes log₂ n iterations • Each iteration takes O(1) time, so the total complexity is: O(log₂ n) * O(1) = O(log₂ n * 1) = O(log₂ n)
Really Complicated Loops for (int i = 1; i < n; i *= 3) { } • i takes the values 1, 3, 9, 27, ..., n • In the above code, the loop executes log₃ n iterations • Each iteration takes O(1) time, so the total complexity is: O(log₃ n) * O(1) = O(log₃ n * 1) = O(log₃ n)
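A small demonstration (illustrative, not from the slides) that counts iterations for the three loop shapes on the last few slides; with n = 1024 it prints roughly n/2, log₂ n, and log₃ n:

    public class LoopCountDemo {
        public static void main(String[] args) {
            int n = 1024;

            int byTwo = 0;                               // i += 2  ->  about n/2 iterations
            for (int i = 0; i < n; i += 2) { byTwo++; }

            int doubling = 0;                            // i *= 2  ->  about log2(n) iterations
            for (int i = 1; i < n; i *= 2) { doubling++; }

            int tripling = 0;                            // i *= 3  ->  about log3(n) iterations
            for (int i = 1; i < n; i *= 3) { tripling++; }

            System.out.println(byTwo + " " + doubling + " " + tripling);  // prints 512 10 7
        }
    }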
Math Moment • All logarithms are related, no matter the base • Changing base only changes the answer by a constant multiple: log₂ n = (log₂ 3) · log₃ n • But big-Oh notation ignores constant multiples • So we can consider all O(log n) solutions identical
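A quick check of that change-of-base fact (the class name and choice of n are made up for illustration); the ratio of log₂ n to log₃ n is the constant log₂ 3 ≈ 1.585 no matter which n is used:

    public class LogBaseDemo {
        public static void main(String[] args) {
            double n = 1_000_000.0;
            double log2n = Math.log(n) / Math.log(2);  // change of base from the natural log
            double log3n = Math.log(n) / Math.log(3);
            // Prints ~1.585 = log2(3) for any n: a constant multiple big-Oh ignores.
            System.out.println(log2n / log3n);
        }
    }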
Nested Loops for (int i = 0; i < n; i++) { for (int j = 0; j < n; j++) { } } • The program executes the outer loop n times • The inner loop runs n times during each iteration of the outer loop • O(n) iterations doing O(n) work each iteration • So the loops have O(n) * O(n) = O(n²) complexity total • Loop complexities multiply when nested
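Counting iterations confirms the multiplication (an illustrative sketch; names and the choice n = 100 are not from the slides):

    public class NestedLoopDemo {
        public static void main(String[] args) {
            int n = 100;
            int bodyRuns = 0;
            for (int i = 0; i < n; i++) {          // outer loop: n iterations
                for (int j = 0; j < n; j++) {      // inner loop: n iterations per outer pass
                    bodyRuns++;
                }
            }
            System.out.println(bodyRuns);          // prints 10000 = n * n
        }
    }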
+ • Only care about approximate behavior on huge data sets • Ignore constant multiples • Drop lesser terms (& n! > 2ⁿ > n⁵ > n² > n > log n > 1) • O(1) time for primitive statements to execute • Changing by a constant amount in a loop: O(n) time • O(log n) time if multiplying by a constant in a loop • Ignore constants: it does not matter what the constant is • When code is sequential, add the complexities • Complexities are multiplied when code is nested
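Putting the rules together on a made-up method (the name and body are illustrative only): sequential pieces add, nested loops multiply, and lesser terms drop away.

    // Analysis using the rules above:
    //   setup:        O(1)                  (primitive statements)
    //   first loop:   O(n)                  (constant-step loop)
    //   nested loops: O(n) * O(n) = O(n^2)  (nesting multiplies)
    //   total:        O(1) + O(n) + O(n^2) = O(n^2)  (drop lesser terms)
    public static int exampleAnalysis(int[] data) {
        int total = 0;                              // O(1)
        for (int i = 0; i < data.length; i++) {     // O(n)
            total += data[i];
        }
        int pairs = 0;
        for (int i = 0; i < data.length; i++) {     // O(n) outer ...
            for (int j = 0; j < data.length; j++) { // ... times O(n) inner = O(n^2)
                pairs++;
            }
        }
        return total + pairs;                       // O(1)
    }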
Your Turn • Get into your groups and complete the activity
For Next Lecture • Read 2.4 for class on Wednesday • How do we go about proving big-Oh calculations? • Week #5 weekly assignment due Tuesday • Get started soon; I will be leaving at 12:30 today • Midterm #1 in class on Friday; start studying • Think of questions to ask; we may have review time Wednesday