Learn about analyzing time efficiency in algorithms through theoretical and empirical approaches. Understand best-case, worst-case, and average-case scenarios and the order of growth. Explore big-oh, big-omega, and big-theta notation for asymptotic growth rates.
Chapter 2 Analysis Framework 2.1 – 2.2
Homework • Remember: questions due Wed. • Read sections 2.1 and 2.2, pages 41-50 and 52-59
Agenda: Analysis of Algorithms • Blackboard • Issues: • Correctness • Time efficiency • Space efficiency • Optimality • Approaches: • Theoretical analysis • Empirical analysis
Time Efficiency Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of input size
Theoretical analysis of time efficiency • Basic operation: the operation that contributes most towards the running time of the algorithm • T(n) ≈ c_op·C(n), where n is the input size, T(n) the running time, c_op the execution time of the basic operation, and C(n) the number of times the basic operation is executed
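The formula T(n) ≈ c_op·C(n) can be turned into a quick back-of-the-envelope estimate. A minimal C sketch, assuming an illustrative per-operation cost in nanoseconds and the count C(n) = n(n-1)/2; the function name is mine, not the slides':

```c
/* Estimate running time via T(n) ~ c_op * C(n).
   c_op_ns: assumed cost of one basic operation in nanoseconds.
   C(n) = n(n-1)/2, e.g. the comparison count of a quadratic algorithm. */
double estimate_time_ns(double c_op_ns, long n) {
    double C = (double)n * (n - 1) / 2.0;  /* basic-operation count */
    return c_op_ns * C;
}
```

Since the constant c_op cancels in ratios, comparing estimates for n and 2n shows the order of growth: the count roughly quadruples.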
Empirical analysis of time efficiency • Select a specific (typical) sample of inputs • Use physical unit of time (e.g., milliseconds) OR • Count actual number of basic operations • Analyze the empirical data
Best-case, average-case, worst-case For some algorithms efficiency depends on type of input: • Worst case: W(n) – maximum over inputs of size n • Best case: B(n) – minimum over inputs of size n • Average case: A(n) – “average” over inputs of size n • Number of times the basic operation will be executed on typical input • NOT the average of the worst and best cases • The expected number of basic-operation repetitions, treated as a random variable under some assumption about the probability distribution of all possible inputs of size n
Example: Sequential search • Problem: Given a list of n elements and a search key K, find an element equal to K, if any. • Algorithm: Scan the list and compare its successive elements with K until either a matching element is found (successful search) or the list is exhausted (unsuccessful search) • Worst case • Best case • Average case
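The sequential search above can be sketched in C (the function name seq_search is my own). The basic operation is the comparison a[i] == key: 1 comparison in the best case, n in the worst:

```c
/* Sequential search: return the index of the first element equal to key,
   or -1 if the list is exhausted (unsuccessful search).
   Basic operation: the comparison a[i] == key.
   Best case: 1 comparison (key is first). Worst case: n comparisons. */
int seq_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;           /* successful search */
    return -1;                  /* unsuccessful search */
}
```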
Types of formulas for basic operation count • Exact formula e.g., C(n) = n(n-1)/2 • Formula indicating order of growth with specific multiplicative constant e.g., C(n) ≈ 0.5n² • Formula indicating order of growth with unknown multiplicative constant e.g., C(n) ≈ c·n²
Order of growth • Most important: Order of growth within a constant multiple as n→∞ • Example: • How much faster will algorithm run on computer that is twice as fast? • How much longer does it take to solve problem of double input size? • See table 2.1
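The "double the input size" question can be checked concretely: for a quadratic algorithm with exact count C(n) = n², doubling n multiplies the operation count by 4. A small C sketch (the helper name is mine):

```c
/* Count basic-operation executions of a doubly nested loop:
   C(n) = n * n, so C(2n) / C(n) = 4. */
long quadratic_count(long n) {
    long count = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < n; j++)
            count++;            /* one basic operation */
    return count;
}
```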
Day 4: Agenda • O, Θ, Ω • Limits • Definitions • Examples • Code O(?)
Homework • Due on Monday 11:59PM • Electronic submission see website. • Try to log into Blackboard • Finish reading 2.1 and 2.2 • page 60-61, questions 2, 3, 4, 5, & 9
Asymptotic growth rate A way of comparing functions that ignores constant factors and small input sizes • O(g(n)): class of functions t (n) that grow no faster than g(n) • Θ(g(n)): class of functions t (n) that grow at same rate as g(n) • Ω(g(n)): class of functions t (n) that grow at least as fast as g(n) see figures 2.1, 2.2, 2.3
Using Limits • lim n→∞ t(n)/g(n) = 0: t(n) grows slower than g(n) • lim n→∞ t(n)/g(n) = c > 0: t(n) grows at the same rate as g(n) • lim n→∞ t(n)/g(n) = ∞: t(n) grows faster than g(n)
Examples: • 10n vs. 2n² • n(n+1)/2 vs. n² • log_b n vs. log_c n
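A sketch of how the three comparisons work out with the limit rules, in standard notation:

```latex
\lim_{n\to\infty}\frac{10n}{2n^2}=\lim_{n\to\infty}\frac{5}{n}=0
\quad\Rightarrow\quad 10n \text{ grows slower than } 2n^2

\lim_{n\to\infty}\frac{n(n+1)/2}{n^2}=\frac{1}{2}>0
\quad\Rightarrow\quad \frac{n(n+1)}{2}\in\Theta(n^2)

\lim_{n\to\infty}\frac{\log_b n}{\log_c n}
=\frac{\ln n/\ln b}{\ln n/\ln c}=\frac{\ln c}{\ln b}>0
\quad\Rightarrow\quad \log_b n\in\Theta(\log_c n)
```

The last limit is why the base of a logarithm never matters for the efficiency class.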
Definition • f(n) = O( g(n) ) if there exists • a positive constant c and • a non-negative integer n0 • such that f(n) ≤ c·g(n) for every n > n0 Examples: • 10n is O(2n²) • 5n+20 is O(10n)
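The definition can be spot-checked mechanically for the slide's claim that 5n+20 is O(10n): the witnesses c = 1, n0 = 4 work, since 5n+20 ≤ 10n exactly when n ≥ 4. A C sketch (the function name and the finite check limit are mine; a loop cannot prove "for every n", only probe the claim):

```c
/* Probe the O-definition f(n) <= c*g(n) for all n0 < n <= limit,
   for f(n) = 5n + 20 and g(n) = 10n. */
int witness_holds(long c, long n0, long limit) {
    for (long n = n0 + 1; n <= limit; n++)
        if (5 * n + 20 > c * 10 * n)
            return 0;           /* definition violated at this n */
    return 1;                   /* holds on the whole probed range */
}
```

The same idea shows why n0 is needed: with n0 = 0 the inequality already fails at n = 1.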
Non-recursive algorithm analysis Analysis Steps: • Decide on parameter n indicating input size • Identify algorithm’s basic operation • Determine worst, average, and best case for input of size n • Set up summation for C(n) reflecting algorithm’s loop structure • Simplify summation using standard formulas
Example
for (x = 0; x < n; x++)
    a[x] = max(b[x], c[x]);
Example
for (x = 0; x < n; x++)
    for (y = x; y < n; y++)
        a[x][y] = max(b[x], c[y]);
Example
for (x = 0; x < n; x++)
    for (y = 0; y < n/2; y++)
        for (z = 0; z < n/3; z++)
            a[z] = max(a[x], c[y]);
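One way to check the loop examples above is to count basic-operation executions empirically and compare with the closed forms: C(n) = n for the single loop, C(n) = n(n+1)/2 for the triangular loop (the inner loop starts at y = x), and C(n) = n·(n/2)·(n/3) for the triple loop, with C's integer division for n/2 and n/3. Function names are mine:

```c
/* Single loop: C(n) = n. */
long single_count(long n) {
    long c = 0;
    for (long x = 0; x < n; x++) c++;
    return c;
}
/* Triangular nested loop: C(n) = n + (n-1) + ... + 1 = n(n+1)/2. */
long triangular_count(long n) {
    long c = 0;
    for (long x = 0; x < n; x++)
        for (long y = x; y < n; y++) c++;
    return c;
}
/* Triple loop: C(n) = n * (n/2) * (n/3), integer division. */
long triple_count(long n) {
    long c = 0;
    for (long x = 0; x < n; x++)
        for (long y = 0; y < n / 2; y++)
            for (long z = 0; z < n / 3; z++) c++;
    return c;
}
```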
Example
y = n
while (y > 0)
    if (a[y--] == b[y--])
        break;
Day 5: Agenda • Go over the answer to hw2 • Try electronic submission • Do some problems on the board • Time permitting: Recursive algorithm analysis
Homework • Remember to electronically submit hw3 before Tues. morning • Read section 2.3 thoroughly! Pages 61-67
Day 6: Agenda • First, what have we learned so far about non-recursive algorithm analysis • Second, log_b n and bⁿ: the enigma is solved. • Third, what is the deal with Abercrombie & Fitch • Fourth, recursive analysis tool kit.
Homework 4 and Exam 1 • Last homework before Exam 1 • Due on Friday 2/6 (electronic?) • Will be returned on Monday 2/9 • All solutions will be posted next Monday 2/9 • Exam 1 Wed. 2/11
Homework 4 • Page 68 questions 4, 5 and 6 • Pages 77 and 78 questions 8 and 9 • We will do similar example questions all day on Wed 2/4
What have we learned? • Non-recursive algorithm analysis • First, identify the problem size. • Typically, • a loop counter • an array size • a set size • the size of a value
Basic operations • Second, identify the basic operation • Usually a small block of code or even a single statement that is executed over and over. • Sometimes the basic operation is a comparison that is hidden inside of the loop • Example: • while (target != a[x]) x++;
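The hidden-comparison loop can be instrumented to count the basic operation. A C sketch (the function name is mine; like the slide's loop, it assumes the target is present, since the original does not bounds-check):

```c
/* Equivalent to: while (target != a[x]) x++;
   The basic operation is the comparison hidden in the loop condition.
   Assumes target occurs in a[]; returns its index and the comparison count. */
int find_with_count(const int a[], int target, long *comparisons) {
    int x = 0;
    *comparisons = 0;
    for (;;) {
        (*comparisons)++;       /* one execution of the hidden comparison */
        if (target == a[x])
            return x;
        x++;
    }
}
```

Finding the target at index k costs k+1 comparisons, so the worst case (last position) is n.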
Single loops • One loop from 1 to n • O(n) • Be aware this is the same as • 2 independent loops from 1 to n • c independent loops from 1 to n • A loop from 5 to n-1 • A loop from 0 to n/2 • A loop from 0 to n/c
Nested loops
for (x = 0; x < n; x++)
    for (y = 0; y < n; y++)
is O(n²)
for (x = 0; x < n; x++)
    for (y = 0; y < n; y++)
        for (z = 0; z < n; z++)
is O(n³)
But remember: we can have c independent sets of nested loops, or the loops can terminate early or run only to n/c, without changing the efficiency class.
Most non-recursive algorithms reduce to one of these efficiency classes
What else? • Best cases often arise when loops terminate early for specific inputs. • For worst cases, ask: is it possible that a loop will NOT terminate early? • The average case is not the mid-point or average of the best and worst cases • The average case requires us to consider a set of typical inputs • We won’t really worry too much about the average case until we hit more difficult problems.
Anything else? • Important questions you should be asking: • How does log₂n arise in non-recursive algorithms? • How does 2ⁿ arise? • Dr. B., please show me the actual loops that cause this!
Log₂n
for (x = 1; x < n; x = x*2)
    Basic operation;
for every item in the list do
    Basic operation;
    eliminate or discard half the items;
• Note: These types of log₂n loops can be nested inside of O(n), O(n²), or O(nᵏ) loops, which leads to • n log n • n² log n • nᵏ log n
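A halving loop makes the log₂n count visible: discarding half the items each pass, the loop below runs ⌊log₂n⌋ + 1 times for n ≥ 1. A minimal C sketch (the function name is mine):

```c
/* Count iterations of a halving loop: each pass discards half the items,
   so the count is floor(log2(n)) + 1 for n >= 1. */
long halving_count(long n) {
    long c = 0;
    for (long items = n; items > 0; items /= 2)
        c++;                    /* basic operation on the current half */
    return c;
}
```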
Log₂n • Which, by the way, is in the same efficiency class as log_b n, ln n, or the magical lg n (they differ only by a constant factor) • But don’t be mistaken: O(n log n) is different from O(n), even though log n grows so slowly.
Last thing: 2ⁿ • How does 2ⁿ arise in real algorithms? • Let’s consider nⁿ: • How can you program n nested loops? • It’s impossible, right? • So, how the heck does it happen in practice?
Think • Write an algorithm that prints “*” 2ⁿ times. • Is it hard? • It depends on how you think.
Recursive way
fun(int x) {
    if (x > 0) {
        print “*”;
        fun(x-1);
        fun(x-1);
    }
}
Then call the function: fun(n); (Strictly, this prints 2ⁿ - 1 stars, which is still Θ(2ⁿ).)
Non-recursive way
for (x = 0; x < pow(2,n); x++)
    print “*”;
WTF? Be very wary of your input size. If your input size is truly n, then this is truly O(2ⁿ) with just one loop.
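Both versions can be instrumented to count stars instead of printing them. The recursive version satisfies f(x) = 1 + 2f(x-1) with f(0) = 0, which solves to 2ˣ - 1; the loop version performs exactly 2ⁿ iterations. Function names are mine, and 1L << n stands in for pow(2,n) to keep the arithmetic in integers:

```c
/* Stars printed by the recursive version: f(x) = 1 + 2f(x-1), f(0) = 0,
   which solves to 2^x - 1 (still Theta(2^n)). */
long recursive_stars(int x) {
    if (x <= 0) return 0;
    return 1 + recursive_stars(x - 1) + recursive_stars(x - 1);
}
/* Stars printed by the single loop: exactly 2^n iterations. */
long loop_stars(int n) {
    long c = 0;
    for (long x = 0; x < (1L << n); x++)
        c++;                    /* print "*" */
    return c;
}
```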
That’s it! • That’s really it. • That’s everything I want you to know about non-recursive algorithm analysis • Well, for now.
Enigma • We know that log₂n = Θ(log₃n) • But what about 2ⁿ = Θ(3ⁿ)? • Here is how I will disprove it… • This gives you an idea of how I like to see questions answered.
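A sketch of the standard limit argument, for reference:

```latex
\lim_{n\to\infty}\frac{2^n}{3^n}
=\lim_{n\to\infty}\left(\frac{2}{3}\right)^n = 0
\quad\Rightarrow\quad 2^n\in O(3^n)\ \text{but}\ 2^n\notin\Theta(3^n)
```

Contrast with logarithms: log₂n / log₃n = log₂3, a positive constant, so all logarithm bases share one class while exponential bases do not.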
Algorithm Analysis Recursive Algorithm Toolkit
Example Recursive evaluation of n ! • Definition: n ! = 1*2*…*(n-1)*n • Recursive algorithm:
if n = 0 then F(n) := 1
else F(n) := F(n-1) * n
return F(n)
• Recurrence for the number of multiplications M(n) to compute n!: M(n) = M(n-1) + 1 for n > 0, with M(0) = 0 (so M(n) = n)
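A C sketch of the recursive factorial with a multiplication counter, confirming M(n) = n; the signature with the out-parameter is my own:

```c
/* Recursive factorial; *mults accumulates the multiplication count.
   Recurrence: M(n) = M(n-1) + 1 for n > 0, M(0) = 0, so M(n) = n. */
long fact(long n, long *mults) {
    if (n == 0)
        return 1;               /* base case: no multiplication */
    (*mults)++;                 /* one multiplication per recursive step */
    return fact(n - 1, mults) * n;
}
```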