CSCE 2100: Computing Foundations 1
Running Time of Programs
Tamara Schneider, Summer 2013
What is Efficiency? • Time it takes to run a program? • Resources • Storage space taken by variables • Traffic generated on computer network • Amount of data moved to and from disk
Summarizing Running Time • Benchmarking • Run the program on a small collection of typical inputs (benchmarks) • Analysis • Group inputs by size • Running time is influenced by various factors, e.g. • Computer • Compiler
Running Time • worst-case running time: maximum running time over all inputs of size n • average-case running time: average running time over all inputs of size n • best-case running time: minimum running time over all inputs of size n
Running Time of a Program T(n) is the running time of a program as a function of the input size n. • T(n) = cn, for some constant c, indicates that the running time is linearly proportional to the size of the input, that is, linear time.
Running Time of Simple Statements We assume that “primitive operations” take a single instruction. • Arithmetic operations (+, %, *, -, ...) • Logical operations (&&, ||, ...) • Accessing operations (A[i], x->y, ...) • Simple assignment • Calls to library functions (scanf, printf, ... )
Code Segment 1
sum = 0; → 1
for(i=0; i<n; i++) → 1 + (n+1) + n = 2n+2 (one initialization, n+1 tests, n increments)
sum++; → executes n times, n·1
Total: 1 + (2n+2) + n·1 = 3n + 3. Complexity? O(n).
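The 3n+3 count can be checked mechanically. Below is a minimal sketch (the counter placement and the helper name count_segment1 are my own, not from the slides) that bumps a counter once per primitive operation of Code Segment 1 and returns the total:

```c
#include <assert.h>

/* Instrumented Code Segment 1: ops counts one unit per primitive
   operation, mirroring the slide's accounting. */
long count_segment1(long n) {
    long ops = 0, sum, i;
    sum = 0; ops++;                      /* assignment: 1            */
    ops++;                               /* loop initialization: 1   */
    for (i = 0; ops++, i < n; ) {        /* tests: n+1               */
        sum++; ops++;                    /* body: n                  */
        i++;   ops++;                    /* increments: n            */
    }
    return ops;                          /* 1 + (2n+2) + n = 3n + 3  */
}
```

For n = 10 this returns 33, matching 3·10 + 3.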
Code Segment 2
sum = 0; → 1
for(i=0; i<n; i++) → 2n+2
for(j=0; j<n; j++) → 2n+2 per outer iteration, (2n+2)·n total
sum++; → n·n·1
Total: 1 + (2n+2) + (2n+2)·n + n·n·1 = 3n² + 4n + 3. Complexity? O(n²).
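The same bookkeeping applies to the nested loops. A sketch (the function name and counter placement are assumptions of mine) that confirms 1 + (2n+2) + (2n+2)·n + n·n = 3n² + 4n + 3:

```c
#include <assert.h>

/* Instrumented Code Segment 2: counts 1 + (2n+2) + n(2n+2) + n*n ops. */
long count_segment2(long n) {
    long ops = 0, sum, i, j;
    sum = 0; ops++;                      /* assignment: 1             */
    ops++;                               /* outer init: 1             */
    for (i = 0; ops++, i < n; i++, ops++) {
        ops++;                           /* inner init: 1 per outer   */
        for (j = 0; ops++, j < n; j++, ops++) {
            sum++; ops++;                /* body: n*n total           */
        }
    }
    return ops;                          /* 3n^2 + 4n + 3             */
}
```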
Code Segment 3
sum = 0; → 1
for(i=0; i<n; i++) → 2n+2
for(j=0; j<n*n; j++) → 2n²+2 per outer iteration, (2n²+2)·n total
sum++; → executes n·n² = n³ times
Total: 1 + (2n+2) + (2n²+2)·n + n³ = 3n³ + 4n + 3. Complexity? O(n³).
Code Segment 4
sum = 0; → 1
for(i=0; i<=n; i++) → 2n+4 (one initialization, n+2 tests, n+1 increments)
for(j=0; j<i; j++) → 2i+2 per outer iteration
sum++; → executes 0 + 1 + 2 + ... + n times
The inner body runs i times on outer iteration i (i=0: never; i=1: j=0; i=2: j=0,1; ...; i=n: j=0,...,n-1), so it executes n(n+1)/2 times in total. Complexity? O(n²).
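The triangular pattern of Code Segment 4 is why the count is quadratic even though the inner bound varies. A quick sketch (the helper name body_count is my own) that counts only the sum++ executions and can be checked against the closed form n(n+1)/2:

```c
#include <assert.h>

/* Counts how often the body of Code Segment 4 runs: 0 + 1 + ... + n. */
long body_count(long n) {
    long sum = 0, i, j;
    for (i = 0; i <= n; i++)
        for (j = 0; j < i; j++)
            sum++;
    return sum;                 /* equals n*(n+1)/2, hence O(n^2) */
}
```

For n = 4 this yields 0 + 1 + 2 + 3 + 4 = 10 = 4·5/2.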
How Do Running Times Compare? [Graph comparing the growth rates of 3·2^n, n^2, 3n-1, and n-1 as n increases.]
Towards “Big Oh” [Graph: time t versus input size n, showing c·f(n), e.g. 5n² with c = 5 and f(n) = n², against T(n), the runtime of some program, e.g. T(n) = 2n² - 4n + 3; the curves cross at n0.] We can observe that for input size n ≥ n0, the graph of the function c·f(n) lies above the graph of the function T(n). For n ≥ n0, c·f(n) is an upper bound on T(n), i.e. c·f(n) ≥ T(n).
Big-Oh [1] • It is too much work to count the exact number of machine instructions • Instead, hide the details • the average number of machine instructions the compiler generates per statement • the average number of instructions the machine executes per second • Simplification • Instead of 4m-1, write O(m) • O(m) ?!
Big-Oh [2] • Restrict the argument n to nonnegative integers • T(n) is nonnegative for all n Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n) • ∃ “there exists” • ∀ “for all”
Big-Oh - Example [1] Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n) • Example 1: • T(0) = 1 • T(1) = 4 • T(2) = 9 • in general: T(n) = (n+1)² • Is T(n) also O(n²)?
Big-Oh - Example [2] Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n) • T(n) = (n+1)². We want to show that T(n) is O(n²). • In other words, f(n) = n² • If this is true, there exist an integer n0 and a constant c > 0 such that for all n ≥ n0: T(n) ≤ cn²
Big-Oh - Example [3] Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n) • T(n) ≤ cn² ⇔ (n+1)² ≤ cn² • Choose c=4, n0=1: Show that (n+1)² ≤ 4n² for n ≥ 1 • (n+1)² = n² + 2n + 1 • ≤ n² + 2n² + 1 (since 2n ≤ 2n² for n ≥ 1) • = 3n² + 1 • ≤ 3n² + n² • = 4n² • = cn²
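The algebra for the choice c=4, n0=1 can also be spot-checked numerically. A small sketch (the function names are mine, not from the slides) that verifies (n+1)² ≤ 4n² over a range of n ≥ 1:

```c
#include <assert.h>

/* Returns 1 iff (n+1)^2 <= 4*n^2, the bound proved with c=4, n0=1. */
int bound_holds(long n) {
    return (n + 1) * (n + 1) <= 4 * n * n;
}

/* Checks the bound for every n in [1, m]. */
int bound_holds_up_to(long m) {
    long n;
    for (n = 1; n <= m; n++)
        if (!bound_holds(n))
            return 0;
    return 1;
}
```

Note that the bound fails at n = 0 (1 ≤ 0 is false), which is why n0 = 1 is needed.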
Big-Oh - Example [Alt 3] Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n) • T(n) ≤ cn² ⇔ (n+1)² ≤ cn² • Choose c=2, n0=3: Show that (n+1)² ≤ 2n² for n ≥ 3 • (n+1)² = n² + 2n + 1 • ≤ n² + n² • = 2n² • = cn² This uses the fact that for all n ≥ 3: 2n + 1 ≤ n².
Simplification Rules for Big-Oh • Constant factors can be omitted • O(54n²) = O(n²) • Lower-order terms can be omitted • O(n⁴ + n²) = O(n⁴) • O(n²) + O(1) = O(n²) • Note that the highest-order term should never be negative. • Lower-order terms can be negative. • Negative terms can be omitted since they do not increase the runtime.
Transitivity [1] What is transitivity? • if A☺B and B☺C, then A☺C • example: if a < b and b < c, then a < c, e.g. 2 < 4 and 4 < 7, so 2 < 7, since “<” is transitive Is Big-Oh transitive?
Transitivity [2] • If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)) • f(n) is O(g(n)): ∃ n1, c1 such that f(n) ≤ c1 g(n) ∀ n ≥ n1 • g(n) is O(h(n)): ∃ n2, c2 such that g(n) ≤ c2 h(n) ∀ n ≥ n2 • Choose n0 = max{n1, n2} and c = c1c2: for n ≥ n0, f(n) ≤ c1 g(n) ≤ c1c2 h(n) ⇒ f(n) is O(h(n))
Tightness • Use constant factor “1”: write O(n), not O(2n) • Use the tightest upper bound that we can prove • 3n is O(n²) and O(n) and O(2n). Which one should we use? O(n).
Summation Rule [1] Consider a program that contains two parts • Part 1 takes T1(n) time and is O(f1(n)) • Part 2 takes T2(n) time and is O(f2(n)) • We also know that f2 grows no faster than f1 ⇒ f2(n) is O(f1(n)) • What is the running time of the entire program? • T1(n) + T2(n) is O(f1(n) + f2(n)) • But can we simplify this?
Summation Rule [2] • T1(n) + T2(n) is O(f1(n)) since f2 grows no faster than f1 • Proof: T1(n) ≤ c1 f1(n) for n ≥ n1; T2(n) ≤ c2 f2(n) for n ≥ n2; f2(n) ≤ c3 f1(n) for n ≥ n3. Let n0 = max{n1, n2, n3}. For n ≥ n0: T1(n) + T2(n) ≤ c1 f1(n) + c2 f2(n) ≤ c1 f1(n) + c2c3 f1(n) = (c1 + c2c3) f1(n) = c f1(n) with c = c1 + c2c3 ⇒ T1(n) + T2(n) is O(f1(n))
Summation Rule - Example
// make A an identity matrix
scanf("%d", &n); → O(1)
for(i=0; i<n; i++)
for(j=0; j<n; j++)
A[i][j] = 0; → O(n²)
for(i=0; i<n; i++)
A[i][i] = 1; → O(n)
O(1) + O(n²) + O(n) = O(n²)
Summary of Rules & Concepts [1] • Worst-case, average-case, and best-case running time are compared for a fixed input size n, not for varying n! • Counting Instructions • Assume 1 instruction for assignments, simple calculations, comparisons, etc. • Definition of Big-Oh: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n)
Summary of Rules & Concepts [2] • Rule 1: Constant factors can be omitted • Example: O(3n⁵) = O(n⁵) • Rule 2: Low-order terms can be omitted • Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(3n⁵) • We can combine Rule 1 and Rule 2: • Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(n⁵)
Summary of Rules & Concepts [3] • For O(f(n) + g(n)), we can neglect the function with the slower growth rate. • Example: O(n + n log n) = O(n log n) • Transitivity: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)) • Example: f(n) = 3n, g(n) = n², h(n) = n⁶; 3n is O(n²) and n² is O(n⁶), so 3n is O(n⁶) • Tightness: We try to find an upper bound Big-Oh that is as small as possible. • Example: n² is O(n⁶), but O(n²) is a much tighter (and better) bound.