• Example: f(n) = 17n; g(n) = 3n^2. Show that f(n) = O(g(n)).
• To prove that f(n) = O(g(n)), find an n0 and c such that
  f(n) ≤ c * g(n) for all n > n0.
• Choose c = 6: f(n) = 17n ≤ 6 * 3n^2 = 18n^2 for all n > 0.
• We will call the pair (c, n0) a witness pair for proving that
  f(n) = O(g(n)).
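As a quick sanity check (ours, not part of the lecture), the witness pair (c, n0) = (6, 0) for the example above can be tested numerically over a range of n. This does not replace the algebraic argument, but it catches a wrongly chosen constant immediately:

```java
// Numeric sanity check (not a proof) of the witness pair (c, n0) = (6, 0)
// for f(n) = 17n and g(n) = 3n^2, i.e., f(n) <= c * g(n) for all n > n0.
public class WitnessPairCheck {
    static long f(long n) { return 17 * n; }
    static long g(long n) { return 3 * n * n; }

    public static void main(String[] args) {
        long c = 6, n0 = 0;
        for (long n = n0 + 1; n <= 1000; n++) {
            // the bound must hold at every n > n0
            if (f(n) > c * g(n)) throw new AssertionError("bound fails at n = " + n);
        }
        System.out.println("17n <= 6 * 3n^2 holds for n = 1..1000");
    }
}
```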
• For asymptotic complexity, the base of logarithms does not matter.
• Let us show that log2(n) = O(logb(n)) for any b > 1.
• Need to find a witness pair (c, n0) such that:
  log2(n) ≤ c * logb(n) for all n > n0.
• Choose (c = log2(b), n0 = 0). By the change-of-base identity,
  log2(n) = log2(b) * logb(n), so the bound holds with equality.

Why is Asymptotic Complexity So Important?

• Asymptotic complexity gives an idea of how rapidly the space/time
  requirements grow as problem size increases.
• Suppose we have a computing device that can execute 1000 complex
  operations per second. Here is the size of problem that can be solved in
  a second, a minute, and an hour by algorithms of different asymptotic
  complexity:

  Complexity   1 second   1 minute   1 hour
  2^n          9          15         21
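The change-of-base argument above can be checked numerically (our sketch, using b = 10 as the second base):

```java
// Checks numerically that log2(n) = log2(b) * logb(n) for b = 10,
// so c = log2(b) works as the witness constant for the claim above.
public class LogBaseCheck {
    public static void main(String[] args) {
        double c = Math.log(10) / Math.log(2);       // c = log2(10)
        for (long n = 2; n <= 1_000_000; n *= 10) {
            double log2n = Math.log(n) / Math.log(2);
            double log10n = Math.log10(n);
            // equality up to floating-point rounding
            if (Math.abs(log2n - c * log10n) > 1e-9)
                throw new AssertionError("mismatch at n = " + n);
        }
        System.out.println("log2(n) == log2(10) * log10(n) for sampled n");
    }
}
```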
Example: Selection Sort

public static void selectionSort(Comparable[] a) { // array of size n
    for (int i = 0; i < a.length; i++) {           <-- cost = c1, n times
        int MinPos = i;                            <-- cost = c2, n times
        for (int j = i+1; j < a.length; j++) {     <-- cost = c3, n*(n-1)/2 times
            if (a[j].compareTo(a[MinPos]) < 0)     <-- cost = c4, n*(n-1)/2 times
                MinPos = j;                        <-- cost = c5, n*(n-1)/2 times
        }
        Comparable temp = a[i];                    <-- cost = c6, n times
        a[i] = a[MinPos];                          <-- cost = c7, n times
        a[MinPos] = temp;                          <-- cost = c8, n times
    }
}

Example: Matrix Multiplication

// fragment: A, B, C are n-by-n matrices, sum is an int
int n = A.length;                                  <-- cost = c0, 1 time
for (int i = 0; i < n; i++) {                      <-- cost = c1, n times
    for (int j = 0; j < n; j++) {                  <-- cost = c2, n*n times
        sum = 0;                                   <-- cost = c3, n*n times
        for (int k = 0; k < n; k++)                <-- cost = c4, n*n*n times
            sum = sum + A[i][k]*B[k][j];           <-- cost = c5, n*n*n times
        C[i][j] = sum;                             <-- cost = c6, n*n times
    }
}

• Once you get the hang of this, you can quickly zero in on what is
  relevant for determining asymptotic complexity. For example, you can
  usually ignore everything that is not in the innermost loop. Why?
• Main difficulty: estimating running time for recursive programs.

Recurrence equation (for merge-sort, analyzed on the next slide):

T(n) = (c0+c1) + 2T(n/2) + (c2*n + c3)   <-- recurrence
T(1) = c4                                <-- base case

How do we solve this recurrence equation?
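To see the selection-sort cost table in action, here is an instrumented version (our sketch, using int[] instead of Comparable[] for brevity) that counts how many times the inner-loop comparison runs:

```java
// Instrumented selection sort: counts comparisons and checks that the
// count equals n*(n-1)/2, matching the cost annotations above.
public class SelectionSortCount {
    static long comparisons = 0;

    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length; i++) {
            int minPos = i;
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;                      // one comparison per inner-loop pass
                if (a[j] < a[minPos]) minPos = j;
            }
            int temp = a[i]; a[i] = a[minPos]; a[minPos] = temp;
        }
    }

    public static void main(String[] args) {
        int n = 100;
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = n - i;   // reverse-sorted input
        selectionSort(a);
        // the inner loop body runs exactly n*(n-1)/2 times regardless of input
        System.out.println(comparisons == (long) n * (n - 1) / 2);  // true
    }
}
```

Note that the count is the same for any input: selection sort always scans the whole unsorted suffix, which is why its cost table has no "best case".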
Analysis of Merge-Sort

Recurrence equation:
T(n) = (c0+c1) + 2T(n/2) + (c2*n + c3)
T(1) = c4

First, simplify by dropping lower-order terms.

Simplified recurrence equation:
T(n) = 2T(n/2) + n
T(1) = 1

It can be shown that T(n) = O(n log(n)) is a solution to this recurrence.

What do we mean by "Solution"?

• Recurrence: T(n) = 2T(n/2) + n
• Solution: T(n) = n log2(n)
• To prove, substitute n log2(n) for T(n) in the recurrence:

T(n) = 2T(n/2) + n
n log2(n) = 2 (n/2) log2(n/2) + n
n log2(n) = n log2(n/2) + n
n log2(n) = n (log2(n) - log2(2)) + n
n log2(n) = n (log2(n) - 1) + n
n log2(n) = n log2(n) - n + n
n log2(n) = n log2(n)   ✓
• In CS 280, you will learn a bag of tricks for solving recurrences that
  arise in practice.
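The substitution step above can be sanity-checked exactly for powers of two, where log2(n) is an integer and every quantity in the identity is a whole number (our check, not part of the lecture):

```java
// Checks the substitution identity: for n = 2^k,
// n*log2(n) equals 2*(n/2)*log2(n/2) + n exactly in integer arithmetic.
public class MergeSortSolutionCheck {
    public static void main(String[] args) {
        for (int k = 1; k <= 40; k++) {
            long n = 1L << k;                      // n = 2^k, so log2(n) = k
            long lhs = n * k;                      // n * log2(n)
            long rhs = 2 * (n / 2) * (k - 1) + n;  // 2 * (n/2) * log2(n/2) + n
            if (lhs != rhs) throw new AssertionError("mismatch at n = " + n);
        }
        System.out.println("n*log2(n) satisfies T(n) = 2T(n/2) + n for n = 2^1..2^40");
    }
}
```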
Cheat Sheet for Common Recurrences

Recurrence Relation              Closed-Form       Example
c(1) = a; c(n) = b + c(n-1)      c(n) = O(n)       Linear search
c(1) = a; c(n) = b*n + c(n-1)    c(n) = O(n^2)     Quicksort (worst case)
c(1) = a; c(n) = b + c(n/2)      c(n) = O(log(n))  Binary search
c(1) = a; c(n) = b*n + c(n/2)    c(n) = O(n)
c(1) = a; c(n) = b + k*c(n/k)    c(n) = O(n)
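Two of the cheat-sheet recurrences can be unfolded numerically to build intuition for their closed forms (our sketch, with a = b = 1; a demonstration, not a proof):

```java
// Unfolds two cheat-sheet recurrences with a = b = 1 and compares them
// to their closed forms.
public class RecurrenceCheck {
    // c(1) = 1; c(n) = 1 + c(n/2)  -- binary-search pattern, O(log n)
    static long halving(long n) { return n == 1 ? 1 : 1 + halving(n / 2); }

    // c(1) = 1; c(n) = n + c(n-1)  -- quicksort worst-case pattern, O(n^2)
    static long linearPeel(long n) { return n == 1 ? 1 : n + linearPeel(n - 1); }

    public static void main(String[] args) {
        // for n = 2^k: c(n) = k + 1, i.e., log2(n) + 1
        System.out.println(halving(1024));        // 11
        // c(n) = n(n+1)/2, which is O(n^2)
        System.out.println(linearPeel(1000));     // 500500
    }
}
```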
Analysis of Quicksort: Tricky!

public static void quickSort(Comparable[] A, int l, int h) {
    if (l < h) {
        int p = partition(A, l+1, h, A[l]);
        // move pivot into its final resting place
        Comparable temp = A[p-1];
        A[p-1] = A[l];
        A[l] = temp;
        // make recursive calls
        quickSort(A, l, p-1);
        quickSort(A, p, h);
    }
}

Incorrect attempt:

c(1) = 1
c(n) = n + 2c(n/2)
       ---   --------
  partition   sorting the two partitioned subarrays

What is wrong with this analysis? (Hint: the term 2c(n/2) assumes the
pivot always splits the array into two equal halves.)
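One way to see the gap (our illustration): unfold the balanced recurrence from the incorrect attempt next to the lopsided recurrence that arises when every pivot lands at one end of its subarray, as happens on already-sorted input:

```java
// Contrasts the balanced recurrence c(n) = n + 2c(n/2) from the
// "incorrect attempt" with the lopsided worst case c(n) = n + c(n-1).
public class QuicksortRecurrences {
    static long balanced(long n) { return n <= 1 ? 1 : n + 2 * balanced(n / 2); }
    static long lopsided(long n) { return n <= 1 ? 1 : n + lopsided(n - 1); }

    public static void main(String[] args) {
        long n = 1024;
        System.out.println(balanced(n));   // n*log2(n) + n = 11264
        System.out.println(lopsided(n));   // n*(n+1)/2 = 524800
    }
}
```

The balanced assumption gives O(n log n); the lopsided splits give O(n^2), which is why quicksort's worst case differs from its typical behavior.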
Recursive Fibonacci

[Figure: recursion tree for fib(5). fib(5) calls fib(4) and fib(3); fib(4)
calls fib(3) and fib(2); the same subproblems are recomputed many times.]

c(n) = c(n-1) + c(n-2) + 2
c(2) = 1; c(1) = 1

• For this problem, problem size is n.
• It can be shown that T(n) = O(2^n).
• Cost of computing fib(n) recursively is exponential in n.

Iterative Code for Fibonacci

fib(n) = fib(n-1) + fib(n-2); fib(1) = 1; fib(2) = 1

dad = 1;
grandad = 1;
current = 1;
for (i = 3; i <= n; i++) {
    grandad = dad;
    dad = current;
    current = dad + grandad;
}
return (current);

• Number of times loop is executed? Less than n.
• Number of computations per loop? Fixed amount of work.
• ==> complexity of iterative algorithm = O(n)
• Much, much, much, much, much, … better than O(2^n)!
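The pseudocode above can be made runnable, and the recursive version instrumented with a call counter to show the exponential blow-up directly (our sketch; the counter is not part of the lecture code):

```java
// Runnable iterative Fibonacci plus a call counter for the recursive
// version, showing how quickly the number of recursive calls grows.
public class Fib {
    static long fibIter(int n) {
        if (n <= 2) return 1;
        long grandad = 1, dad = 1, current = 1;
        for (int i = 3; i <= n; i++) {
            grandad = dad;
            dad = current;
            current = dad + grandad;
        }
        return current;
    }

    static long calls = 0;
    static long fibRec(int n) {
        calls++;
        return n <= 2 ? 1 : fibRec(n - 1) + fibRec(n - 2);
    }

    public static void main(String[] args) {
        System.out.println(fibIter(10));  // 55
        fibRec(30);
        System.out.println(calls);        // call count grows exponentially in n
    }
}
```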
How fast do these functions grow?

  f(n)        n = 10       n = 50              n = 100             n = 300             n = 1000
  5n          50           250                 500                 1500                5000
  n log2(n)   33           282                 665                 2469                9966
  n^2         100          2500                10,000              90,000              1,000,000
  2^n         1024         a 16-digit number   a 31-digit number   a 91-digit number   a 302-digit number
  n!          3.6 million  a 65-digit number   a 161-digit number  a 623-digit number  unimaginably large
  n^n         10 billion   an 85-digit number  a 201-digit number  a 744-digit number  a 3001-digit number

How long would it take @ 1 instruction/μsec?

  f(n)   n = 10        n = 20       n = 50      n = 100                 n = 300
  n^2    1/10,000 sec  1/2500 sec   1/400 sec   1/100 sec               9/100 sec
  n^3    1/1000 sec    1/125 sec    1/8 sec     1 sec                   27 sec
  n^5    1/10 sec      3.2 sec      5.2 min     2.8 hr                  28.1 days
  2^n    1/1000 sec    1 sec        35.7 yr     400 trillion centuries  a 75-digit number of centuries
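Several table entries can be recomputed exactly with arbitrary-precision arithmetic (our verification sketch, not from the lecture): the decimal digit counts of 2^n and n! below match the table's "k-digit number" entries.

```java
import java.math.BigInteger;

// Recomputes growth-table entries exactly: decimal digit counts of 2^n
// and n!, using BigInteger to avoid overflow.
public class GrowthTableCheck {
    static int digits(BigInteger x) { return x.toString().length(); }

    static BigInteger factorial(int n) {
        BigInteger f = BigInteger.ONE;
        for (int i = 2; i <= n; i++) f = f.multiply(BigInteger.valueOf(i));
        return f;
    }

    public static void main(String[] args) {
        BigInteger two = BigInteger.valueOf(2);
        System.out.println(digits(two.pow(50)));    // 16
        System.out.println(digits(two.pow(100)));   // 31
        System.out.println(digits(two.pow(300)));   // 91
        System.out.println(digits(two.pow(1000)));  // 302
        System.out.println(digits(factorial(50)));  // 65
    }
}
```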