
Certified JAVA Programmer

Asymptotic Analysis
Kinds of Analysis

Worst-case running time of an algorithm

The longest running time for any input of size n.
An upper bound on the running time for any input: a guarantee
that the algorithm will never take longer.
Example: sort a set of numbers in increasing order, when the data is
in decreasing order.
The worst case can occur fairly often,
e.g. when searching a database for a particular piece of information.

Best-case running time

Example: sort a set of numbers in increasing order, when the data is
already in increasing order.

Average-case running time

It may be difficult to define what "average" means.

Sohail IMRAN


Asymptotic Notations
The notations: Θ, O, Ω, o, ω
We use Θ to mean "order exactly".
Big-O, order at most:
O(f(N)) is bounded from above, "at most!"
Big-Omega, order at least:
Ω(f(N)) is bounded from below, "at least!"
Little-o, strict upper bound:
o(f(N)) is bounded from above, "strictly smaller than!"
Little-omega, strict lower bound:
ω(f(N)) is bounded from below, "strictly greater than!"
Theta: Θ(f(N)) is bounded above and below, "equal to!"
Each notation defines a set of functions; in practice they are used
to compare the growth rates of two functions.

Complexity
Example 1:
T(n) = 2n + 4
Example 2:
T(n) = 2n^3 + 1000n^2 + 100000

Dominant term: as n gets large enough, the dominant
term has a much larger effect on the running time of an
algorithm.

Removing the negligible terms leaves:
Example 1: n
Example 2: n^3

Dominant Term & Big-O

Example 1: n
Example 2: n^3

We say that the algorithm runs:
Ex1: in the order of n, i.e. O(n)
Ex2: in the order of n^3, i.e. O(n^3)

O(n^3) reads "order n-cubed" or "Big-Oh n-cubed".
O(n): the time complexity in the worst case
grows with the size of the problem.

Some rules for big-oh

Ignore the lower-order terms.

Ignore the coefficients of the highest-order term.
No need to specify the base of the logarithm:
changing the base from one constant to another changes
the value of the logarithm by only a constant factor.

If T1(N) = O(f(N)) and T2(N) = O(g(N)), then

T1(N) + T2(N) = O( max(f(N), g(N)) ),
T1(N) * T2(N) = O( f(N) * g(N) ).
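The sum rule can be sanity-checked by counting primitive steps directly. A minimal Java sketch (the class name, loop bounds, and step counters are illustrative assumptions, not from the slides):

```java
// Counting steps of two consecutive phases: an O(n) loop followed
// by an O(n^2) nested loop. The total is dominated by the larger
// term, matching T1(N) + T2(N) = O(max(f(N), g(N))).
public class SumRuleDemo {
    public static void main(String[] args) {
        int n = 1000;
        long phase1 = 0, phase2 = 0;
        for (int i = 0; i < n; i++)        // n steps
            phase1++;
        for (int i = 0; i < n; i++)        // n^2 steps
            for (int j = 0; j < n; j++)
                phase2++;
        long total = phase1 + phase2;      // n + n^2
        System.out.println(total);         // prints 1001000
        // The whole sum is within a constant factor (here 2)
        // of the dominant n^2 term alone.
        System.out.println(total <= 2 * phase2);  // prints true
    }
}
```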


Asymptotic Complexity
Example:
T1(n) = 2n^3
T2(n) = 1000n^2
Which one is better? The answer depends on whether we
consider large or small values of n.
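The trade-off can be made concrete by evaluating both functions; a small Java sketch (names and sample values are illustrative assumptions) shows the crossover at n = 500, since 2n^3 < 1000n^2 exactly when n < 500:

```java
// Comparing T1(n) = 2n^3 against T2(n) = 1000n^2 at sample points.
public class CrossoverDemo {
    static long t1(long n) { return 2 * n * n * n; }   // 2n^3
    static long t2(long n) { return 1000 * n * n; }    // 1000n^2

    public static void main(String[] args) {
        // For small n the cubic is cheaper; past n = 500 the
        // quadratic wins.
        System.out.println(t1(100) < t2(100));   // prints true
        System.out.println(t1(500) == t2(500));  // prints true
        System.out.println(t1(1000) > t2(1000)); // prints true
    }
}
```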


General Rules
Rule One: Loops
The running time of a loop is at most the running time of the statements inside the
loop, multiplied by the number of iterations.

Example:
for (i = 0; i < n; i++)          // n iterations
    A[i] = (1-t)*X[i] + t*Y[i];  // 12 time units per iteration

(Retrieving X[i] requires one addition and one memory access, as does retrieving
Y[i]; the calculation involves a subtraction, two multiplications, and an addition;
assigning A[i] requires one addition and one memory access; and each loop
iteration requires a comparison and either an assignment or an increment, for a
total of twelve primitive operations.)

Thus, the total running time is 12n time units, i.e.,
this part of the program is O(n).


General Rules (contd)


Rule Two: Nested Loops
The running time of a nested loop is at most the running time of the statements
inside the innermost loop, multiplied by the product of the number of iterations of
all of the loops.

Example:
for (i = 0; i < n; i++)           // n iterations, 2 ops each
    for (j = 0; j < n; j++)       // n iterations, 2 ops each
        C[i,j] = j*A[i] + i*B[j]; // 10 time units/iteration

(2 for retrieving A[i], 2 for retrieving B[j], 3 for the RHS arithmetic, 3 for assigning C[i,j].)

Total running time: ((10+2)n+2)n = 12n^2 + 2n time units, which is O(n^2).


General Rules (contd..)


Often the loops are more complex.
int sum = 0;
for (i = 0; i < N; i++)
    for (j = i; j < N; j++)
        sum++;
Here the inner loop executes N times, N-1 times, ..., 2
times, 1 time.
Sum = N + (N-1) + ... + 2 + 1
    = N(N+1)/2 = N^2/2 + N/2
This is less than the N^2 in our previous example, but the
difference is not huge.
Ignoring the constants and lower-order terms, we say both
algorithms are O(N^2) because roughly N^2 operations are
required.
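The closed form N(N+1)/2 can be confirmed by running the loop and comparing the counter against the formula; a small Java sketch, with N = 100 as an illustrative choice:

```java
// The triangular loop: inner loop runs N, N-1, ..., 1 times,
// so sum ends up as N + (N-1) + ... + 1 = N(N+1)/2.
public class TriangularLoop {
    public static void main(String[] args) {
        int N = 100;
        int sum = 0;
        for (int i = 0; i < N; i++)
            for (int j = i; j < N; j++)
                sum++;
        System.out.println(sum);                    // prints 5050
        System.out.println(sum == N * (N + 1) / 2); // prints true
    }
}
```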

General Rules (contd)


Rule Three: Consecutive Statements
The running time of a sequence of statements is merely the sum of the
running times of the individual statements.

Example:
for (i = 0; i < n; i++) {        // 22n time units
    A[i] = (1-t)*X[i] + t*Y[i];  // for this
    B[i] = (1-s)*X[i] + s*Y[i];  // entire loop
}
for (i = 0; i < n; i++)          // (12n+2)n time
    for (j = 0; j < n; j++)      // units for this
        C[i,j] = j*A[i] + i*B[j]; // nested loop

Total running time: 22n + 12n^2 + 2n = 12n^2 + 24n time units, i.e., this code is O(n^2).


General Rules (contd)


Rule Four: Conditional Statements
The running time of an if-else statement is at most the running time of the
conditional test, added to the maximum of the running times of the if and else
blocks of statements.
Example:
if (amt > cost + tax) {                         // 2 time units
    count = 0;                                  // 1 time unit
    while ((count < n) && (amt > cost + tax)) { // 4 TUs per iter,
                                                // at most n iter
        amt -= (cost + tax);                    // 3 time units
        count++;                                // 2 time units
    }
    cout << "CAPACITY: " << count;              // 2 time units
}
else
    cout << "INSUFFICIENT FUNDS";               // 1 time unit

Total running time: 2 + max(1 + (4 + 3 + 2)n + 2, 1) = 9n + 5 time units,
i.e., this code is O(n).

Rule Five: Divide and Conquer


Algorithms that use divide and conquer (like binary search) can
execute a loop in fewer than N steps.
int num = N;
while (num > 0)
    num = num / 2; // work
Here the loop will execute O(log2 N) times.
A more complex example combines an O(N) loop with an O(log2 N) loop:
for (i = 0; i < N; i++) {
    int num = N;
    while (num > 0)
        num = num / 2; // work
}

This code will execute the work step O(N log2 N) times.
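Both counts are easy to verify by instrumenting the loops; a Java sketch (N = 1024 is an illustrative choice, picked so that log2 N is exact):

```java
// Counting iterations of the halving loop, alone and nested
// inside an O(N) loop.
public class LogLoops {
    public static void main(String[] args) {
        int N = 1024;
        // The halving loop alone runs floor(log2 N) + 1 times.
        int halvings = 0;
        for (int num = N; num > 0; num /= 2)
            halvings++;
        System.out.println(halvings);             // prints 11
        // Nested inside an O(N) loop, the work step runs
        // N * (log2 N + 1) times, which is O(N log2 N).
        long work = 0;
        for (int i = 0; i < N; i++)
            for (int num = N; num > 0; num /= 2)
                work++;
        System.out.println(work == (long) N * halvings); // prints true
    }
}
```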



Comparison of different Time Complexity Functions

O(log2 N) < O(N) < O(N log2 N) < O(N^2)
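For any concrete N this ordering can be checked numerically; a brief Java sketch using N = 1024 (an illustrative choice):

```java
// Evaluating log2 N, N, N log2 N, and N^2 at a sample point to
// illustrate their relative growth.
public class GrowthOrder {
    public static void main(String[] args) {
        int N = 1024;
        double logN = Math.log(N) / Math.log(2); // log2 N = 10
        double nLogN = N * logN;                 // 10240
        double nSq = (double) N * N;             // 1048576
        // The ordering holds at this N (and the gaps widen as N grows).
        System.out.println(logN < N && N < nLogN && nLogN < nSq); // prints true
    }
}
```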


Example
Begin
  Initialize n        1
  For i = 0 to n      n+2
    For j = 0 to m    (n+1)(m+2) = nm+n+m+2
      Print i         (n+1)(m+1)(1) = nm+n+m+1
    End For
  End For
End
Total:
T(n,m) = 2nm + 3n + 2m + 6
Letting m = n: T(n) = 2n^2 + 5n + 6
so T(n) = O(n^2)
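The (n+1)(m+1) count for the innermost Print can be confirmed by running the equivalent loops; a Java sketch with illustrative values n = 10, m = 5:

```java
// The pseudocode "For i = 0 to n" is inclusive, so the outer loop
// makes n+1 passes and the inner loop m+1 passes per outer pass.
public class StepCount {
    public static void main(String[] args) {
        int n = 10, m = 5;
        int prints = 0;
        for (int i = 0; i <= n; i++)       // n+1 passes
            for (int j = 0; j <= m; j++)   // m+1 passes
                prints++;                  // stands in for "Print i"
        System.out.println(prints);                      // prints 66
        System.out.println(prints == (n + 1) * (m + 1)); // prints true
    }
}
```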

Big Oh: more examples

N^2/2 - 3N = O(N^2)
1 + 4N = O(N)
7N^2 + 10N + 3 = O(N^2) = O(N^3)
log10 N = log2 N / log2 10 = O(log2 N) = O(log N)
sin N = O(1); 10 = O(1); 10^10 = O(1)
Sum of i for i = 1..N: <= N * N = O(N^2)
Sum of i^2 for i = 1..N: <= N * N^2 = O(N^3)
log N + N = O(N)
log^k N = O(N) for any constant k
N = O(2^N), but 2^N is not O(N)
2^(10N) is not O(2^N)

