Polynomial-Time
Desirable scaling property. When the input size doubles, the algorithm
should only slow down by some constant factor C.
If the running time is bounded by c·N^d steps, then doubling the input
size gives c·(2N)^d = 2^d · c·N^d, so choose C = 2^d.
Note the difference between the constant c in the bound and the
slowdown factor C.
Worst-Case Analysis
Worst-case running time. Obtain a bound on the largest possible running
time of the algorithm on any input of a given size N.
Generally captures efficiency in practice.
A draconian, perhaps too severe, view, but it is hard to find an
effective alternative.
Worst-Case Polynomial-Time
Exceptions.
Some poly-time algorithms have high constants and/or exponents,
and are useless in practice.
Some exponential-time (or worse) algorithms are widely used
because the worst-case instances seem to be rare.
Ex: simplex method, Unix grep.
Why It Matters
Asymptotic Order of Growth in Plain English
Big family.
O(g(n)): class of functions f(n) that grow no faster than g(n).
Θ(g(n)): class of functions f(n) that grow at the same rate as g(n).
Ω(g(n)): class of functions f(n) that grow at least as fast as g(n).
Small family.
o(g(n)): class of functions f(n) that grow slower than g(n).
ω(g(n)): class of functions f(n) that grow faster than g(n).
Notation
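The classes above are usually pinned down with explicit constants. A standard textbook formulation (the constants c and n0 below are the conventional names, not fixed in the text above):

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 \ge 0 :\
  0 \le f(n) \le c \cdot g(n) \text{ for all } n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 \ge 0 :\
  f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)) \\
f(n) = o(g(n))      &\iff \lim_{n \to \infty} f(n)/g(n) = 0 \\
f(n) = \omega(g(n)) &\iff \lim_{n \to \infty} f(n)/g(n) = \infty
\end{align*}
```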
Properties
Transitivity.
If f = O(g) and g = O(h) then f = O(h).
If f = Ω(g) and g = Ω(h) then f = Ω(h).
If f = Θ(g) and g = Θ(h) then f = Θ(h).
Additivity.
If f = O(h) and g = O(h) then f + g = O(h).
If f = Ω(h) and g = Ω(h) then f + g = Ω(h).
If f = Θ(h) and g = O(h) then f + g = Θ(h).
Multiplicativity.
If f = O(h) and g = O(i) then f × g = O(h × i).
If f = Ω(h) and g = Ω(i) then f × g = Ω(h × i).
If f = Θ(h) and g = Θ(i) then f × g = Θ(h × i).
Linear Time: O(n)
Linear time. Running time is at most a constant factor times the size
of the input.
max ← a1
for i = 2 to n {
if (ai > max)
max ← ai
}
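The scan above translates directly into runnable code; a minimal Python sketch (the function name is chosen here for illustration):

```python
def find_max(a):
    """Return the largest element of a nonempty list in one O(n) pass."""
    max_val = a[0]           # max <- a1
    for x in a[1:]:          # for i = 2 to n
        if x > max_val:      # if (ai > max)
            max_val = x      #     max <- ai
    return max_val
```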
Merge. Combine two sorted lists a1 ≤ a2 ≤ … ≤ an and b1 ≤ b2 ≤ … ≤ bn
into one sorted list.
i = 1, j = 1
while (both lists are nonempty) {
if (ai ≤ bj) append ai to output list and increment i
else append bj to output list and increment j
}
append remainder of nonempty list to output list
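The merge loop above, written out in Python (a sketch; `merge` is a name chosen here):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):   # while both lists are nonempty
        if a[i] <= b[j]:
            out.append(a[i]); i += 1   # append ai, increment i
        else:
            out.append(b[j]); j += 1   # append bj, increment j
    out.extend(a[i:])                  # append remainder of nonempty list
    out.extend(b[j:])
    return out
```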
O(n log n) Time
O(n log n) solution. Sort the time-stamps. Scan the sorted list in
order, identifying the maximum gap between successive time-stamps.
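The sort-then-scan idea in runnable form (a sketch, assuming at least two time-stamps; the function name is chosen here):

```python
def max_gap(timestamps):
    """Largest gap between successive time-stamps: O(n log n) via sorting."""
    ts = sorted(timestamps)                        # O(n log n) sort
    # O(n) scan over successive pairs of the sorted list
    return max(b - a for a, b in zip(ts, ts[1:]))
```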
Quadratic Time: O(n²)
Closest pair of points. Given a list of n points in the plane (x1, y1), …,
(xn, yn), find the pair that is closest.
O(n²) solution. Try all pairs of points.
min ← (x1 − x2)² + (y1 − y2)²
for i = 1 to n {
for j = i+1 to n {
d ← (xi − xj)² + (yi − yj)²
if (d < min)
min ← d
}
}
Remark. Ω(n²) seems inevitable, but this is just an illusion. See Chapter 5.
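A runnable version of the brute-force pairwise scan (a sketch; it returns the minimum squared distance, which avoids computing square roots without changing which pair wins):

```python
def closest_pair(points):
    """Brute-force closest pair: compare all O(n^2) pairs of points."""
    n = len(points)
    best = float("inf")
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = points[i], points[j]
            d = (x1 - x2) ** 2 + (y1 - y2) ** 2  # squared distance
            if d < best:
                best = d
    return best
```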
Cubic Time: O(n³)
Set disjointness. Given n sets S1, …, Sn, each a subset of {1, 2, …, n},
is there some pair of these which are disjoint?
O(n³) solution. For each pair of sets, determine if they are disjoint.
foreach set Si {
foreach other set Sj {
foreach element p of Si {
determine whether p also belongs to Sj
}
if (no element of Si belongs to Sj)
report that Si and Sj are disjoint
}
}
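The triple loop above in Python (a sketch; the sets are stored as plain lists so that each membership test costs O(n), matching the stated O(n³) bound — with hash sets it would be faster):

```python
def disjoint_pairs(sets):
    """Report every disjoint pair among n sets of up to n elements: O(n^3)."""
    result = []
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            # determine whether some element p of Si also belongs to Sj
            if not any(p in sets[j] for p in sets[i]):
                result.append((i, j))   # Si and Sj are disjoint
    return result
```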
Independent set of size k. Given a graph, are there k nodes such that
no two are joined by an edge?
O(n^k) solution when k is a constant: check each of the C(n, k) = O(n^k)
subsets of k nodes.
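A sketch of the check-all-subsets idea in Python (names and the edge representation are chosen here; for constant k the `combinations` loop runs C(n, k) = O(n^k) times):

```python
from itertools import combinations

def has_independent_set(edges, nodes, k):
    """Is there a set of k nodes with no edge between any two of them?"""
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):        # all C(n, k) subsets
        # independent: no two chosen nodes are joined by an edge
        if all(frozenset((u, v)) not in edge_set
               for u, v in combinations(subset, 2)):
            return True
    return False
```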
Exponential Time
S* ← ∅
foreach subset S of nodes {
check whether S is an independent set
if (S is the largest independent set seen so far)
update S* ← S
}
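The enumeration above in runnable form (a sketch with names chosen here; trying all 2^n subsets and checking each in O(n²) pair tests gives O(n² · 2^n)):

```python
from itertools import combinations

def max_independent_set(edges, nodes):
    """Try every subset S of nodes, keep the largest independent one."""
    edge_set = {frozenset(e) for e in edges}
    best = set()                                  # S* <- empty set
    for r in range(len(nodes) + 1):
        for s in combinations(nodes, r):          # foreach subset S
            # check whether S is an independent set
            if all(frozenset((u, v)) not in edge_set
                   for u, v in combinations(s, 2)):
                if r > len(best):                 # largest seen so far
                    best = set(s)                 # update S* <- S
    return best
```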
Analysis of Algorithms
Issues
Correctness. Most important: the algorithm must be correct, otherwise
it is useless.
Time efficiency. How much time the algorithm requires to run.
Space efficiency. How much extra space the algorithm requires.
– Both time and space were premium resources in the early days. At
present, space is less of a concern, since memory is cheap.
Optimality. (In theory) no algorithm can be more efficient.
Approaches
Theoretical analysis. Mathematical analysis.
Empirical analysis. Through experiments.
Empirical Analysis of Time Efficiency
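One common empirical approach is to count the algorithm's basic operation on inputs of growing size rather than measure wall-clock time. A minimal sketch, instrumenting a sequential search (names chosen here; in the worst case the count should double when the input doubles):

```python
def count_comparisons(a, key):
    """Sequential search that counts its basic operation (comparisons)."""
    count = 0
    for x in a:
        count += 1
        if x == key:
            break
    return count

# Worst case (key absent): the count grows linearly with the input size.
for n in [1000, 2000, 4000]:
    print(n, count_comparisons(list(range(n)), -1))
```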
Example: Sequential Search
Algorithm. Scan the list and compare its successive elements with K
until either a matching element is found (successful search) or the list
is exhausted (unsuccessful search).
Worst case?
Search all elements, no match found.
Best case?
Match at the first comparison.
Average case?
Depends on the probability p that the search key is in the list; if a
match is equally likely at each position, about p(n + 1)/2 + (1 − p)·n
comparisons.
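The algorithm described above, as a runnable Python sketch (the function name is chosen here):

```python
def sequential_search(a, key):
    """Scan the list, comparing successive elements with key."""
    for i, x in enumerate(a):
        if x == key:
            return i      # successful search: matching element found
    return -1             # unsuccessful search: list exhausted
```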
Recurrence Relations
A recurrence relation expresses the generic term T(n) in terms of one
or more earlier terms of the function (with smaller index), together
with explicit values for the first term(s), the base cases.
Methods for Solving Recurrences
Common techniques, used below: backward substitution, recursion
trees, and the master theorem.
Example 1: Factorial
Def. n! = 1 · 2 · … · (n − 1) · n; by convention, 0! = 1.
Recursive definition of n!:
Factorial(n) {
if (n = 0)
fac ← 1
else
fac ← Factorial(n − 1) * n
return fac
}
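The same recursive definition in Python; the number of multiplications satisfies the recurrence M(n) = M(n − 1) + 1 with M(0) = 0, so M(n) = n:

```python
def factorial(n):
    """Recursive n!, mirroring the pseudocode above."""
    if n == 0:
        return 1                      # base case: 0! = 1
    return factorial(n - 1) * n       # T(n) = T(n-1) + O(1)
```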
Example 2: Tower of Hanoi
Goal. Move all disks to the third peg, using the second one as an
auxiliary if needed.
Constraints.
Only one disk can be moved at a time.
A larger disk cannot be on top of a smaller one at any time.
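The standard recursive solution, as a Python sketch (names chosen here). The move count satisfies T(n) = 2T(n − 1) + 1 with T(0) = 0, which solves to 2^n − 1:

```python
def hanoi(n, source, target, auxiliary, moves):
    """Move n disks from source to target, recording each move."""
    if n == 0:
        return
    hanoi(n - 1, source, auxiliary, target, moves)  # park n-1 disks on the spare peg
    moves.append((source, target))                  # move the largest disk
    hanoi(n - 1, auxiliary, target, source, moves)  # bring the n-1 disks back on top
```

Usage: `moves = []; hanoi(3, 1, 3, 2, moves)` yields 2³ − 1 = 7 moves.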
Master Theorem
[Recursion tree for T(n) = a·T(n/b) + f(n): the root costs f(n), its a
children cost f(n/b) each, the a² nodes at the next level cost f(n/b²)
each, and so on.]
Idea of Master Theorem
For T(n) = a·T(n/b) + Θ(n^d):
Case 1: if d > log_b(a), the root dominates and T(n) = Θ(n^d).
Case 2: if d = log_b(a), every level contributes equally and
T(n) = Θ(n^d lg n).
Case 3: if d < log_b(a), the leaves dominate and T(n) = Θ(n^(log_b a)).
Example 1.
T(n) = T(n/2) + n
T(1) = 1 ⇒ a = 1, b = 2, f(n) = n, d = 1.
Since d = 1 > log₂(1) = 0, case 1 applies: T(n) = Θ(n).
Example 2.
T(n) = 2T(n/2) + n
T(1) = 1 ⇒ a = 2, b = 2, f(n) = n, d = 1.
Since d = 1 = log₂(2), case 2 applies: T(n) = Θ(n lg n).
Example 3.
T(n) = 7T(n/4) + n
T(1) = 1 ⇒ a = 7, b = 4, f(n) = n, d = 1.
Since d = 1 < log₄(7) ≈ 1.4, case 3 applies: T(n) = Θ(n^(log₄ 7)).
Recursion as Mathematical Induction
Example. Claim: ∑_{i=1}^{n} i = n(n + 1)/2 for all n ≥ 1.
We need to prove:
∑_{i=1}^{1} i = 1(1 + 1)/2 ; ∑_{i=1}^{2} i = 2(2 + 1)/2 ;
∑_{i=1}^{3} i = 3(3 + 1)/2 ; … ; ∑_{i=1}^{n} i = n(n + 1)/2
Mathematical Induction
Basis step, prove: ∑_{i=1}^{1} i = 1(1 + 1)/2 ⇒ 1 = 1.
Inductive step. Assume ∑_{i=1}^{n} i = n(n + 1)/2; then
∑_{i=1}^{n+1} i = ∑_{i=1}^{n} i + (n + 1) = n(n + 1)/2 + (n + 1)
= (n + 1)(n + 2)/2 = (n + 1)((n + 1) + 1)/2.
Recursion Tree
Visualize a recurrence
T(n) = 2T(n/2) + n²
[Tree: the root T(n) costs n² and has two children, each a
subproblem T(n/2).]
[Expanding the tree level by level: the root contributes n², the next
level 2·(n/2)² = (1/2)n², the next (1/4)n², and so on, a decreasing
geometric series, so T(n) = Θ(n²).]
[Recursion tree for T(n) = T(n/3) + T(2n/3) + n: the root costs n; its
children cost n/3 and 2n/3, which again sum to n.]
Recursion Tree (cont.)
n n
n/3 2n/3 n
log3/2(n)
37