
ANALYSIS OF

ALGORITHMS
Introducing Some Basic Concepts About Algorithms

Outline
Efficiency of Algorithms
Examples
Order of Growth
Worst, Average and Best Case Analysis

Efficiency of Algorithms
Two fundamental questions:
How fast will my program run? (Time Complexity)
How much memory will it take? (Space Complexity)

Time Complexity can be measured by the number of operations performed by the algorithm
We will assume that all operations take the same amount of time.

Examples
Fahrenheit to Celsius Converter (code)
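The slide's code itself is not included in this extract; a minimal Java sketch of such a converter (class and method names are illustrative, not from the slide) could be:

public class FahrenheitToCelsius {
    // Celsius = (Fahrenheit - 32) * 5 / 9: one subtraction,
    // one multiplication, and one division
    public static double toCelsius(double fahrenheit) {
        return (fahrenheit - 32) * 5.0 / 9.0;
    }

    public static void main(String[] args) {
        System.out.println(toCelsius(212.0)); // prints 100.0
    }
}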

Examples
Fahrenheit to Celsius Converter (analysis)
Three arithmetic operations:
One subtraction, one multiplication, and one division
Assuming all operations take equal time, we can generalize and say we have three operations

Examples
Adding Array Elements (code)
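The code is not reproduced in this extract; a minimal Java sketch matching the statements analyzed on the next slide (the method name is illustrative) could be:

public static int sumElements(int[] list) {
    int sum = 0;                            // executed once
    for (int i = 0; i < list.length; i++) { // init once; test and increment roughly N times each
        sum = sum + list[i];                // executed N times
    }
    return sum;
}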

Examples
Adding Array Elements (analysis)
For an array of size N, how many operations are roughly executed?
Statement              Frequency of Execution
int sum = 0            1
int i = 0              1
i < list.length        N
i++                    N
sum = sum + list[i]    N
Total                  3N + 2

The total number of operations clearly depends on the input size
We will not obsess about the small details (constant factors)

Examples
Adding Square Matrices (code)
For an N × N matrix, how many operations are roughly executed?
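The code is again not reproduced here; a minimal Java sketch consistent with the statements in the table that follows (names illustrative; assumes a and b are both N × N) could be:

public static double[][] addMatrices(double[][] a, double[][] b) {
    int n = a.length;
    double[][] sum = new double[n][n];     // executed once
    for (int i = 0; i < n; i++) {          // outer loop runs N times
        for (int j = 0; j < n; j++) {      // inner loop runs N times per outer iteration
            sum[i][j] = a[i][j] + b[i][j]; // executed N² times in total
        }
    }
    return sum;
}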

Examples
Adding Square Matrices (analysis)
Statement                        Frequency of Execution
double[][] sum...                1
int i = 0                        1
i < list.length                  N
i++                              N
int j = 0                        N
j < list.length                  N²
j++                              N²
sum[i][j] = a[i][j] + b[i][j]    N²
Total                            3N² + 3N + 2

Order of Growth
Effect of Problem Size
Only large problem sizes are worth studying!
Modern computers are very fast.
Operations on an input size of a few hundred will usually complete in under a minute

Our analysis of algorithms focuses on how an algorithm scales with problem size.

Order of Growth
Growth Rate of Functions
Consider: f(n) = n² + 100n + log(n) + 1000
Let us consider how this function behaves for small and large values of n
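A quick numerical check (logarithm taken base 2; the exact base does not matter for the comparison):

n            n²          f(n)                 n² as a share of f(n)
10           100         ≈ 2,103              ≈ 5%
100          10,000      ≈ 21,007             ≈ 48%
10,000       10⁸         ≈ 1.01 × 10⁸         ≈ 99%
1,000,000    10¹²        ≈ 1.0001 × 10¹²      ≈ 99.99%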

Order of Growth
Growth Rate of Functions (cont'd)
For the previous function, f(n) = n² + 100n + log(n) + 1000, we can see:
For large n, f(n) ≈ n²
This enables us to use the approximation f(n) ≈ n², for a sufficiently large n
This simplification worked despite the large coefficient in front of n
Generally, for most functions, lower order terms become less relevant as the value of n gets larger
This "asymptotic" simplification is very useful for analyzing algorithms
Remember that only large problems are worth studying

Order of Growth
Big-O Notation
Instead of saying an algorithm takes 3n² + 4n + 2 steps, it's simpler to leave out lower order terms such as 4n and 2, and even the coefficient 3, and say the growth rate is O(n²) (pronounced "big-oh of n squared")
Big-O is the most widely used asymptotic notation in
algorithm analysis
Will be used throughout the course

Order of Growth
Big-O Notation - Common Sense Approach
Big-O enables us to specify how an algorithm's performance scales with input size
We will discard lower order terms since they are dominated:
nᵃ dominates nᵇ if a > b: for example, n³ dominates n²
any exponential dominates any polynomial: 3ⁿ dominates n⁵
any polynomial dominates any logarithm: n dominates (log(n))³
Multiplicative constants can be omitted: 14n² becomes n²
Constant terms can be important (especially if they are large).
But: we cannot easily study algorithms without the simplicity afforded by big-O notation.
Constant terms depend heavily on the hardware architecture, so it's hard to come up with an accurate number

Order of Growth
Big-O Notation - Formal Definition
Definition: Let T(N) and f(N) be functions from positive integers to positive reals. We say T(N) = O(f(N)) (which means T(N) grows no faster than f(N)) if there are positive constants c and n₀ such that T(N) ≤ c·f(N) when N ≥ n₀.
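For instance, T(N) = 3N² + 4N + 2 is O(N²): taking c = 9 and n₀ = 1 gives 3N² + 4N + 2 ≤ 3N² + 4N² + 2N² = 9N² for all N ≥ 1.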

Order of Growth
Growth Rate Comparisons
If we assume a given operation takes one nanosecond (10⁻⁹ seconds) on some computer, we can construct the following table
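Under this assumption the entries work out roughly as follows (illustrative values computed from the 1 ns figure; the slide's own table may differ):

Growth rate    n = 1,000             n = 1,000,000
log₂(n)        ≈ 10 ns               ≈ 20 ns
n              1 μs                  1 ms
n log₂(n)      ≈ 10 μs               ≈ 20 ms
n²             1 ms                  ≈ 17 minutes
n³             1 second              ≈ 32 years
2ⁿ             astronomically long   astronomically long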

Order of Growth
What do we mean by a fast algorithm?
We cannot simply say a fast algorithm is one that runs quickly when given a real input instance
This definition depends on the speed of the computer: algorithms run fast on some machines and slowly on other machines
Different algorithms behave differently on different input sizes
Definition
An algorithm is fast if its growth rate is O(n) or less.

A somewhat standard definition we will follow

Worst, Average and Best Case


Sequential Search (code)
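The code is not included in this extract; a minimal Java sketch of sequential search (names illustrative) could be:

// Returns the index of key in list, or -1 if it is not found.
public static int sequentialSearch(int[] list, int key) {
    for (int i = 0; i < list.length; i++) {
        if (list[i] == key) {
            return i;  // best case: key sits at the first position
        }
    }
    return -1;         // worst case: every element was examined
}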

Worst, Average and Best Case


Sequential Search (analysis)
Our analysis depends on the input
Best Case: O(1)
If the item being searched for is found at the first position
This happens only once in a while
Worst Case: O(N)
If the item being searched for is found at the end of the list, or NOT found at all
This is the upper bound
Average Case
Hard to determine analytically unless we know the type of input we are processing
Reflects the actual situation

Worst, Average and Best Case


Binary Search (what is it?)
A search algorithm that works only on sorted lists.
Similar to how you would search for a word in the
dictionary
A very widely used search algorithm

Worst, Average and Best Case


Binary Search (the big idea)

[Image taken from: Sedgewick and Wayne, Algorithms, Fourth ed.]

Worst, Average and Best Case


Binary Search (code)
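The code is not included in this extract; a minimal Java sketch of iterative binary search on a sorted array (names illustrative) could be:

// Returns the index of key in the sorted array list, or -1 if not found.
public static int binarySearch(int[] list, int key) {
    int lo = 0;
    int hi = list.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;           // midpoint, written to avoid overflow
        if      (key < list[mid]) hi = mid - 1; // discard the right half
        else if (key > list[mid]) lo = mid + 1; // discard the left half
        else    return mid;                     // found
    }
    return -1;
}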

Worst, Average and Best Case


Binary Search (analysis)
Our analysis depends on the input
Best Case: O(1)
If the item being searched for is found exactly at the middle
This happens only once in a while
Worst Case: O(log₂(n)) - we will discuss this shortly
If the item being searched for is only found once a single element remains, or is NOT found at all
This is the upper bound
Average Case
Hard to determine analytically unless we know the type of input we are processing
Reflects the actual situation

Worst, Average and Best Case


Binary Search (worst case analysis)
The meaning of log₂(n) = m: if you divide n by 2 over and over, m times, you get 1. (For example, log₂(16) = 4: 16 → 8 → 4 → 2 → 1.)

Worst, Average and Best Case


Binary Search (worst case analysis) - cont'd
In binary search, we keep on reducing the search domain by a factor of two
If the initial input has N elements:
We have N/2 items to be searched in the second iteration, N/2² in the third iteration, N/2³ in the fourth iteration, ...
In the worst case, the search goes on until one element is left in the list on, say, the Mth iteration
Based on our previous observation of the logarithm, for an input size of N and a total of M iterations (after which only one item is left), we have the relationship N/2^M = 1, which gives M = log₂(N); hence the worst case is O(log₂(N)).

Worst, Average and Best Case


Worst, Average, and Best Case
Best Case
Considers cases that are unlikely. Not used much.
Indicates the lower bound on the growth.

Average Case: hard to evaluate, but very useful
Worst Case:
Very useful since it tells us the upper bound on the growth rate
Much easier to calculate than the average case - will be used very often.
