
INTRODUCTION TO ALGORITHMS

An algorithm is any well-defined computational procedure that takes zero or more values as input and produces one or more values as output. Thus an algorithm is a sequence of computational steps that transforms the input into the output.

Formal Definition & Properties of an Algorithm

An algorithm is a finite set of instructions that, if followed, accomplishes a particular task. In addition, all algorithms should satisfy the following criteria.
Properties of an Algorithm:
INPUT: Zero or more quantities are externally supplied.
OUTPUT: At least one quantity is produced.

Development of an Algorithm
Designing the algorithm
Validation of the algorithm (program proving / program verification)
Analysing the algorithm (performance analysis)
Testing the program (debugging and profiling)

Algorithm Specification
An algorithm can be described using:
Natural language, like English
Graphic representation, called a flow chart (works well for small and simple algorithms)
The pseudo-code method

Pseudo-code Conventions
To describe algorithms, here we use pseudo-code that resembles C. Each instruction can be described as follows.
1. Comments begin with // and continue until the end of the line.
2. Blocks are indicated with matching braces { and }, and statements are delimited by ; (semicolon).
3. An identifier begins with a letter. The data types of variables are not explicitly declared.
4. Assignment of values to variables is done using the assignment statement. General form: <variable> := <expression>
5. In order to produce true/false values, the logical and relational operators can be used.
6. Elements of a multidimensional array are accessed using [ and ]. Eg: if A is a 2-D array, the (i, j)th element is denoted A[i, j].

7. The following looping statements are employed: for, while, and repeat-until.
General forms:
for variable := value1 to value2 step step do
{
    Statement 1
    ...
    Statement n
}

while <condition> do
{
    Statement 1
    ...
    Statement n
}

repeat
    Statement 1
    ...
    Statement n
until <condition>

8. A conditional statement has one of the following forms:
if <condition> then <statement>
if <condition> then <statement1> else <statement2>
9. Input and output are done using the instructions read and write.
Eg: write("The result is", x);
10. An algorithm consists of a heading and a body. Form of the heading:
Algorithm Name(<parameter list>)

Algorithm to find the largest element in an array
1. Algorithm Max(A, n)
2. // A is an array of size n
3. {
4.     Result := A[1];
5.     for i := 2 to n do
6.         if A[i] > Result then
7.             Result := A[i];
8.     return Result;
9. }
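The pseudo-code translates almost line for line into a running program. A minimal Python sketch (the function name find_max and the 0-based indexing are adaptations, since the pseudo-code indexes arrays from 1):

```python
def find_max(a):
    """Return the largest element of a non-empty list, mirroring
    Algorithm Max: start with the first element, then scan the rest."""
    result = a[0]          # Result := A[1]
    for x in a[1:]:        # for i := 2 to n do
        if x > result:     # if A[i] > Result then
            result = x     #     Result := A[i]
    return result          # return Result
```

The loop performs n-1 comparisons for an array of n elements, which is the source of the algorithm's linear running time discussed later.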

Recursive Algorithm
An algorithm is said to be recursive if the same algorithm is invoked in its body. Recursive algorithms can be either:
Direct recursive: an algorithm that calls itself.
Indirect recursive: algorithm A is said to be indirect recursive if it calls another algorithm which in turn calls A.

The Towers of Hanoi problem shows how to develop a recursive algorithm.
Problem:
Three vertical towers labeled A, B, C are fixed erect on a platform.
On tower A there are n (here n = 3) disks, stacked in decreasing order of size.
The problem is to move all n disks from tower A to tower B, maintaining the same order of size.
Disks can be moved only one at a time, and at no time may a disk rest on top of a smaller disk.

For n = 3 the solution takes seven moves: A→B, A→C, B→C, A→B, C→A, C→B, A→B.

1. Algorithm TowersofHanoi(n, x, y, z)
2. // Move the top n disks from tower x to tower y, using tower z.
3. {
4.     if (n ≥ 1) then
5.     {
6.         TowersofHanoi(n-1, x, z, y);
7.         write("move top disk from tower", x, "to top of tower", y);
8.         TowersofHanoi(n-1, z, y, x);
9.     }
10. }

[Recursion tree: each call TowersofHanoi(n, x, y, z) expands into TowersofHanoi(n-1, x, z, y) followed by TowersofHanoi(n-1, z, y, x), and each of those expands in the same way until n < 1.]
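The recursive algorithm can be sketched in Python to trace the moves it generates; here the write statement is replaced by appending (source, destination) pairs to a list (the moves parameter is my own addition for testability):

```python
def towers_of_hanoi(n, x, y, z, moves):
    """Move the top n disks from tower x to tower y, using z as spare.
    Each move is recorded as a (source, destination) pair."""
    if n >= 1:
        towers_of_hanoi(n - 1, x, z, y, moves)  # clear the top n-1 disks onto z
        moves.append((x, y))                    # move the largest remaining disk
        towers_of_hanoi(n - 1, z, y, x, moves)  # stack the n-1 disks back on top

moves = []
towers_of_hanoi(3, 'A', 'B', 'C', moves)
# For n disks the algorithm makes 2^n - 1 moves; n = 3 gives 7 moves.
```

Running it with n = 3 reproduces the seven-move sequence A→B, A→C, B→C, A→B, C→A, C→B, A→B.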

PERFORMANCE ANALYSIS
Space Complexity: The space complexity of an algorithm is the amount of memory it needs to run to completion.
Time Complexity: The time complexity of an algorithm is the amount of computer time it needs to run to completion.

Space Complexity:
The space needed by an algorithm is the sum of the following components:
1. A fixed part that is independent of the characteristics (eg: number, size) of the inputs and outputs (the instance characteristics). This part typically includes:
the instruction space (ie. space for the code)
space for simple variables
fixed-size component variables (also called aggregates)
space for constants, and so on.

2. A variable part that consists of the space needed by component variables whose size depends on the particular problem instance being solved, the space needed by referenced variables, and the recursion stack space.
The space requirement S(P) of any algorithm P may therefore be written as
S(P) = c + Sp(instance characteristics)
where c is a constant.

Example-1:
1. Algorithm abc(a, b, c)
2. {
3.     return a + b + b*c + (a+b-c)/(a+b) + 4.0;
4. }
The space needed by this algorithm is independent of the instance characteristics, so Sabc = 0, and one word is adequate to store each of the values a, b, and c.
S(abc) = c + Sabc

Example-2:
1. Algorithm Sum(a, n)
2. {
3.     s := 0.0;
4.     for i := 1 to n do
5.         s := s + a[i];
6.     return s;
7. }
The problem instances for this algorithm are characterized by n, the number of elements to be summed. The space needed by n is one word, since it is of type integer. The space needed by a is the space needed by variables of type array of floating-point numbers; this is at least n words, since a must be large enough to hold the n elements to be summed.
So we obtain Ssum(n) ≥ (3 + n): n words for a[], and one word each for n, i, and s.

Example-3 (Recursive Algorithm):
1. Algorithm Rsum(a, n)
2. {
3.     if (n ≤ 0) then return 0.0;
4.     else return Rsum(a, n-1) + a[n];
5. }
The recursion stack space includes space for the formal parameters, the local variables, and the return address. Each call of Rsum requires at least 3 words. Since the depth of recursion is n + 1, the recursion stack space needed is ≥ 3(n + 1).
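The depth claim can be checked empirically. A small sketch (the instrumentation and the depth_tracker parameter are my own, not part of the text): Rsum is augmented to record the deepest level of call nesting reached, which should be n + 1 for the calls with arguments n, n-1, ..., 1, 0.

```python
def rsum_depth(a, n, depth_tracker, depth=1):
    """Rsum instrumented to record the maximum recursion depth.
    depth_tracker is a one-element list holding the max depth seen."""
    depth_tracker[0] = max(depth_tracker[0], depth)
    if n <= 0:
        return 0.0
    # a[n] in the 1-based pseudo-code corresponds to a[n-1] here
    return rsum_depth(a, n - 1, depth_tracker, depth + 1) + a[n - 1]

tracker = [0]
total = rsum_depth([1.0, 2.0, 3.0, 4.0, 5.0], 5, tracker)
# tracker[0] is now n + 1 = 6: one stack frame per level, each holding
# the parameters, locals, and return address counted in the 3(n+1) bound.
```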

Time Complexity:
Two types of time are associated with the execution of every program:
compile time
run (or execution) time
The compile time does not depend on the instance characteristics. The run time of a program does depend on the instance characteristics; this run time is denoted by tP. The time T(P) taken by a program P is the sum of the compile time and the run time.
Importance is given to the run-time part only; the reason is that a compiled program will be run several times without recompilation.
The execution time can be calculated by counting the number of program steps. A program step is defined as a syntactically and semantically meaningful segment of a program that has an execution time independent of the instance characteristics. The number of steps assigned to any program statement depends on the kind of statement.

For example:
Comment lines: 0 steps
Assignment statements: 1 step
For iterative statements such as for, while, and repeat-until, only the control part of the statement is counted; the step count of the control part is one.

Determining the number of steps
There are two ways to determine the number of steps needed by a program.
1. Introduce a variable, count, into the program with initial value 0. Each time a statement in the original program is executed, count is incremented by the step count of that statement.

Find the time complexity of algorithm Sum. For this, a count variable is introduced:
1. Algorithm Sum(a, n)
2. {
3.     s := 0.0;
4.     count := count + 1;   // for the assignment
5.     for i := 1 to n do
6.     {
7.         count := count + 1;   // for the for statement
8.         s := s + a[i];
9.         count := count + 1;   // for the assignment
10.    }
11.    count := count + 1;   // for the last (false case) test of the for loop
12.    count := count + 1;   // for the return statement
13.    return s;
14. }
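The same instrumentation can be run directly. A Python sketch (function name and 0-based indexing are adaptations) that returns both the sum and the accumulated step count:

```python
def sum_with_count(a, n):
    """Algorithm Sum instrumented with a step counter, following
    method 1: count grows by the step count of each executed statement."""
    count = 0
    s = 0.0
    count += 1            # for the assignment s := 0.0
    for i in range(1, n + 1):
        count += 1        # for each (true) test of the for loop
        s = s + a[i - 1]
        count += 1        # for the assignment inside the loop
    count += 1            # for the last (false) test of the for loop
    count += 1            # for the return statement
    return s, count

s, count = sum_with_count([2.0, 4.0, 6.0], 3)
# count = 2n + 3; for n = 3 that is 9 steps.
```

Each loop iteration contributes 2 steps and the statements outside the loop contribute 3, giving the 2n + 3 total derived in the table method below.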

Find the time complexity of the recursive algorithm Rsum. Again a count variable is introduced:
1. Algorithm Rsum(a, n)
2. {
3.     count := count + 1;   // for the if condition
4.     if (n ≤ 0) then
5.     {
6.         count := count + 1;   // for the return statement
7.         return 0.0;
8.     }
9.     else
10.    {
11.        count := count + 1;   // for the addition, function invocation and return
12.        return Rsum(a, n-1) + a[n];
13.    }
14. }

When analyzing a recursive program for its step count, one obtains a recursive formula for the step count:
tRsum(n) = 2                  if n = 0
tRsum(n) = 2 + tRsum(n-1)     if n > 0
Such recursive formulas are referred to as recurrence relations.
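A sketch instrumenting Rsum in Python (my own adaptation, with indexing shifted to 0-based) confirms that unrolling this recurrence gives the closed form tRsum(n) = 2n + 2:

```python
def rsum_with_count(a, n):
    """Rsum instrumented with a step counter; returns (sum, steps).
    The steps satisfy t(0) = 2 and t(n) = 2 + t(n-1) for n > 0."""
    count = 1                      # for the if condition
    if n <= 0:
        count += 1                 # for the return statement
        return 0.0, count
    sub_sum, sub_count = rsum_with_count(a, n - 1)
    count += 1                     # for the addition, invocation and return
    return sub_sum + a[n - 1], count + sub_count
```

For every n the measured step count equals 2n + 2, the value obtained by expanding the recurrence n times down to the base case.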

2. The second method to determine the step count of an algorithm is to build a table in which we list the total number of steps contributed by each statement.
First determine the number of steps per execution (s/e) of the statement and the total number of times (ie., the frequency) each statement is executed.
By combining these two quantities, the total contribution of each statement, and hence the step count for the entire algorithm, is obtained.

Statement                       s/e   Frequency   Total
1. Algorithm Sum(a, n)           0        -          0
2. {                             0        -          0
3.     s := 0.0;                 1        1          1
4.     for i := 1 to n do        1       n+1        n+1
5.         s := s + a[i];        1        n          n
6.     return s;                 1        1          1
7. }                             0        -          0
Total                                              2n+3

Statement                        s/e    Frequency      Total
                                        n=0    n>0     n=0    n>0
1. Algorithm Rsum(a, n)           0      -      -       0      0
2. {                              0      -      -       0      0
3.     if (n ≤ 0) then            1      1      1       1      1
4.         return 0.0;            1      1      0       1      0
5.     else return
          Rsum(a, n-1) + a[n];   1+x     0      1       0     1+x
6. }                              0      -      -       0      0
Total (x = tRsum(n-1))                                  2     2+x

Asymptotic analysis:
Expressing the complexity in terms of its relationship to known functions is called asymptotic analysis. Asymptotic notations are used in analyzing algorithms.

Asymptotic notation:
1. Big oh: The function f(n) = O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
(Asymptotic upper bound.)
Eg1: f(n) = 3n+2
3n+2 ≤ 4·n for all n ≥ 2
so 3n+2 = O(n)
Eg2: f(n) = 10n² + 4n + 2
10n² + 4n + 2 ≤ 11·n² for all n ≥ 5
so 10n² + 4n + 2 = O(n²)
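The witnesses (c, n0) in these examples can be spot-checked numerically. A small sketch (holds_upper is a hypothetical helper of my own; a finite scan is evidence for the inequality, not a proof):

```python
def holds_upper(f, g, c, n0, limit=10000):
    """Check f(n) <= c*g(n) for every n in [n0, limit)."""
    return all(f(n) <= c * g(n) for n in range(n0, limit))

# Eg1: 3n+2 = O(n) with witnesses c = 4, n0 = 2
eg1 = holds_upper(lambda n: 3*n + 2, lambda n: n, c=4, n0=2)

# Eg2: 10n^2+4n+2 = O(n^2) with witnesses c = 11, n0 = 5
eg2 = holds_upper(lambda n: 10*n*n + 4*n + 2, lambda n: n*n, c=11, n0=5)
```

Note that c = 3 would not work for Eg1: 3n+2 ≤ 3n holds for no n, which is why the definition allows any constant c strictly larger than the leading coefficient.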

Notation     Computing Time
O(1)         Constant
O(n)         Linear
O(n²)        Quadratic
O(n³)        Cubic
O(2ⁿ)        Exponential
O(log n)     Logarithmic

Asymptotic notation:
2. Omega: The function f(n) = Ω(g(n)) iff there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
(Asymptotic lower bound.)
Eg1: f(n) = 3n+2
3n+2 ≥ 3·n for all n ≥ 1
so 3n+2 = Ω(n)
Eg2: f(n) = 10n² + 4n + 2
10n² + 4n + 2 ≥ n² for all n ≥ 1
so 10n² + 4n + 2 = Ω(n²)

Asymptotic notation:
3. Theta: The function f(n) = Θ(g(n)) iff there exist positive constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
(Asymptotic tight bound.)
Eg1: f(n) = 3n+2
3n+2 ≤ 4n for all n ≥ 2, and 3n+2 ≥ 3n for all n ≥ 2
so 3n ≤ 3n+2 ≤ 4n, and 3n+2 = Θ(n)
Eg2: f(n) = 10n² + 4n + 2
10n² + 4n + 2 ≥ n² for all n ≥ 5, and 10n² + 4n + 2 ≤ 11n² for all n ≥ 5
so n² ≤ 10n² + 4n + 2 ≤ 11n², and 10n² + 4n + 2 = Θ(n²)
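For Theta, both bounds must hold simultaneously beyond n0; this two-sided inequality can also be spot-checked over a finite range (again evidence, not a proof):

```python
# Eg1: 3n <= 3n+2 <= 4n for all n >= 2, so 3n+2 = Theta(n)
theta_n = all(3*n <= 3*n + 2 <= 4*n for n in range(2, 10000))

# Eg2: n^2 <= 10n^2+4n+2 <= 11n^2 for all n >= 5, so it is Theta(n^2)
theta_n2 = all(n*n <= 10*n*n + 4*n + 2 <= 11*n*n for n in range(5, 10000))
```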

O-notation --- less than or equal to (≤)
Ω-notation --- greater than or equal to (≥)
Θ-notation --- equal to (=)
Asymptotic notations are used to formalize that an algorithm has running time or storage requirements that are ``at most'', ``at least'', or ``exactly'' some amount.

Little oh: The function f(n) = o(g(n)) iff lim(n→∞) f(n)/g(n) = 0.
Eg: 3n+2 = o(n²)
Little omega: The function f(n) = ω(g(n)) iff lim(n→∞) g(n)/f(n) = 0.

Recurrence Relation
When an algorithm contains a recursive call to itself, its running time can often be described by a recurrence. A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs.
For example, the worst-case running time T(n) of merge sort is described by the recurrence
T(n) = Θ(1)                if n = 1
T(n) = 2T(n/2) + Θ(n)      if n > 1

Iterative method for solving recurrence relations
The basic idea is to expand the recurrence and express it as a summation of terms dependent only on n and the initial condition.
Solve the following recurrences using the iterative method:
1. T(n) = 3T(n/4) + n        ...(1)
Initial condition: T(1) = Θ(1)
Expand the term T(n/4) using eqn (1):
T(n/4) = 3T(n/4²) + n/4; substituting in eqn (1),
T(n) = 3{3T(n/4²) + n/4} + n = 3²T(n/4²) + 3n/4 + n
Expand T(n/4²) = 3T(n/4³) + n/4²:
T(n) = 3²{3T(n/4³) + n/4²} + 3n/4 + n = 3³T(n/4³) + 3²n/4² + 3n/4 + n
Continuing in this way,
T(n) = n + 3n/4 + 3²n/4² + ... + 3ⁱT(n/4ⁱ)
     = n + 3n/4 + 3²n/4² + ... + 3^(log₄ n)·T(1)
The summation is a geometric series with ratio 3/4, bounded above by 4n, and the last term is 3^(log₄ n)·T(1) = n^(log₄ 3)·Θ(1) = o(n), so T(n) = O(n).
2. T(n) = 2T(n/2) + 3n²
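The first recurrence can also be evaluated directly; a sketch (taking T(1) = 1 as a concrete stand-in for the Θ(1) initial condition) for n a power of 4, where unrolling the expansion all the way down gives the exact value T(4^k) = 4·4^k − 3^(k+1), confirming that T(n) grows linearly:

```python
def T(n):
    """Evaluate T(n) = 3*T(n/4) + n with T(1) = 1, for n a power of 4."""
    if n == 1:
        return 1
    return 3 * T(n // 4) + n

for k in range(1, 10):
    n = 4 ** k
    assert T(n) == 4 * n - 3 ** (k + 1)   # closed form from the expansion
    assert n <= T(n) <= 4 * n             # hence T(n) is Theta(n)
```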
