
CONJUGATE GRADIENT METHOD

SUBMITTED TO:
Dr. Rajeev Kumar Dohare
(Assistant Professor)

SUBMITTED BY:
Nikhil Kumar (2014uch1534)
Gaurav Gehlot (2014uch1546)
Komal Maheshwari (2014uch1556)
Reshma S Raju (2014uch1557)

DEPARTMENT OF CHEMICAL ENGINEERING
MALAVIYA NATIONAL INSTITUTE OF TECHNOLOGY JAIPUR
2016-2017
CONJUGATE GRADIENT METHOD

This method was invented by Hestenes and Stiefel around 1951. It is also known as
the Fletcher-Reeves method.

The conjugate gradient method is an algorithm for the numerical solution of
particular systems of linear equations, namely those whose matrix is symmetric
and positive-definite.

The conjugate gradient method involves the use of the gradient of the function.
This method does not repeat any previous search direction and converges in at
most N iterations for a quadratic function of N variables.
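As an illustration of this idea, here is a minimal sketch (not taken from the slides) of the conjugate gradient iteration applied to a small symmetric positive-definite linear system; the matrix A and vector b are made-up example data.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x              # residual = negative gradient of the quadratic
    s = r.copy()               # first search direction
    max_iter = max_iter or n   # converges in at most n steps in exact arithmetic
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        As = A @ s
        alpha = rr / (s @ As)  # exact step length along s
        x = x + alpha * s
        r = r - alpha * As
        beta = (r @ r) / rr    # ratio of squared residual norms
        s = r + beta * s       # new conjugate direction
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)   # → approximately [0.0909, 0.6364]
```

For this 2x2 system the exact solution is (1/11, 7/11), reached in two iterations, consistent with the N-iteration property stated above.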

2/4/17 2
We have seen that Powell's conjugate direction method requires
n single-variable minimizations per iteration and sets up a new
conjugate direction at the end of each iteration.

Thus, it requires in general n² single-variable minimizations to
find the minimum of a quadratic function.

On the other hand, if we can evaluate the gradients of the objective
function, we can set up a new conjugate direction after every
one-dimensional minimization, and hence we can achieve faster
convergence.

ITERATIVE PROCEDURE
1. Start with an arbitrary initial point X1.

2. Set the first search direction S1 = −∇f(X1) = −∇f1.

3. Find the point X2 according to the relation

   X2 = X1 + λ1* S1

where λ1* is the optimal step length in the direction S1. Set i = 2 and go to the
next step.

4. Find ∇fi = ∇f(Xi), and set

   Si = −∇fi + (‖∇fi‖² / ‖∇fi−1‖²) Si−1

5. Compute the optimum step length λi* in the direction Si, and find the new point

   Xi+1 = Xi + λi* Si

6. Test for the optimality of the point Xi+1. If Xi+1 is optimum, stop the
process. Otherwise set i = i + 1 and go to step 4.
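The six steps above can be sketched in Python as follows. The quadratic test function and its gradient here are hypothetical illustrations (not the slides' example), and SciPy's `minimize_scalar` stands in for the exact one-dimensional minimization of each step.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fletcher_reeves(f, grad, x1, tol=1e-8, max_iter=100):
    """Minimize f by the Fletcher-Reeves conjugate gradient procedure."""
    x = np.asarray(x1, dtype=float)         # step 1: arbitrary initial point X1
    g = grad(x)
    s = -g                                  # step 2: S1 = -grad f(X1)
    for _ in range(max_iter):
        # steps 3 and 5: one-dimensional minimization for the step length
        lam = minimize_scalar(lambda t: f(x + t * s)).x
        x = x + lam * s
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:     # step 6: optimality test
            break
        beta = (g_new @ g_new) / (g @ g)    # step 4: ratio of squared gradient norms
        s = -g_new + beta * s               # new conjugate direction
        g = g_new
    return x

# Hypothetical quadratic test function (not the slides' example):
f = lambda x: x[0]**2 + 2 * x[1]**2 + x[0] * x[1] - x[0]
grad = lambda x: np.array([2 * x[0] + x[1] - 1, 4 * x[1] + x[0]])
x_star = fletcher_reeves(f, grad, [0.0, 0.0])  # → approximately [0.5714, -0.1429]
```

Because the test function is quadratic in two variables, the procedure reaches the minimum (4/7, −1/7) in two iterations, matching the convergence property stated earlier.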
SOLVED EXAMPLE
Minimize

starting from the point

Solution:
Iteration 1

The search direction is taken as:

To find the optimal step length λ1* along S1, we minimize f(X1 + λ1S1)
with respect to λ1. Here

Therefore

Iteration 2: Since

the equation

   Si = −∇fi + (‖∇fi‖² / ‖∇fi−1‖²) Si−1

gives the next search direction as

   S2 = −∇f2 + (‖∇f2‖² / ‖∇f1‖²) S1

where

Therefore
EXAMPLE
To find λ2*, we minimize f(X2 + λ2S2)

with respect to λ2. As df/dλ2 = 8λ2 − 2 = 0 at λ2* = 1/4, we obtain:

Thus the optimum point is reached in two iterations. Even if we do not know this point to
be optimum, we will not be able to move from this point in the next iteration. This can be
verified as follows:

Iteration 3:
Now

Thus,

This shows that there is no search direction to reduce f further, and hence X3 is optimum.

REFERENCE

Rao, S. S., Engineering Optimization: Theory and Practice, 3rd ed., Wiley, p. 386.

