Gaurav Gehlot (2014uch1546)
Komal Maheshwari (2014uch1556)
Reshma S Raju (2014uch1557)
DEPARTMENT OF CHEMICAL ENGINEERING
MALAVIYA NATIONAL INSTITUTE OF TECHNOLOGY, JAIPUR
2016-2017
CONJUGATE GRADIENT METHOD
This method was developed by Hestenes and Stiefel around 1951; it is also known as the Fletcher-Reeves method.
The conjugate gradient method makes use of the gradient of the function.
It does not repeat any previous search direction and converges in at most n iterations for an n-variable quadratic function.
2/4/17 2
CONJUGATE GRADIENT METHOD
We have seen that Powell's conjugate direction method requires
n single-variable minimizations per iteration and sets up a new
conjugate direction at the end of each iteration.
ITERATIVE PROCEDURE
1. Start with an arbitrary initial point X_1.
2. Set the first search direction S_1 = -\nabla f(X_1) = -\nabla f_1.
3. Find the point X_2 according to X_2 = X_1 + \lambda_1^* S_1,
where \lambda_1^* is the optimal step length in the direction S_1. Set i = 2 and go to the
next step.
4. Find \nabla f_i = \nabla f(X_i), and set
S_i = -\nabla f_i + \frac{|\nabla f_i|^2}{|\nabla f_{i-1}|^2} S_{i-1}
5. Compute the optimum step length \lambda_i^* in the direction S_i, and find the new point
X_{i+1} = X_i + \lambda_i^* S_i
6. Test for the optimality of the point X_{i+1}. If X_{i+1} is optimum, stop the
process. Otherwise set i = i + 1 and go to step 4.
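The six steps above can be sketched in code. The following Python is a minimal sketch of the Fletcher-Reeves iteration, with the one-dimensional step-length minimization done by a crude golden-section search; the tolerances, search bracket, and test function are illustrative assumptions, not part of the slides:

```python
import numpy as np

def golden_section(f, x, s, lo=0.0, hi=10.0, iters=100):
    # Crude 1-D minimization of f(x + lam*s) over lam in [lo, hi]
    # (assumed bracket; the slides use analytic step lengths instead).
    phi = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(x + c * s) < f(x + d * s):
            b = d
        else:
            a = c
    return (a + b) / 2

def fletcher_reeves(f, grad, x1, tol=1e-8, max_iter=100):
    x = np.asarray(x1, dtype=float)
    g = grad(x)                   # step 1-2: start point, grad f1
    s = -g                        # step 2: S1 = -grad f1
    for _ in range(max_iter):
        lam = golden_section(f, x, s)    # steps 3/5: optimal step length
        x = x + lam * s
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:  # step 6: optimality test
            break
        beta = (g_new @ g_new) / (g @ g) # step 4: |grad f_i|^2 / |grad f_{i-1}|^2
        s = -g_new + beta * s
        g = g_new
    return x
```

For a quadratic objective this sketch reaches the optimum in a few iterations, the later ones only cleaning up line-search round-off.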
SOLVED EXAMPLE
Minimize
Solution:
Iteration 1
SOLVED EXAMPLE
To find the optimal step length \lambda_1^* along S_1, we minimize f(X_1 + \lambda_1 S_1)
with respect to \lambda_1. Here
Therefore
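For a quadratic objective, minimizing f(X_1 + \lambda_1 S_1) over \lambda_1 can be done in closed form. A small sketch under the assumption f(X) = ½ XᵀAX + bᵀX (the matrix A and vector b below are illustrative, not the slides' example): setting df(X + λS)/dλ = 0 gives λ* = -(∇f(X)ᵀS)/(SᵀAS).

```python
import numpy as np

def exact_step_length(grad_x, s, A):
    """Closed-form optimal step length along s for a quadratic
    f(X) = 1/2 X^T A X + b^T X:  lambda* = -(grad f(X) . S) / (S^T A S)."""
    return -(grad_x @ s) / (s @ A @ s)
```

After stepping with λ*, the new gradient is orthogonal to the search direction just used, which is exactly what "optimal step length" means here.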
SOLVED EXAMPLE
Iteration 2: Since
the equation
S_i = -\nabla f_i + \frac{|\nabla f_i|^2}{|\nabla f_{i-1}|^2} S_{i-1}
Therefore
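The direction update applied in this iteration can be written out directly. A minimal sketch of the Fletcher-Reeves update S_i = -∇f_i + (|∇f_i|²/|∇f_{i-1}|²) S_{i-1}; the gradient values in the usage below are made-up placeholders, not the example's actual numbers:

```python
import numpy as np

def fletcher_reeves_direction(g_i, g_prev, s_prev):
    """S_i = -grad f_i + (|grad f_i|^2 / |grad f_{i-1}|^2) * S_{i-1}."""
    beta = (g_i @ g_i) / (g_prev @ g_prev)
    return -g_i + beta * s_prev
```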
SOLVED EXAMPLE
To find \lambda_2^*, we minimize
Thus the optimum point is reached in two iterations. Even if we do not know this point to
be optimum, we will not be able to move from this point in the next iteration. This can be
verified as follows:
SOLVED EXAMPLE
Iteration 3:
Now
Thus,
This shows that there is no search direction to reduce f further, and hence X3 is optimum.
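The claim that a two-variable quadratic is minimized in exactly two conjugate-gradient iterations can be checked numerically. The quadratic below, f(X) = ½XᵀAX + bᵀX with an assumed A and b (chosen for illustration, not taken from the slides), is driven to a zero gradient in two exact-line-search steps:

```python
import numpy as np

# Quadratic f(X) = 1/2 X^T A X + b^T X; its gradient is A @ X + b.
A = np.array([[4.0, 2.0], [2.0, 2.0]])   # assumed positive-definite Hessian
b = np.array([1.0, -1.0])                # assumed linear term

x = np.zeros(2)          # X1 = (0, 0)
g = A @ x + b            # grad f1
s = -g                   # S1 = -grad f1
iterations = 0
while np.linalg.norm(g) > 1e-10:
    lam = -(g @ s) / (s @ A @ s)         # exact step length along s
    x = x + lam * s
    g_new = A @ x + b
    beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves ratio
    s = -g_new + beta * s
    g = g_new
    iterations += 1
```

On the last pass the new gradient vanishes, so β and the next direction come out zero: the "no search direction left to reduce f" situation described above.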