
Conjugate Gradient Method

Presented by
Yash Menon

INTRODUCTION

ALGORITHM

EXAMPLE

INTRODUCTION - 1
The solution of many engineering problems leads to
systems of linear equations.
These systems of equations can be solved by direct
or iterative methods such as Gauss elimination, the
Jacobi method, and the SOR method.
These methods are practical only when the number of
unknowns is small, roughly less than 100.
It has been observed that for many practical 3D
applications, the number of unknowns runs into the
millions, or sometimes billions.
To solve such large systems of equations we need a
method that converges quickly with minimal usage
of memory.
The Conjugate Gradient method is one of the most
popular and extensively used methods for such large
systems.
INTRODUCTION - 2
In 1952, two great mathematicians, Magnus Hestenes
and Eduard Stiefel, discovered the Conjugate Gradient
method, one of the most powerful iterative
methods for rapidly solving large linear systems.
In exact arithmetic, the Conjugate Gradient method is
guaranteed to converge in at most n iterations, where
n is the order of the matrix.
To find the unknowns we express the system of linear
equations in matrix form:
[A]{x} = {b}
where
A is the coefficient matrix,
x is the column vector of unknowns,
b is the column vector of constants.
The coefficient matrix [A] formed during such computations
is a sparse matrix: a matrix with a large number of rows and
columns in which very few elements are non-zero and the
rest are zero.

INTRODUCTION - 3

Example of Sparse matrix
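The sparse-matrix figure from the slide is not reproduced in this text. As an illustrative sketch (the tridiagonal matrix below is an assumption, typical of a 1D finite-difference discretization, not taken from the slides), we can build such a matrix and measure its sparsity:

```python
import numpy as np

# Illustrative example: a 1000x1000 tridiagonal matrix, a typical
# sparse matrix arising from a 1D finite-difference discretization.
n = 1000
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 2.0            # main diagonal
    if i > 0:
        A[i, i - 1] = -1.0   # sub-diagonal
    if i < n - 1:
        A[i, i + 1] = -1.0   # super-diagonal

nnz = np.count_nonzero(A)
total = n * n
print(f"non-zero entries: {nnz} of {total} ({100 * nnz / total:.2f}%)")
```

Fewer than 0.3% of the entries are non-zero here, which is why sparse storage formats (which keep only the non-zero entries) are essential for systems with millions of unknowns.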

ALGORITHM BASIC-1

The Conjugate Gradient method finds the solution of Ax = b
by minimizing the quadratic form

f(x) = (1/2) x^T A x - x^T b

For a symmetric positive definite A, the gradient of f is
Ax - b, so the minimizer of f is exactly the solution of
Ax = b.
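This connection between the quadratic form and the linear system can be checked numerically. A minimal sketch (the matrix A, vector b, and test point x below are illustrative assumptions) comparing a finite-difference gradient of f against Ax - b:

```python
import numpy as np

# Sketch: verify numerically that the gradient of
# f(x) = 0.5 * x^T A x - x^T b equals A x - b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])  # illustrative SPD matrix
b = np.array([1.0, 2.0])
x = np.array([0.5, -0.3])               # arbitrary test point

f = lambda v: 0.5 * v @ A @ v - v @ b

# Central finite-difference approximation of the gradient
eps = 1e-6
grad_fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                    for e in np.eye(2)])
grad_exact = A @ x - b

print(np.allclose(grad_fd, grad_exact, atol=1e-5))  # True
```

Setting the gradient Ax - b to zero recovers the original system, so solving Ax = b and minimizing f(x) are the same problem when A is symmetric positive definite.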

ALGORITHM BASIC-2
Before the discovery of the Conjugate Gradient
method, the steepest descent method was the most
reliable and popular method.

The problem with the steepest descent method is
that it works very well when we are far away from
the solution, but as we come close to the solution
it becomes very slow.
The Conjugate Gradient method is also called the
conjugate direction method.
ALGORITHM BASIC-3

ALGORITHM
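The algorithm slide itself is not reproduced in this text. The standard CG iteration it describes can be sketched in Python as follows (the 2x2 example system at the end is an illustrative assumption):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A.

    A sketch of the standard CG iteration; r is the residual,
    p the search direction, alpha the step length, beta the
    direction-update coefficient.
    """
    if max_iter is None:
        max_iter = len(b)              # at most n steps in exact arithmetic
    x = x0.astype(float).copy()
    r = b - A @ x                      # initial residual
    p = r.copy()                       # initial search direction p0 = r0
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x = x + alpha * p
        r = r - alpha * Ap             # update residual
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs_old         # makes p_{k+1} A-conjugate to p_k
        p = r + beta * p
        rs_old = rs_new
    return x

# Usage on an illustrative 2x2 SPD system (an assumption, not the slides' system)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
print(x)
```

Each iteration costs one matrix-vector product, and only a handful of vectors are stored, which is what makes the method practical for the very large sparse systems discussed in the introduction.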

EXAMPLE
Consider the linear system Ax = b given by

Let the initial guess be


Our first step is to calculate the residual vector r0
r0 = b - Ax0

Now compute the scalar α0 using the relationship
α0 = (r0^T r0) / (p0^T A p0), where the initial search
direction is p0 = r0.

Now compute x1 using the formula
x1 = x0 + α0 p0

First iteration result

Compute the scalar β0 = (r1^T r1) / (r0^T r0), where
r1 = r0 - α0 A p0 is the new residual.

Now, using this scalar β0, compute p1 = r1 + β0 p0.

Now compute the scalar α1 = (r1^T r1) / (p1^T A p1)
using our newly acquired p1.

Finally, we find x2 = x1 + α1 p1.

Exact Solution
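The slides' own numbers are not reproduced in this text; the same two-iteration computation can be sketched on a representative 2x2 SPD system (the matrix and initial guess below are illustrative assumptions):

```python
import numpy as np

# Worked two-iteration CG run on an illustrative 2x2 SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x0 = np.array([2.0, 1.0])

r0 = b - A @ x0                          # residual at the initial guess
p0 = r0                                  # first search direction
alpha0 = (r0 @ r0) / (p0 @ (A @ p0))     # step length
x1 = x0 + alpha0 * p0                    # first iterate
r1 = r0 - alpha0 * (A @ p0)              # updated residual

beta0 = (r1 @ r1) / (r0 @ r0)            # direction-update coefficient
p1 = r1 + beta0 * p0                     # new A-conjugate direction
alpha1 = (r1 @ r1) / (p1 @ (A @ p1))
x2 = x1 + alpha1 * p1                    # second iterate

print(x2)                                # exact solution (up to round-off)
print(np.linalg.solve(A, b))             # direct solve for comparison
```

For a 2x2 system, n = 2, so x2 already matches the exact solution up to round-off, illustrating the finite-termination property stated in the introduction.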

REFERENCES
[1] Auke van der Ploeg, Preconditioning for sparse
matrices with applications, University of Groningen,
1994.
[2] Elena Caraba, Preconditioned Conjugate Gradient
Algorithm, Louisiana State University, 2008.
NPTEL video lectures on the Conjugate Gradient Method by:
[1] Prof. Sachin Patwardhan, IIT Bombay.
[2] Dr. Arghya Deb, IIT Kharagpur.
[3] Prof. C. Balaji, IIT Madras.
[4] Prof. Sreenivas Jayanti, IIT Madras.
