http://www.cs.aaue.dk/~ymzhang/courses/NumMethod/S2006.html
Youmin Zhang
Lecture 1
Course Contents (some items not to be covered)
- Nonlinear equations — Newton's method
- Lecture 4: Optimization

Overview (PT3.2)

A system of n equations in n unknowns:

f1(x1, x2, ..., xn) = 0
f2(x1, x2, ..., xn) = 0
  ...
fn(x1, x2, ..., xn) = 0
The equations may be nonlinear, e.g.

a21 (x1)^3 + a22 e^(x2) + ... + a2n (x2)^3 / xn = b2
  ...
an1 x1 + an2 x2 + ... + ann xn = bn
Solutions of equations with multiple variables — a set of equations: matrix/vector representation.

Typical methods (for a single equation): Bisection method, Newton-Raphson method, Secant method, and Müller's method (Chapters 5-7). We will not elaborate more on this topic.
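Those single-equation root-finding methods are covered in Chapters 5-7 and not elaborated here. For orientation only, a minimal Newton-Raphson sketch (the function name and example are mine, not from the course material):

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f via the iteration x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:      # converged: f(x) is essentially zero
            return x
        x = x - fx / df(x)     # Newton step
    return x

# Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```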
1. Matrix Notation

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
  ...
an1 x1 + an2 x2 + ... + ann xn = bn
Square Matrix — Element: aij; Matrix: [A]

Lower Triangular:
        [ a11               ]
  [A] = [ a21  a22          ]
        [  ...       ...    ]
        [ an1  ...  ...  ann ]

Identity:
  [I] = identity matrix (1's on the diagonal, 0's elsewhere)

Diagonal:
  [D] = diag(a11, a22, ..., ann)

Upper Triangular:
        [ a11  a12  ...  a1n ]
  [A] = [      a22  ...  a2n ]
        [           ...  ... ]
        [                ann ]

Symmetric (aij = aji), e.g.
        [  5   1   2  16 ]
  [A] = [  1   3   7  39 ]
        [  2   7   9   6 ]
        [ 16  39   6  88 ]
Row vector: {C}T = [c1 c2 ... cn]

Column vector:
        [ c1 ]
  {C} = [ c2 ]
        [ ...]
        [ cn ]
3 x1 + 2 x2 = 18    (E1.1)
-x1 + 2 x2 = 2      (E1.2)

Eliminating x1 (add (E1.1) to 3 × (E1.2)) gives 8 x2 = 24, so x2 = 3 and, from (E1.2), x1 = 4.
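This two-equation elimination is easy to check numerically; a throwaway sketch (variable names are mine):

```python
# (E1.1): 3*x1 + 2*x2 = 18,   (E1.2): -x1 + 2*x2 = 2
# Adding (E1.1) to 3*(E1.2) eliminates x1, leaving 8*x2 = 24.
x2 = (18 + 3 * 2) / (2 + 3 * 2)   # = 24 / 8
x1 = 2 * x2 - 2                   # from (E1.2): x1 = 2*x2 - 2
```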
In general, a set of n linear algebraic equations

a11 x1 + a12 x2 + ... + a1n xn = b1
a21 x1 + a22 x2 + ... + a2n xn = b2
  ...
an1 x1 + an2 x2 + ... + ann xn = bn

or, in matrix form, it can be written as

  [ a11  a12  ...  a1n ] [ x1 ]   [ b1 ]
  [ a21  a22  ...  a2n ] [ x2 ] = [ b2 ]
  [  ...  ...       ...] [ ...]   [ ...]
  [ an1  an2  ...  ann ] [ xn ]   [ bn ]

or [A]{X} = {B} (formally, {X} = [A]^(-1){B}).

Gauss Elimination — a two-step procedure: forward elimination followed by back substitution.
18
Forward Elimination
(t1.k )
(t1.k )
ak 1
a
a
) x2 + L + (akn k1 ) xn = bk - k1 b1
a11
a11
a11
ak 2 x2 + L + akn xn = bk
21
Lecture 1
M
M
an1 x1 + an 2 x2 + L + ann xn = bn
23
(t1.n)
(t1.2)
(t1.1)
( ak 2
Lecture 1
M
M
an1 x1 + an 2 x2 + L + ann xn = bn
(t1.n)
(t1.2)
(t1.1)
Lecture 1
(t1.n (n -1) )
(t1.3)
(t1.1)
(t1.2)
24
22
See blackboard!
(t1.n)
(t1.2)
(t1.1)
x2 + a23
x3 + L + a2 n xn = b2
a22
x3 + L + a3n xn = b3
a33
( n 1)
ann
xn = bn( n 1)
Lecture 1
M
M
an1 x1 + an 2 x2 + L + ann xn = bn
Back Substitution

1. The last equation gives xn directly:

   xn = bn^(n-1) / ann^(n-1)

2. The value of xn can be back-substituted into the (n-1)th equation to solve for x(n-1). This procedure is repeated for all remaining equations:

   xi = ( bi^(i-1) - Σ_{j=i+1..n} aij^(i-1) xj ) / aii^(i-1),   i = n-1, n-2, ..., 1

Pseudocode to Perform Gauss Elimination
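The pseudocode slide itself is not transcribed here; as a stand-in, a minimal Python sketch of naive Gauss elimination (no pivoting), following the formulas above (my implementation, not the slide's pseudocode):

```python
def gauss_eliminate(A, b):
    """Solve [A]{x} = {b} by naive Gauss elimination (no pivoting).

    A is a list of n rows of n numbers; b is a list of n numbers.
    Both are copied, so the caller's data is not modified.
    """
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    # Forward elimination: zero out column k below pivot row k.
    for k in range(n - 1):
        for i in range(k + 1, n):
            factor = A[i][k] / A[k][k]          # a_ik / a_kk
            for j in range(k, n):
                A[i][j] -= factor * A[k][j]
            b[i] -= factor * b[k]
    # Back substitution: x_i = (b_i - sum_j a_ij x_j) / a_ii
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# The example system used later in the lecture:
# 5x1 + 4x2 + 6x3 = 11, 10x1 + 6x2 + 7x3 = 25, 3x1 + 5x2 + 2x3 = 8
x = gauss_eliminate([[5, 4, 6], [10, 6, 7], [3, 5, 2]], [11, 25, 8])
# x ≈ [2.87654, 0.13580, -0.65432]
```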
Example 2

5 x1 + 4 x2 + 6 x3 = 11     (1)
10 x1 + 6 x2 + 7 x3 = 25    (2)
3 x1 + 5 x2 + 2 x3 = 8      (3)

Summary of forward elimination (coefficients of x1, x2, x3 and right-hand side b):

        x1    x2    x3  |    b
(1)      5     4     6  |   11
(2)     10     6     7  |   25
(3)      3     5     2  |    8

(2)' = (2) - (10/5)(1),  (3)' = (3) - (3/5)(1):

(1)      5     4     6  |   11
(2)'     0    -2    -5  |    3
(3)'     0   2.6  -1.6  |  1.4

(3)'' = (3)' - (2.6/(-2))(2)':

(1)      5     4     6  |   11
(2)'     0    -2    -5  |    3
(3)''    0     0  -8.1  |  5.3
Forward Elimination: general step

At step k, equation (t1.k) is the pivot equation and akk is the pivot element. Subtracting (aik/akk) × (t1.k) from each equation below it eliminates xk from equations i = k+1, ..., n, exactly as x1 was eliminated using (t1.1):

(ak2 - a12 ak1/a11) x2 + ... + (akn - a1n ak1/a11) xn = bk - (ak1/a11) b1
Pitfalls of elimination methods:
- Division by Zero
- Round-off Errors
- Ill-conditioned Systems
[Figure: graphs of two linear equations — (a) parallel lines, no solution; (c) slopes too close together, an ill-conditioned system]

Round-off Errors: Example

0.0003 x1 + 3.0000 x2 = 2.0001    (E2.1)
1.0000 x1 + 1.0000 x2 = 1.0000    (E2.2)

Step 1: multiply (E2.1) by 1/0.0003.
Step 2: subtract (E2.2) from the result:

9999 x2 = 6666,  so  x2 = 2/3

Back-substitution into (E2.1):

x1 = (2.0001 - 3 x2) / 0.0003

The computed x1 is extremely sensitive to the number of significant figures carried in x2:

Significant Figures |    x2     |    x1
         2          | 0.67      | -33.0
         3          | 0.667     | -3.00
         4          | 0.6667    | 0.000
         5          | 0.66667   | 0.3000
         6          | 0.666667  | 0.33000
         7          | 0.6666667 | 0.333000

(The true solution is x1 = 1/3, x2 = 2/3.)

Division by Zero: Example 2 Observation

2 x2 + 3 x3 = 8             (E1.1)
4 x1 + 6 x2 + 7 x3 = -3     (E1.2)
2 x1 + x2 + 6 x3 = 5        (E1.3)

        [ 0  2  3 ]         [  8 ]
  [A] = [ 4  6  7 ]   {B} = [ -3 ]
        [ 2  1  6 ]         [  5 ]

Here a11 = 0, so normalizing by the first pivot immediately divides by zero.
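The significant-figures experiment can be reproduced in a few lines (a sketch; `round_sig` is my helper, not course code):

```python
# Round x2 = 2/3 to k significant figures, then back-substitute into
# (E2.1):  x1 = (2.0001 - 3*x2) / 0.0003.
# The tiny pivot 0.0003 amplifies the rounding error in x2 enormously.
def round_sig(x, k):
    """Round x to k significant figures via the 'g' format."""
    return float(f"{x:.{k}g}")

table = []
for k in range(2, 8):
    x2 = round_sig(2.0 / 3.0, k)
    x1 = (2.0001 - 3.0 * x2) / 0.0003
    table.append((k, x2, x1))
```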
Remedy: pivoting. Interchange (E2.1) and (E2.2) so that the larger coefficient of x1 is the pivot:

1.0000 x1 + 1.0000 x2 = 1.0000    (E2.2)
0.0003 x1 + 3.0000 x2 = 2.0001    (E2.1)

(E2.1) - (E2.2) × (0.0003/1.0000) yields

2.9997 x2 = 1.9998,  so  x2 = 2/3

Back-substitution: x1 = 1.0000 - 1.0000 x2.
Remedies: Pivoting and Scaling.

Observation — with the pivoted ordering the solution is insensitive to round-off in x2:

Significant Figures |    x2     |    x1
         2          | 0.67      | 0.33
         3          | 0.667     | 0.333
         4          | 0.6667    | 0.3333
         5          | 0.66667   | 0.33333
         6          | 0.666667  | 0.333333
         7          | 0.6666667 | 0.3333333
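The same experiment with the pivoted ordering shows this insensitivity directly (sketch; helper name is mine):

```python
# After pivoting, back substitution is x1 = (1.0000 - 1.0000*x2)/1.0000:
# the error in x2 passes through undamped instead of being divided by 0.0003.
def round_sig(x, k):
    """Round x to k significant figures via the 'g' format."""
    return float(f"{x:.{k}g}")

results = {k: (1.0000 - 1.0000 * round_sig(2.0 / 3.0, k)) / 1.0000
           for k in range(2, 8)}
```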
Pivoting

For the division-by-zero example

2 x2 + 3 x3 = 8
4 x1 + 6 x2 + 7 x3 = -3
2 x1 + x2 + 6 x3 = 5

the fix is to switch rows so that the pivot position holds the coefficient of largest absolute value:

4 x1 + 6 x2 + 7 x3 = -3
        2 x2 + 3 x3 = 8
2 x1 + x2 + 6 x3 = 5

Partial pivoting switches rows only; complete pivoting also switches columns, but is rarely used because it changes the ordering of the unknowns.

Pivoting: Pseudocode

Scaling: Example

2 x1 + 100,000 x2 = 100,000    (E3.1)
x1 + x2 = 2                    (E3.2)

(Carrying three significant figures, the exact solution rounds to x1 = x2 = 1.00.)

(a) Without scaling or pivoting — forward elimination, (E3.2) - (1/2)(E3.1):

49,999 x2 = 49,998

so x2 = 1.00, and back substitution gives 2 x1 = 100,000 - 100,000(1.00), i.e. x1 = 0.00. Wrong.

(b) Scale each equation by its largest coefficient — (E3.1)/100,000:

0.00002 x1 + x2 = 1    (E3.1)
x1 + x2 = 2            (E3.2)

The tiny scaled pivot signals that pivoting is needed; switching rows and eliminating:

x1 + x2 = 2
     x2 = 1.00

Back-substitution: x1 = x2 = 1.00. Correct.

(c) Scaling: Observation — scaling can also be used only to decide that pivoting is needed, keeping the original coefficients for the elimination itself:

x1 + x2 = 2                    (E3.2)
2 x1 + 100,000 x2 = 100,000    (E3.1)

Forward elimination, (E3.1) - 2(E3.2):

99,998 x2 = 99,996

so x2 = 1.00, and back substitution gives x1 = x2 = 1.00. Correct.
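The "Pivoting: Pseudocode" slide is not transcribed here; as a stand-in, a Python sketch of Gauss elimination with partial pivoting (my implementation, not the slide's pseudocode):

```python
def gauss_pivot(A, b):
    """Solve [A]{x} = {b} by Gauss elimination with partial pivoting."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n - 1):
        # Partial pivoting: pick the row with the largest |a_ik|, i >= k.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        if p != k:                        # switch rows k and p
            A[k], A[p] = A[p], A[k]
            b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            factor = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= factor * A[k][j]
            b[i] -= factor * b[k]
    # Back substitution on the triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# The division-by-zero example: a11 = 0 is handled by the row switch.
x = gauss_pivot([[0, 2, 3], [4, 6, 7], [2, 1, 6]], [8, -3, 5])
```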
Gauss-Jordan Elimination

A variation of Gauss elimination. Major differences:
- each unknown is eliminated from ALL equations, not just those below the pivot
- all rows are normalized by their pivot element
- the elimination therefore results in an identity matrix
- no back substitution is needed to obtain the solution
- it can also be used to obtain the matrix inverse
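A minimal sketch of the procedure (my implementation; it normalizes each pivot row as it is processed, whereas the worked example on the slides defers normalization — the result is the same):

```python
def gauss_jordan(A, b):
    """Solve [A]{x} = {b} by Gauss-Jordan elimination.

    Each pivot row is normalized and the unknown is eliminated from all
    OTHER rows, so [A] is reduced to the identity matrix and the
    right-hand side becomes the solution -- no back substitution.
    """
    n = len(b)
    aug = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for k in range(n):
        pivot = aug[k][k]
        aug[k] = [v / pivot for v in aug[k]]          # normalize pivot row
        for i in range(n):
            if i != k:                                # eliminate x_k everywhere else
                factor = aug[i][k]
                aug[i] = [vi - factor * vk for vi, vk in zip(aug[i], aug[k])]
    return [row[n] for row in aug]

x = gauss_jordan([[5, 4, 6], [10, 6, 7], [3, 5, 2]], [11, 25, 8])
# x ≈ [2.87654, 0.13580, -0.65432]
```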
Summary: Gauss-Jordan on the same example

        x1    x2    x3  |      b
         5     4     6  |  11.00000
        10     6     7  |  25.00000
         3     5     2  |   8.00000

Eliminate x1 from rows 2 and 3:
         5     4     6  |  11.00000
         0    -2    -5  |   3.00000
         0   2.6  -1.6  |   1.40000

Eliminate x2 from row 3:
         5     4     6  |  11.00000
         0    -2    -5  |   3.00000
         0     0  -8.1  |   5.30000

Normalize row 3 (divide by -8.1):
         5     4     6  |  11.00000
         0    -2    -5  |   3.00000
         0     0     1  |  -0.65432

Eliminate x3 from rows 1 and 2:
         5     4     0  |  14.92593
         0    -2     0  |  -0.27160
         0     0     1  |  -0.65432

Normalize row 2 (divide by -2):
         5     4     0  |  14.92593
         0     1     0  |   0.13580
         0     0     1  |  -0.65432

Eliminate x2 from row 1:
         5     0     0  |  14.38272
         0     1     0  |   0.13580
         0     0     1  |  -0.65432

Normalize row 1 (divide by 5):
         1     0     0  |   2.87654
         0     1     0  |   0.13580
         0     0     1  |  -0.65432

Solution: x1 = 2.87654, x2 = 0.13580, x3 = -0.65432.
Example:

5 x1 + 4 x2 + 6 x3 = 11
10 x1 + 6 x2 + 7 x3 = 25
3 x1 + 5 x2 + 2 x3 = 8

Solution: apply
✓ Scaling
✓ Pivoting

Exercise:
Reading: