Review of Linear Algebra
Mark W. Spong, Dynamics of Complex Networks and Systems

Vectors

We will next give a brief review of some concepts from Linear Algebra
that will help in the understanding of Matrix Representations of
Graphs and also in the discussion of Dynamical Systems later in the
course.
Definition
An n-dimensional vector is a column array of real numbers
 
v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}, \qquad v_i \in \mathbb{R}, \quad i = 1, \dots, n

To save space we often write this as a row vector

v = [v1 , v2 , . . . , vn ]T

where the superscript T denotes the Transpose, which means interchanging rows and columns.
Matrices

Definition
An n × m matrix A is an array with n rows and m columns
 
A = \begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1m} \\
a_{21} & a_{22} & \cdots & a_{2m} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nm}
\end{bmatrix}

We write A = (aij ) with aij meaning the element in row i and column j
[a_{i1}, a_{i2}, \dots, a_{im}] is the i-th row vector of A, and

[a_{1j}, a_{2j}, \dots, a_{nj}]^T = \begin{bmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{nj} \end{bmatrix} is the j-th column vector of A.


Example
 
A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{bmatrix}
is a 3 × 4 matrix.
[5, 6, 7, 8] is the second
 row vector or row two
1
[1, 5, 9]T =  5  is column one.
9
The transpose matrix AT is the 4 × 3 matrix
 
A^T = \begin{bmatrix} 1 & 5 & 9 \\ 2 & 6 & 10 \\ 3 & 7 & 11 \\ 4 & 8 & 12 \end{bmatrix}

Inner Product

Definition
Given vectors x = [x1, . . . , xn]T and y = [y1, . . . , yn]T, the inner product (also called the dot product or scalar product), denoted by

< x, y >,   x · y,   or   xT y,

is the scalar (number)
 
x^T y = x_1 y_1 + x_2 y_2 + \cdots + x_n y_n = [x_1, x_2, \dots, x_n] \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}

Example
Let x = [1, 2, 3]T and y = [4, 5, 6]T be two vectors in R3 . Then

xT y = 1 · 4 + 2 · 5 + 3 · 6 = 32


Definition
The norm or length of a vector x = [x1 , x2 , . . . , xn ]T is given by

||x|| = \sqrt{x^T x} = (x_1^2 + x_2^2 + \cdots + x_n^2)^{1/2} = \left( \sum_{i=1}^{n} x_i^2 \right)^{1/2}

The norm, as defined above, is the n-dimensional version of the Pythagorean Theorem.

Some facts you may recall from basic vector calculus

x · y = ||x|| · ||y|| cos(θ), where θ is the angle between the vectors x and y.
Consequently, |xT y| ≤ ||x|| · ||y|| (the Cauchy-Schwarz Inequality).
Also, x · y = 0 if and only if x and y are mutually perpendicular (orthogonal).
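These quantities are easy to compute numerically. Below is a minimal NumPy sketch (assuming NumPy is available) that reproduces the example above and checks the angle formula and the Cauchy-Schwarz inequality:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

dot = x @ y                          # x^T y = 1*4 + 2*5 + 3*6 = 32
norm_x = np.linalg.norm(x)           # ||x|| = sqrt(x^T x)
norm_y = np.linalg.norm(y)

# Angle between x and y, from x.y = ||x|| ||y|| cos(theta)
theta = np.arccos(dot / (norm_x * norm_y))

print(dot)                           # 32.0
print(abs(dot) <= norm_x * norm_y)   # True (Cauchy-Schwarz)
print(np.degrees(theta))             # angle in degrees, about 12.9
```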

Matrix Multiplication

If A is an n × m matrix and B is a p × q matrix, then the Matrix Product AB exists provided m = p.

The result is the n × q matrix C = (cij) whose entries are

cij = ai1 b1j + ai2 b2j + · · · + aim bmj

i.e., cij is the inner product of the i-th row of A with the j-th column of B.

Example
Suppose
   
A = \begin{bmatrix} 1 & -1 & 1 \\ 2 & 1 & 3 \\ -2 & 1 & 4 \end{bmatrix};
\qquad
B = \begin{bmatrix} 3 & 0 & 1 \\ 0 & 1 & 2 \\ -2 & 2 & -1 \end{bmatrix}

then

C = AB = \begin{bmatrix} 1 & 1 & -2 \\ 0 & 7 & 1 \\ -14 & 9 & -4 \end{bmatrix}
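As a quick numerical check of this product (a minimal NumPy sketch), note that the (3, 2) entry works out to −2·0 + 1·1 + 4·2 = 9:

```python
import numpy as np

A = np.array([[ 1, -1, 1],
              [ 2,  1, 3],
              [-2,  1, 4]])
B = np.array([[ 3, 0,  1],
              [ 0, 1,  2],
              [-2, 2, -1]])

# Entry (i, j) of A @ B is the inner product of row i of A with column j of B
print(A @ B)
# [[  1   1  -2]
#  [  0   7   1]
#  [-14   9  -4]]
```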


Some additional definitions and properties of matrix algebra:

A matrix A is Symmetric if aij = aji for all i, j. In other words, AT = A, i.e., the i-th row and the i-th column of A are the same.

Example
Consider the matrix

A = \begin{bmatrix} 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 1 & 0 & 1 & 0 \end{bmatrix}
Then it is easy to see that AT = A, so A is a symmetric matrix.
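Checking symmetry numerically is a one-liner; a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 0]])

print(np.array_equal(A, A.T))   # True: A equals its transpose, so A is symmetric
```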

Matrix Algebra

Some additional properties of matrices are:

If A and B have the same dimensions n × m, then C = A + B is defined by cij = aij + bij.
(AB)C = A(BC) provided the matrix products are defined.
A(B + C) = AB + AC provided the matrix products are defined.
(A + B)C = AC + BC provided the matrix products are defined.
(AT )T = A
(A + B)T = AT + B T
(AB)T = B T AT

If A and B are n × n (square matrices), then in general AB is not equal to BA.
If AB = BA then A and B are said to commute.
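A small NumPy sketch spot-checking some of these identities on randomly generated matrices (the matrices themselves are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 3))
B = rng.integers(-3, 4, size=(3, 3))

print(np.array_equal((A + B).T, A.T + B.T))   # True:  (A + B)^T = A^T + B^T
print(np.array_equal((A @ B).T, B.T @ A.T))   # True:  (AB)^T = B^T A^T
print(np.array_equal(A @ B, B @ A))           # typically False: AB != BA in general
```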

Matrix Inverse

Definition
The matrix

I = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}
is the n × n Identity Matrix

Definition
The Inverse of an n × n matrix A is an n × n matrix B satisfying

AB = BA = I

where I is the n × n identity matrix.


We denote the inverse B of A as A−1 .
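A minimal NumPy illustration of the definition, using an arbitrary 2 × 2 matrix for the sketch:

```python
import numpy as np

A = np.array([[3.0, -1.0],
              [-1.0, 3.0]])

A_inv = np.linalg.inv(A)

# Both products reproduce the identity (up to floating-point roundoff)
print(np.allclose(A @ A_inv, np.eye(2)))   # True
print(np.allclose(A_inv @ A, np.eye(2)))   # True
```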

Systems of Linear Equations

Matrices are used to represent Systems of Linear Equations


Example
The system of m equations in n unknowns

\begin{aligned}
a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= b_1 \\
a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= b_2 \\
&\;\;\vdots \\
a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= b_m
\end{aligned}

can be written as
Ax = b


with
     
A = \begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix};
\qquad
x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix};
\qquad
b = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}

Consider the homogeneous linear equations

ax + by = 0
cx + dy = 0

Eliminating x and y from these equations shows that a nonzero solution exists only if

ad − bc = 0

The quantity ad − bc is called the Determinant of the matrix \begin{bmatrix} a & b \\ c & d \end{bmatrix}.
Determinant and Inverse

We denote the determinant of a matrix A by det(A) or |A|.


The determinant of a 3 × 3 matrix can be computed as

\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix}
= a \begin{vmatrix} e & f \\ h & i \end{vmatrix}
- b \begin{vmatrix} d & f \\ g & i \end{vmatrix}
+ c \begin{vmatrix} d & e \\ g & h \end{vmatrix}
= a(ei - fh) - b(di - fg) + c(dh - ge)
= aei + bfg + cdh - ceg - bdi - afh

The signed 2 × 2 determinants
\begin{vmatrix} e & f \\ h & i \end{vmatrix}, \;
-\begin{vmatrix} d & f \\ g & i \end{vmatrix}, \;
\begin{vmatrix} d & e \\ g & h \end{vmatrix}
are called the Cofactors of the elements a, b, c, respectively.

Cofactors

Definition
The Cofactor cij of an element aij in an n × n matrix A is ±1 times the
determinant of the (n − 1) × (n − 1) matrix formed by deleting the i-th
row and j-th column of A.
The sign in front of each cofactor alternates according to the pattern
 
\begin{bmatrix} + & - & + & - \\ - & + & - & + \\ + & - & + & - \\ - & + & - & + \end{bmatrix}
\qquad \text{sign pattern for the 4 × 4 case}

The Determinant of any n × n matrix A can then be calculated by taking any row or column and multiplying each element of that row or column by its respective cofactor. The determinant is then the sum of these products.
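To make the procedure concrete, here is a small recursive Python sketch of the cofactor (Laplace) expansion along the first row; it is written for readability rather than efficiency:

```python
def det(M):
    """Determinant of a square matrix M (list of lists) by cofactor expansion."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Delete row 0 and column j to form the minor
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        cofactor = (-1) ** j * det(minor)   # sign pattern (-1)^(0 + j)
        total += M[0][j] * cofactor
    return total

print(det([[1, 2, 3],
           [0, 4, 5],
           [1, 0, 6]]))   # 22
```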

Determinant

Example
Back to the previous 3 × 3 matrix

\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix}

we can take any row or column, for example, column two, and compute
the determinant as

\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix}
= -b \begin{vmatrix} d & f \\ g & i \end{vmatrix}
+ e \begin{vmatrix} a & c \\ g & i \end{vmatrix}
- h \begin{vmatrix} a & c \\ d & f \end{vmatrix}
= -b(di - fg) + e(ai - gc) - h(af - dc)
= -bdi + bfg + eai - egc - haf + hdc

One can check that this is the same expression as computed previously.

Determinant and Inverse

The determinant is a scalar function defined for square matrices and satisfies the following properties.

|AB| = |A| · |B|
|A−1| = 1/|A| = |A|−1
|AT| = |A|

Note that it is generally not true that |A + B| = |A| + |B|.


Definition
A matrix A is Singular if det(A) = 0. Otherwise, A is said to be
Nonsingular or Invertible.

Theorem
The inverse of an n × n matrix exists if and only if the Determinant,
det(A), is not equal to zero.
If A and B are invertible, then the product AB is invertible and
(AB)−1 = B −1 A−1 .
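These properties, and the inverse-of-a-product rule, can be spot-checked numerically; a brief NumPy sketch with arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))   # random matrices are nonsingular with probability 1
B = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A)))     # True
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                        # True
print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))  # True
```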

If the n × n matrix A is nonsingular, the linear system

Ax = b

has the unique solution


x = A−1 b
Otherwise, there may be no solution or infinitely many solutions.
Example
Consider the linear system

3x − y = 4
6x − 2y = 8

The coefficient matrix A is singular in this case and any point on the line
y = 3x − 4 is a solution.


Example
On the other hand, the linear system

3x − y = 4
6x − 2y = 2

has no solution.

Example
The linear system

3x − y = 4
6x + 2y = 2

has the unique solution x = 5/6, y = −3/2.
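The three cases above are easy to reproduce numerically; a minimal NumPy sketch:

```python
import numpy as np

A_singular = np.array([[3.0, -1.0],
                       [6.0, -2.0]])
A_nonsingular = np.array([[3.0, -1.0],
                          [6.0,  2.0]])

print(np.linalg.det(A_singular))      # ~0 (singular): no solution or infinitely many
print(np.linalg.det(A_nonsingular))   # 12.0 (nonsingular): unique solution

x = np.linalg.solve(A_nonsingular, np.array([4.0, 2.0]))
print(x)                              # [ 0.8333... -1.5 ], i.e. x = 5/6, y = -3/2
```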

Matrix Inverse

   
Let A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} and B = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}.

Then a direct calculation shows

AB = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
   = \begin{bmatrix} ad - bc & 0 \\ 0 & ad - bc \end{bmatrix}

A similar calculation shows that BA = AB.


Therefore, it follows that

A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} = \frac{1}{|A|} A^{+}

which is well defined provided |A| ≠ 0.


 
The matrix A^{+} = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} is called the Adjoint of A.


The inverse of a nonsingular n × n matrix A is A−1 = (1/|A|) A+.

The Adjoint of a general n × n matrix A is given as A+ = CT, where C is the Cofactor Matrix, consisting of elements that are the cofactors of the elements of A as defined previously.
Example
Find the adjoint of the matrix
 
A = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 1 & 0 & 6 \end{bmatrix}

First, compute the cofactor of each element of A


Example
The cofactors of the given matrix are

c_{11} = \begin{vmatrix} 4 & 5 \\ 0 & 6 \end{vmatrix} = 24 \qquad
c_{12} = -\begin{vmatrix} 0 & 5 \\ 1 & 6 \end{vmatrix} = 5 \qquad
c_{13} = \begin{vmatrix} 0 & 4 \\ 1 & 0 \end{vmatrix} = -4

c_{21} = -\begin{vmatrix} 2 & 3 \\ 0 & 6 \end{vmatrix} = -12 \qquad
c_{22} = \begin{vmatrix} 1 & 3 \\ 1 & 6 \end{vmatrix} = 3 \qquad
c_{23} = -\begin{vmatrix} 1 & 2 \\ 1 & 0 \end{vmatrix} = 2

c_{31} = \begin{vmatrix} 2 & 3 \\ 4 & 5 \end{vmatrix} = -2 \qquad
c_{32} = -\begin{vmatrix} 1 & 3 \\ 0 & 5 \end{vmatrix} = -5 \qquad
c_{33} = \begin{vmatrix} 1 & 2 \\ 0 & 4 \end{vmatrix} = 4


Example
Therefore, the Cofactor Matrix is
 
C = \begin{bmatrix} 24 & 5 & -4 \\ -12 & 3 & 2 \\ -2 & -5 & 4 \end{bmatrix}

Finally, the Adjoint of A is the transpose of the Cofactor Matrix

A^{+} = C^T = \begin{bmatrix} 24 & -12 & -2 \\ 5 & 3 & -5 \\ -4 & 2 & 4 \end{bmatrix}

Since the determinant of A is |A| = 22, it follows that

A^{-1} = \frac{1}{22} \begin{bmatrix} 24 & -12 & -2 \\ 5 & 3 & -5 \\ -4 & 2 & 4 \end{bmatrix}
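A quick NumPy check of this adjoint-based inverse (a minimal sketch of the example above):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])

adjoint = np.array([[24.0, -12.0, -2.0],
                    [ 5.0,   3.0, -5.0],
                    [-4.0,   2.0,  4.0]])

A_inv = adjoint / np.linalg.det(A)            # det(A) = 22
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
print(np.allclose(A @ A_inv, np.eye(3)))      # True
```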

Trace of a Matrix

Definition
Given an n × n matrix A, the Trace of the matrix is
Tr(A) = a_{11} + a_{22} + \cdots + a_{nn} = \sum_{i=1}^{n} a_{ii}

In other words, the Trace is the sum of the diagonal entries of the
matrix.

The Trace operation satisfies the following:

Tr(A + B) = Tr(A) + Tr(B)
Tr(cA) = c Tr(A), where c is a constant.
Tr(AB) = Tr(BA)
Tr(AT) = Tr(A)
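A few lines of NumPy are enough to spot-check these trace identities (arbitrary random matrices used for the sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))   # True
print(np.isclose(np.trace(3.0 * A), 3.0 * np.trace(A)))         # True
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))             # True
print(np.isclose(np.trace(A.T), np.trace(A)))                   # True
```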

Eigenvalues and Eigenvectors

Definition
Suppose we find a scalar λ and a nonzero vector x satisfying the equation

Ax = λx

Then λ is called an Eigenvalue of the matrix A and x is called an Eigenvector for λ.

Example
      
\begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \end{bmatrix}
= \begin{bmatrix} 4 \\ -4 \end{bmatrix}
= 4 \begin{bmatrix} 1 \\ -1 \end{bmatrix}

Therefore λ = 4 is an eigenvalue for the matrix A = \begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix}
and x = \begin{bmatrix} 1 \\ -1 \end{bmatrix} is an eigenvector for λ.

Finding Eigenvalues and Eigenvectors

Note that the equation Ax = λx can be written as

(A − λI)x = 0

where I is the identity matrix defined previously.


Therefore, there will be a nonzero vector x satisfying this equation
provided that
det(A − λI) = 0

Computing det(A − λI) results in a polynomial of degree n in λ for an n × n matrix, called the Characteristic Polynomial of A.
The n roots of the characteristic polynomial, which may be real or complex, are therefore the n eigenvalues of A.
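For the 2 × 2 matrix used in the running example, both the characteristic polynomial and the eigenvalues can be obtained numerically; a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[ 3.0, -1.0],
              [-1.0,  3.0]])

# Coefficients of det(A - lambda*I) = lambda^2 - 6*lambda + 8
print(np.poly(A))                  # [ 1. -6.  8.]

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                     # 4 and 2 (order may vary)
print(eigvecs)                     # columns are corresponding unit eigenvectors
```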

Eigenvalues and Eigenvectors

Theorem
If A is a real, symmetric matrix then the eigenvalues of A are real.

Example
Back to the previous matrix
 
A = \begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix}

Then
 
det(A − λI) = \det \begin{bmatrix} 3-\lambda & -1 \\ -1 & 3-\lambda \end{bmatrix} = \lambda^2 - 6\lambda + 8 = (\lambda - 4)(\lambda - 2)

Therefore the eigenvalues are λ = 4 and λ = 2.


To find eigenvectors for each λ we need to solve the equation
    
\begin{bmatrix} 3-\lambda & -1 \\ -1 & 3-\lambda \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}


Example
For λ = 4, the system of equations becomes

\begin{bmatrix} -1 & -1 \\ -1 & -1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}

Thus, x2 = −x1 and so \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 \\ -1 \end{bmatrix} is an eigenvector for λ = 4.
For λ = 2, the system of equations becomes

\begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}

Thus, x2 = x1 and so \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} is an eigenvector for λ = 2.


Definition
Vectors v1 and v2 are said to be Linearly Dependent if and only if there are constants α1 and α2, not both zero, such that

α1 v1 + α2 v2 = 0

Otherwise, v1 and v2 are Linearly Independent.

If v1 and v2 are linearly independent vectors in R2, then the matrix T = [v1 v2] is invertible, where v1 and v2 are the column vectors of T.


Suppose v1 and v2 are two linearly independent eigenvectors for eigenvalues λ1 and λ2, respectively.
Then, since Avi = λi vi for i = 1, 2 we can write

A[v1 v2 ] = [λ1 v1 λ2 v2 ]

which can be written

AT = T \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}

and so

T^{-1} A T = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix} = \bar{A}

a diagonal matrix with the eigenvalues on the diagonal.
The above transformation T−1AT is called a Similarity Transformation and the matrices A and Ā are said to be Similar.


Example
In the previous example, with
   
A = \begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix}; \qquad T = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}

a straightforward calculation shows that

T^{-1} = \frac{1}{2} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}

and

T^{-1} A T = \frac{1}{2} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}
\begin{bmatrix} 3 & -1 \\ -1 & 3 \end{bmatrix}
\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}
= \begin{bmatrix} 2 & 0 \\ 0 & 4 \end{bmatrix}
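The same diagonalization is easy to verify numerically; a short NumPy sketch:

```python
import numpy as np

A = np.array([[ 3.0, -1.0],
              [-1.0,  3.0]])
T = np.array([[1.0,  1.0],
              [1.0, -1.0]])

A_bar = np.linalg.inv(T) @ A @ T
print(A_bar)
# [[2. 0.]
#  [0. 4.]]  -> diagonal, with the eigenvalues 2 and 4 on the diagonal
```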


Remark
1) The determinant of a square matrix A is the product of the
eigenvalues, i.e.,
|A| = λ1 λ2 · · · λn = \prod_{i=1}^{n} \lambda_i

2) The trace of a square matrix A is the sum of the eigenvalues, i.e.,

Tr(A) = λ1 + λ2 + · · · + λn = \sum_{i=1}^{n} \lambda_i

These properties are invariant under similarity transformation.
This is because the determinant and trace satisfy

|T−1AT| = |A|
Tr(T−1AT) = Tr(A)

Quadratic Forms

We will have occasion to consider so-called Quadratic Forms.


Definition
A Quadratic Form V is a function from Rn → R of the form
V (x) = \sum_{i=1}^{n} \sum_{j=1}^{n} p_{ij} x_i x_j = p_{11} x_1^2 + p_{12} x_1 x_2 + \cdots + p_{nn} x_n^2

We assume pij = pji.

Such a quadratic form can be represented as

V (x) = xT P x

where P = (pij ) is a symmetric n × n matrix and xT = [x1 , . . . , xn ].


Example
The quadratic form

V (x) = p11 x1^2 + p12 x1 x2 + p21 x2 x1 + p22 x2^2

can be written as
V (x) = xT P x
where

P = \begin{bmatrix} p_{11} & p_{12} \\ p_{21} & p_{22} \end{bmatrix}

The simplest example of such a quadratic form is the squared norm

V (x) = x1^2 + x2^2 = xT I x

where I is the identity matrix.
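Evaluating a quadratic form numerically is a one-line matter; a minimal NumPy sketch with an arbitrary symmetric P chosen for illustration:

```python
import numpy as np

P = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # symmetric: p12 = p21
x = np.array([1.0, -2.0])

V = x @ P @ x                 # x^T P x = p11*x1^2 + 2*p12*x1*x2 + p22*x2^2 = 2 - 4 + 12
print(V)                      # 10.0
```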


Note that the equation

V (x) = x1^2 + x2^2 = r^2

defines a circle of radius r.
In general, the equation

V (x) = xT P x = c (with c > 0)

defines an ellipse provided the matrix P is Positive Definite.
Definition
An n × n matrix P = (pij) is Positive Definite if

xT P x > 0 for all x ≠ 0

P is Positive Semi-Definite (or Nonnegative Definite) if

xT P x ≥ 0 for all x

P is Negative (Semi-)Definite if −P is Positive (Semi-)Definite.



Theorem
A symmetric matrix P is Positive Definite if and only if all eigenvalues of P are positive.
A symmetric matrix P is Positive Semi-Definite if and only if all eigenvalues of P are non-negative.

Another characterization of a positive definite matrix is


Theorem
A symmetric matrix P is Positive Definite if and only if all Principal Minors (Principal Minor Determinants, defined below) of P are positive.
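The eigenvalue test is straightforward to apply numerically; a minimal NumPy sketch with an example matrix:

```python
import numpy as np

P = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals = np.linalg.eigvalsh(P)   # eigenvalues of a symmetric matrix, in ascending order
print(eigvals)                    # both positive
print(np.all(eigvals > 0))        # True -> P is positive definite
```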

Principal Minors

Definition
Let

P = \begin{bmatrix}
p_{11} & p_{12} & \cdots & p_{1n} \\
p_{21} & p_{22} & \cdots & p_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
p_{n1} & p_{n2} & \cdots & p_{nn}
\end{bmatrix}
be an n × n matrix.
The Principal Minors of P are

M_1 = p_{11}; \qquad
M_2 = \begin{vmatrix} p_{11} & p_{12} \\ p_{21} & p_{22} \end{vmatrix}; \qquad
M_3 = \begin{vmatrix} p_{11} & p_{12} & p_{13} \\ p_{21} & p_{22} & p_{23} \\ p_{31} & p_{32} & p_{33} \end{vmatrix}; \qquad
\dots; \qquad
M_n = |P|

Positive Definite Matrices

Remark
If the matrix P can be written as P = CT C for some matrix C, then P is Positive Semi-Definite.

To see this, note that a simple calculation shows that

xT P x = xT CT C x = yT y ≥ 0,  with y = Cx

The form is not necessarily strictly positive, since there will generally be nonzero vectors x such that Cx = 0, and for these xT P x = 0.
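A short NumPy illustration of this remark; C below is an arbitrary example with more columns than rows, so that Cx = 0 has nonzero solutions:

```python
import numpy as np

C = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])   # 2 x 3, so C has a nontrivial null space

P = C.T @ C                       # 3 x 3 and positive semi-definite by construction

print(np.linalg.eigvalsh(P))      # all eigenvalues >= 0 (one is ~0, up to roundoff)

x = np.array([-1.0, -1.0, 1.0])   # a nonzero vector with C x = 0
print(C @ x)                      # [0. 0.]
print(x @ P @ x)                  # ~0: semi-definite but not strictly positive
```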
