
3.1.2 Matrix Algebra and Eigenproblem

Vector and Vector Space
Vector Norm

For any non-zero vector $v$ we can always find a scalar $\alpha$ such that $u = \alpha v$ and $u^T u = 1$, where

$$v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$

$$v^T v = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} = v_1^2 + \cdots + v_n^2$$

$1/\alpha = \sqrt{v^T v}$ is called the Euclidean norm or 2-norm of vector $v$.
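As a numerical sketch of the definition above (the vector values are hypothetical, and NumPy is assumed available), the 2-norm and the normalizing scalar $\alpha$ can be computed as:

```python
import numpy as np

v = np.array([3.0, 4.0])      # hypothetical example vector
norm = np.sqrt(v @ v)         # Euclidean 2-norm: sqrt(v^T v)
alpha = 1.0 / norm            # scalar that makes u = alpha * v a unit vector
u = alpha * v

print(norm)                   # 5.0
print(u @ u)                  # ~1.0, i.e. u^T u = 1
```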
Example

(Figure: a vector $v$ drawn in the $x$-$y$ plane at angle $\theta$ from the $x$ axis, with points $A$ and $B$ marking its components $v_1$, $v_2$; $u$ is the unit vector along $v$.)

$$u = \frac{v}{\|v\|} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix}, \qquad \|v\| = \sqrt{v_1^2 + v_2^2}$$

$$u_1 = \cos\theta = \frac{v_1}{\sqrt{v_1^2 + v_2^2}}, \qquad u_2 = \sin\theta = \frac{v_2}{\sqrt{v_1^2 + v_2^2}}, \qquad \tan\theta = \frac{v_2}{v_1}$$

Q: what if the coordinate system rotates by $\theta$ degrees?
Orthonormal Vectors and Matrices

Unity Vector

An n×1 unity vector $e_i$ has all its elements equal to zero except the $i$-th element, which equals unity. That is,

$$e_i^T = \begin{bmatrix} 0 & 0 & \cdots & 0 & 1 & 0 & \cdots & 0 \end{bmatrix}$$

An identity matrix $I$ has its columns (rows) arranged as a combination of n×1 unit vectors:

$$I = \begin{bmatrix} e_1 & e_2 & \cdots & e_n \end{bmatrix}$$

A set of vectors $u_i$ is orthonormal if

$$u_i^T u_j = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}$$

A matrix $U$ with orthonormal columns satisfies $U^T U = U U^T = I$, i.e. $U^T = U^{-1}$.

For an arbitrary non-zero vector $v$, we can always find an orthonormal matrix $U$ such that

$$U^T v = \alpha\, e_i$$

where $\alpha$ is the norm of vector $v$.
Example

Any vector can be written as $v = a_1 e_1 + a_2 e_2 + \cdots + a_n e_n$.

For $n = 2$, build $U$ from $v$ itself:

$$U = \begin{bmatrix} u_1 & u_2 \end{bmatrix} = \begin{bmatrix} \dfrac{v_1}{\sqrt{v_1^2+v_2^2}} & \dfrac{-v_2}{\sqrt{v_1^2+v_2^2}} \\[2mm] \dfrac{v_2}{\sqrt{v_1^2+v_2^2}} & \dfrac{v_1}{\sqrt{v_1^2+v_2^2}} \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$

Then

$$U^T v = \begin{bmatrix} \dfrac{v_1}{\sqrt{v_1^2+v_2^2}} & \dfrac{v_2}{\sqrt{v_1^2+v_2^2}} \\[2mm] \dfrac{-v_2}{\sqrt{v_1^2+v_2^2}} & \dfrac{v_1}{\sqrt{v_1^2+v_2^2}} \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} \sqrt{v_1^2+v_2^2} \\ 0 \end{bmatrix} = \sqrt{v_1^2+v_2^2}\; e_1$$

Compare with the familiar coordinate rotation

$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$$
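A quick numerical check of this construction (hypothetical values, NumPy assumed): build $U$ from $v$ and verify that $U^T U = I$ and $U^T v = \|v\|\, e_1$.

```python
import numpy as np

v = np.array([3.0, 4.0])                 # hypothetical vector
nv = np.sqrt(v @ v)                      # ||v||

# U = [v1 -v2; v2 v1] / ||v||, as in the example above
U = np.array([[v[0], -v[1]],
              [v[1],  v[0]]]) / nv

print(np.allclose(U.T @ U, np.eye(2)))   # True: U is orthonormal
print(U.T @ v)                           # ~[5. 0.], i.e. ||v|| * e1
```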
Decoupling of Monic 2-DOF System, Principal Axes

For a symmetric $K$ we seek an orthonormal $U$ such that

$$U^T K U = \Omega^2 \;(\text{diagonal}), \qquad U^T U = U U^T = I, \qquad U^T = U^{-1}$$

For the 2-DOF case take $U$ to be a rotation:

$$U = \begin{bmatrix} u_1 & u_2 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}, \qquad U^T = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}$$

$$U^T U = \begin{bmatrix} \cos^2\theta+\sin^2\theta & \cos\theta\sin\theta-\sin\theta\cos\theta \\ \sin\theta\cos\theta-\cos\theta\sin\theta & \sin^2\theta+\cos^2\theta \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

The corresponding change of coordinates

$$\begin{bmatrix} r_1 \\ r_2 \end{bmatrix} = U^T \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$

is a coordinate rotation from $(x_1, x_2)$ to $(r_1, r_2)$.

(Figure: axes $(r_1, r_2)$ rotated by angle $\theta$ relative to axes $(x_1, x_2)$.)
Decoupling of Monic 2-DOF System, Principal Axes (contd)

The monic equation of motion is

$$I\ddot{x} + Kx = 0, \qquad \begin{bmatrix} \ddot{x}_1 \\ \ddot{x}_2 \end{bmatrix} + K \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = 0 \qquad (1)$$

Let

$$x = U r, \qquad \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} r_1 \\ r_2 \end{bmatrix} \qquad (2)$$

Substituting (2) into (1) and pre-multiplying $U^T$ on both sides of the resulting equation:

$$\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \left( \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} \ddot{r}_1 \\ \ddot{r}_2 \end{bmatrix} + K \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} r_1 \\ r_2 \end{bmatrix} \right) = 0$$

Since $U^T U = I$ and $U^T K U = \Omega^2$,

$$\begin{bmatrix} \ddot{r}_1 \\ \ddot{r}_2 \end{bmatrix} + \begin{bmatrix} \omega_{n1}^2 & 0 \\ 0 & \omega_{n2}^2 \end{bmatrix} \begin{bmatrix} r_1 \\ r_2 \end{bmatrix} = 0$$

that is,

$$\ddot{r}_1 + \omega_{n1}^2 r_1 = 0, \qquad \ddot{r}_2 + \omega_{n2}^2 r_2 = 0$$

Notes:
1. Decoupled.
2. Decoupling is achieved by a rotation with angle $\theta$.
3. In the new coordinates $(r_1, r_2)$ the system is decoupled: input from direction $r_1$ does not have output along direction $r_2$. These directions are the principal axes.
4. For a 2-DOF system the principal axes always exist.
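For a given symmetric 2×2 stiffness matrix, the decoupling rotation angle has a closed form: a plane rotation with $\tan 2\theta = 2k_{12}/(k_{11}-k_{22})$ zeros the off-diagonal terms of $U^T K U$ (this is the standard Jacobi-rotation formula, not stated explicitly in the slides). A sketch with a hypothetical stiffness matrix, NumPy assumed:

```python
import numpy as np

# Hypothetical monic 2-DOF stiffness matrix (symmetric)
K = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Rotation angle that annihilates the off-diagonal term:
# tan(2*theta) = 2*k12 / (k11 - k22)
theta = 0.5 * np.arctan2(2.0 * K[0, 1], K[0, 0] - K[1, 1])
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

Omega2 = U.T @ K @ U              # diagonal: diag(wn1^2, wn2^2)
wn = np.sqrt(np.diag(Omega2))     # natural frequencies of the decoupled system

print(np.round(Omega2, 12))       # off-diagonal entries ~0
print(wn)
```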

Decoupling of Monic M-DOF System, Principal Axes

1. The directions $(1, 2)$ above are the principal axes.
2. Generally, an arbitrary matrix $K$ cannot be decoupled as above (such a $U$ does not exist).
3. But for a symmetric $K$ there always exists an orthogonal matrix $U$ such that $U^T K U = \Omega^2$, with diagonal entries $\omega_i^2$ (in general $\omega_i \neq \omega_j$ for $i \neq j$): a symmetric $K$ is always decouplable.

An n×n orthonormal $U$ can be assembled by embedding the 2×2 rotation into the identity matrix (a plane rotation):

$$U_{n \times n} = \begin{bmatrix} \cos\theta & -\sin\theta & \\ \sin\theta & \cos\theta & \\ & & I \end{bmatrix}$$

acting on the symmetric matrix partitioned conformably as

$$K = K^T = \begin{bmatrix} k_{11} & K_{12} \\ K_{21} & K_{22} \end{bmatrix}$$

Each such rotation can be chosen to zero one off-diagonal pair of $K$; sweeping over the $n(n-1)/2$ off-diagonal pairs diagonalizes $K$.
Decoupling of Monic M-DOF Systems

$$I\ddot{x} + Kx = 0$$

Let $x = U r$, where $U^T K U = \Omega^2$. Then

$$\ddot{r} + \Omega^2 r = 0$$

$$\begin{bmatrix} \ddot{r}_1 \\ \vdots \\ \ddot{r}_n \end{bmatrix} + \begin{bmatrix} \omega_{n1}^2 & & \\ & \ddots & \\ & & \omega_{nn}^2 \end{bmatrix} \begin{bmatrix} r_1 \\ \vdots \\ r_n \end{bmatrix} = 0$$

that is,

$$\ddot{r}_1 + \omega_{n1}^2 r_1 = 0, \quad \ldots, \quad \ddot{r}_n + \omega_{nn}^2 r_n = 0$$
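For a general symmetric $K$, the orthonormal $U$ is simply the matrix of eigenvectors, and `numpy.linalg.eigh` returns exactly this factorization. A minimal sketch with a hypothetical 3-DOF stiffness matrix:

```python
import numpy as np

# Hypothetical monic 3-DOF stiffness matrix (symmetric spring chain)
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])

# eigh: eigenvalues w2 (ascending) and orthonormal eigenvector columns U
w2, U = np.linalg.eigh(K)

Omega2 = U.T @ K @ U                     # diagonal matrix of omega_ni^2
print(np.allclose(Omega2, np.diag(w2)))  # True: the system decouples
print(np.sqrt(w2))                       # natural frequencies omega_ni
```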
Vector Space

For scalars $a$, $b$ and vectors $u$, $v$:

$$a(u + v) = au + av, \qquad (a+b)u = au + bu, \qquad a(bu) = (ab)u, \qquad 1u = u$$

If $u$ and $v$ are n×1 vectors, so is $w = au + bv$.

Q1: In the n×1 vector space, how many vectors exist?
Q2: Using a sum of vectors $a_1 u_1, a_2 u_2, \ldots$ to represent an n×1 vector $w$, how many of them are needed?
Linear Independence

Suppose we have two n×1 vectors $v$ and $u$. If $v = bu$ for some scalar $b$, the two vectors are linearly dependent.

A set of n×1 vectors $u_i$ is linearly dependent if there exist n scalars $b_1, b_2, \ldots, b_n$ that are not all zero, such that

$$b_1 u_1 + b_2 u_2 + \cdots + b_n u_n = 0$$

where $0$ is a null n×1 vector. If the above equation does not hold for any such scalars, then the set of $u_i$ is linearly independent.

In that case any linear combination of the $u_i$ with coefficients not all zero is a nonzero vector $v$:

$$b_1 u_1 + b_2 u_2 + \cdots + b_n u_n = v \neq 0$$

For example, $b_1 e_1 + b_2 e_2 + \cdots + b_n e_n = 0$ does not hold unless all $b_i = 0$, so the unit vectors $e_i$ are linearly independent.
Indeed,

$$b_1 e_1 + b_2 e_2 + \cdots + b_n e_n = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} = I \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} \neq 0$$

unless every $b_i$ is zero, so the columns of $I = [e_1 \; e_2 \; \cdots \; e_n]$ are linearly independent.

For an n×n matrix $A$, the following statements are equivalent:
1. rank($A$) = n
2. det($A$) ≠ 0
3. $A$ is non-singular
4. $A^{-1}$ exists
5. All column (row) vectors of $A$ are linearly independent
Any vector $v = [v_1, v_2, \ldots, v_n]^T$ can be expanded in n linearly independent vectors $u_i$:

$$\begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$

$$\begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} = \begin{bmatrix} u_1 & u_2 & \cdots & u_n \end{bmatrix}^{-1} \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$
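This expansion can be sketched numerically (hypothetical basis and vector, NumPy assumed); in practice one solves the linear system rather than forming the inverse explicitly:

```python
import numpy as np

# Hypothetical basis: columns u1, u2, u3 are linearly independent
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 1.0])

# A^{-1} exists: rank(A) = n and det(A) != 0
print(np.linalg.matrix_rank(A))   # 3
print(np.linalg.det(A))           # 1.0 (non-zero)

# Coefficients b such that v = b1*u1 + b2*u2 + b3*u3
b = np.linalg.solve(A, v)
print(np.allclose(A @ b, v))      # True
```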
Q1: Again, what is the condition that $[u_1 \; u_2 \; \cdots \; u_n]^{-1}$ exists?

Q2: What is the essence of $u_i$ and $u_j$ being linearly independent?

Q3: Linear independence and orthogonality: which one is more important?
Natural Frequency and Mode Shape

From the eigenproblem

$$\omega_{ni}^2\, u_i = (M^{-1}K)\, u_i$$

each eigenpair $(\omega_{ni}^2, u_i)$ gives a natural frequency $\omega_{ni}$ and a mode shape $u_i$.

Generally, each $\omega_{ni}$ corresponds to one mode shape $u_i$.
Degenerate modes: a repeated $\omega_{ni}$ corresponds to several mode shapes $u_i$, $u_j$ (and any combination of $u_i$, $u_j$).
Rigid-body motion: $\omega_{n1} = 0$, with mode shape $u_1$.

Frequency: a scalar, easy to measure.
Mode shape: a vector, subject to spatial aliasing.
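The rigid-body case can be illustrated numerically (hypothetical free-free spring chain, NumPy assumed): an unconstrained stiffness matrix has a zero eigenvalue, and the corresponding mode shape moves all coordinates together.

```python
import numpy as np

# Hypothetical unconstrained (free-free) 3-DOF stiffness matrix: rows sum to zero
K = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])

w2, U = np.linalg.eigh(K)              # eigenvalues ascending
wn = np.sqrt(np.clip(w2, 0.0, None))   # guard tiny negative round-off before sqrt

print(wn[0])                           # ~0: rigid-body mode, omega_n1 = 0
print(U[:, 0])                         # mode shape u1: all entries equal (up to sign)
```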
Mode Shape and Eigenvectors
Essence of the eigenvector
Vector vs. eigenvector: the concept of direction
Why do we have eigenvectors?
Eigenvalue vs. eigenvector
General cases
Non-uniqueness of eigenvectors

Mode Shape Functions
For a proportionally damped system, eigenvector = mode shape function
Special Mode Shapes

Example of mode shapes
Q: In a hammer test, which locations should we use? Supporting points?
Q: Mode shape: what kinds can you imagine?

(Figure: mode-shape amplitudes $X_{11}, X_{21}, X_{31}, X_{41}, X_{51}, X_{61}, X_{71}$ measured at seven locations.)
Rigid body motion: $\omega_1 = 0$
Repeated eigenvalues: $\lambda_i = \lambda_j$, $\omega_i = \omega_j$

Special Mode Shapes
Local mode (Figure: Ford GT)
Local mode (contd): roll over of a car (Figure: centripetal acceleration)
(Figures: debris fence; penstock of a hydroelectric dam)