JAYANTA BISWAS
Department of Mathematics, Barasat Government College, Kolkata, West Bengal, India
ABSTRACT
Traditional matrix algebra has many well-known properties; some of these are studied and extended in [1-5]. Here we extend some more properties to the extended matrix algebra on M(F), the set of all matrices over a given field F [1, 3]. Also, following [1-11], we are motivated to introduce some new properties of the extended matrix algebra on M(F).
Notations
(iv) 0_{m×n} denotes the m×n matrix in M(F) all of whose entries are zero.
(v) If A_{m×n} = [a_{ij}]_{m×n} ∈ M(F) and p, q are positive integers such that p ≤ m, q ≤ n, then A_{p×q} = [a_{ij}]_{p×q}.
1. INTRODUCTION

Matrix algebra and linear spaces of linear transformations of vector spaces (considering linear transformations as matrices) are extended in [1, 3] as follows. Let A = [a_{ij}]_{m×n}, B = [b_{ij}]_{p×q} ∈ M(F), and put r = max{m, p}, s = max{n, q}, t = max{n, p}, where

â_{ij} = a_{ij}, for i = 1, 2, …, m; j = 1, 2, …, n, and â_{ij} = 0 otherwise,

and

b̂_{ij} = b_{ij}, for i = 1, 2, …, p; j = 1, 2, …, q, and b̂_{ij} = 0 otherwise.

Define 'addition' of matrices in M(F) by A + B = [â_{ij} + b̂_{ij}]_{r×s}, and define 'multiplication' of matrices in M(F) by AB = [â_{ij}]_{m×t} [b̂_{ij}]_{t×q}, for all A, B ∈ M(F).
www.iaset.us editor@iaset.us
84 Jayanta Biswas
For n ∈ ℕ, we define I_n as I_n = [δ_{ij}]_{n×n}, where for i = 1, 2, …, n; j = 1, 2, …, n, δ_{ij} = 1 if i = j and δ_{ij} = 0 if i ≠ j.
Example (1.1): Let

A = [ 2   1   0  −1   4
      0   5   2   3   0
     −1   3  −5   7   2 ]   and   B = [ 1   3
                                        0   1
                                       −1   2
                                        2   1 ].

Then

A + B = [ 3   4   0  −1   4
          0   6   2   3   0
         −2   5  −5   7   2
          2   1   0   0   0 ]   and   AB = [  0   6
                                              4  12
                                             18  −3 ].
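The padding rules above can be sketched in code and checked against Example (1.1); the helper names pad, ext_add and ext_mul are illustrative, not notation from [1, 3].

```python
def pad(M, r, s):
    """Embed M in the top-left corner of an r x s zero matrix."""
    return [[M[i][j] if i < len(M) and j < len(M[0]) else 0
             for j in range(s)] for i in range(r)]

def ext_add(A, B):
    """Extended addition: pad both matrices to order max(m,p) x max(n,q)."""
    r, s = max(len(A), len(B)), max(len(A[0]), len(B[0]))
    PA, PB = pad(A, r, s), pad(B, r, s)
    return [[PA[i][j] + PB[i][j] for j in range(s)] for i in range(r)]

def ext_mul(A, B):
    """Extended multiplication: pad the inner dimension to t = max(n,p)."""
    t = max(len(A[0]), len(B))
    PA, PB = pad(A, len(A), t), pad(B, t, len(B[0]))
    return [[sum(PA[i][k] * PB[k][j] for k in range(t))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1, 0, -1, 4], [0, 5, 2, 3, 0], [-1, 3, -5, 7, 2]]  # 3x5
B = [[1, 3], [0, 1], [-1, 2], [2, 1]]                       # 4x2
print(ext_add(A, B))  # 4x5 matrix whose last row is [2, 1, 0, 0, 0]
print(ext_mul(A, B))  # [[0, 6], [4, 12], [18, -3]]
```

The product is of order 3×2 because the inner dimensions 5 and 4 are both padded to 5 before multiplying.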
Let R be a non-empty set on which two binary operations, 'addition' and 'multiplication', are defined. Then the algebraic structure (R, +, ·) is said to be a weak hemi-ring if
(i) (R, +) is a commutative monoid, whose identity element is called 'zero' and denoted by 0;
(ii) (R, ·) is a semi-group.
The algebraic structure (M(F), +, ·) is a weak hemi-ring having zero 0_{1×1} = [0]_{1×1}.
Let k ∈ ℕ and A_{m×n} = [a_{ij}]_{m×n} ∈ M(F) be arbitrary; the power A^k is defined inductively in the extended algebra by A^1 = A and A^k = A(A^{k−1}). Now, for p ∈ ℕ, it can easily be proved that:

(i) A_{m×n} I_p = A_{m×p} and I_p A_{m×n} = A_{p×n}, if p ≤ m, p < n;
(ii) A_{m×n} I_p = [A_{m×n}  0_{m×(p−n)}] (A_{m×n} with p − n zero columns adjoined) and I_p A_{m×n} = A_{p×n}, if p ≤ m, p > n;
(iii) A_{m×n} I_p = A_{m×p} and I_p A_{m×n} is A_{m×n} with p − m zero rows adjoined, if p > m, p < n;
(iv) A_{m×n} I_p = [A_{m×n}  0_{m×(p−n)}] and I_p A_{m×n} is A_{m×n} with p − m zero rows adjoined, if p > m, p > n.

In particular, A_{m×n} I_p = I_p A_{m×n} = A_{m×n} iff p = m = n.
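These truncation and padding effects of multiplying by an identity matrix can be checked directly; the helper names below (pad, ext_mul, identity) are illustrative.

```python
def pad(M, r, s):
    return [[M[i][j] if i < len(M) and j < len(M[0]) else 0
             for j in range(s)] for i in range(r)]

def ext_mul(A, B):
    # extended multiplication: inner dimension padded to t = max(n, p)
    t = max(len(A[0]), len(B))
    PA, PB = pad(A, len(A), t), pad(B, t, len(B[0]))
    return [[sum(PA[i][k] * PB[k][j] for k in range(t))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(p):
    return [[1 if i == j else 0 for j in range(p)] for i in range(p)]

A = [[2, 1, 0, -1, 4], [0, 5, 2, 3, 0], [-1, 3, -5, 7, 2]]  # 3x5

# p < n: right-multiplying by I_p keeps only the first p columns
print(ext_mul(A, identity(2)))   # [[2, 1], [0, 5], [-1, 3]]
# p < m: left-multiplying by I_p keeps only the first p rows
print(ext_mul(identity(2), A))
# p > n: right-multiplying by I_p adjoins p - n zero columns
print(ext_mul(A, identity(6)))
# p > m: left-multiplying by I_p adjoins p - m zero rows
print(ext_mul(identity(4), A))
```

The checks confirm that a single two-sided identity for A exists only when A is square.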
For any two matrices A_{m×n}, B_{p×q} ∈ M(F),

(A_{m×n} + B_{p×q})^T = A_{m×n}^T + B_{p×q}^T   and   (A_{m×n} B_{p×q})^T = B_{p×q}^T A_{m×n}^T.
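Both transpose identities can be verified numerically on the matrices of Example (1.1); the helper names are illustrative.

```python
def pad(M, r, s):
    return [[M[i][j] if i < len(M) and j < len(M[0]) else 0
             for j in range(s)] for i in range(r)]

def ext_add(A, B):
    r, s = max(len(A), len(B)), max(len(A[0]), len(B[0]))
    PA, PB = pad(A, r, s), pad(B, r, s)
    return [[PA[i][j] + PB[i][j] for j in range(s)] for i in range(r)]

def ext_mul(A, B):
    t = max(len(A[0]), len(B))
    PA, PB = pad(A, len(A), t), pad(B, t, len(B[0]))
    return [[sum(PA[i][k] * PB[k][j] for k in range(t))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[2, 1, 0, -1, 4], [0, 5, 2, 3, 0], [-1, 3, -5, 7, 2]]
B = [[1, 3], [0, 1], [-1, 2], [2, 1]]

lhs_add = transpose(ext_add(A, B))
rhs_add = ext_add(transpose(A), transpose(B))
lhs_mul = transpose(ext_mul(A, B))
rhs_mul = ext_mul(transpose(B), transpose(A))
print(lhs_add == rhs_add, lhs_mul == rhs_mul)  # True True
```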
Let A_{m×n} = [a_{ij}]_{m×n} ∈ M(F). If m ≤ n, then for i = 1, 2, …, m; j = i, i + 1, …, i + n − m, the a_{ij}'s are called the diagonal elements of A_{m×n}. If m > n, then for j = 1, 2, …, n; i = j, j + 1, …, j + m − n, the a_{ij}'s are called the diagonal elements of A_{m×n}.
In each case, the portion of A_{m×n} formed by these diagonal elements is called the diagonal of A_{m×n}.
All the elements of A_{m×n}, other than the diagonal elements, are called the non-diagonal elements of A_{m×n}.
(a) A matrix A_{m×n} = [a_{ij}]_{m×n} ∈ M(F) is said to be symmetric if, when m ≤ n, then for i = 2, 3, …, m; j = 1, 2, …, m − 1, if i > j then a_{ij} = a_{j,i+n−m}, and when m ≥ n, then for j = 2, 3, …, n; i = 1, 2, …, n − 1, if i < j then a_{ij} = a_{j+m−n,i}.
(b) A matrix A_{m×n} = [a_{ij}]_{m×n} ∈ M(F) is said to be skew-symmetric if all the diagonal elements of A_{m×n} are zero and, when m ≤ n, then for i = 2, 3, …, m; j = 1, 2, …, m − 1, if i > j then a_{ij} = −a_{j,i+n−m}, and when m ≥ n, then for j = 2, 3, …, n; i = 1, 2, …, n − 1, if i < j then a_{ij} = −a_{j+m−n,i}.
(c) A matrix A_{m×n} = [a_{ij}]_{m×n} ∈ M(F) is said to be weak skew-symmetric if, when m ≤ n, then for i = 2, 3, …, m; j = 1, 2, …, m − 1, if i > j then a_{ij} = −a_{j,i+n−m}, and when m ≥ n, then for j = 2, 3, …, n; i = 1, 2, …, n − 1, if i < j then a_{ij} = −a_{j+m−n,i}.
Example (1.2)

In the real matrices

A = [ 2   1  −3   4   1
      4   0   5  −2   7
      1   7   3   0   6 ],   B = [  0   4  −1
                                   −4   0   2
                                    1  −2   0 ],   C = [  1   2  −3   5
                                                          3   4   6  −7
                                                         −5   7   8   9 ],

the diagonal elements of A are a_{11}, a_{12}, a_{13}, a_{22}, a_{23}, a_{24}, a_{33}, a_{34}, a_{35} (and similarly for B and C), and the remaining elements are the non-diagonal elements. Also, A is symmetric, B is skew-symmetric and C is weak skew-symmetric.
(iii) If A, B are weak skew-symmetric, then the matrix A + B is not, in general, weak skew-symmetric.
(iv) If A, B are two symmetric (or skew-symmetric) matrices of the same order such that AB = BA, then AB may not be a symmetric (or skew-symmetric) matrix.
For any square symmetric matrix A and any matrix P in M(F), P^T A P is a symmetric matrix; but the result fails to hold good if A is a non-square symmetric matrix.
For m ≠ n,

A_{m×n} ≠ 2^{−1}(A_{m×n} + A_{m×n}^T) + 2^{−1}(A_{m×n} − A_{m×n}^T),

as the matrix on the right hand side is of order max{m, n} × max{m, n} ≠ m × n, provided char(F) ≠ 2. Therefore a natural question arises: is it possible to express a matrix over a field as a sum of a symmetric matrix and a skew-symmetric matrix? The answer to this question is affirmative, given by means of the following proposition.
Proof

Let A = [a_{ij}]_{m×n} ∈ M(F) be arbitrary. If m = n, then the result is obvious, as A = 2^{−1}(A + A^T) + 2^{−1}(A − A^T). Suppose m < n and A = B + C, where B, C ∈ M(F) are m×n symmetric and skew-symmetric matrices, respectively. Since the diagonal elements of C are zero, the diagonal elements of B and A are the same; thus the diagonal elements of B are determined.
Now for the non-diagonal elements of B and C we have, for i = 2, 3, …, m; j = 1, 2, …, m − 1, if i > j,
b_{ij} = b_{j,i+n−m}   (3)
and c_{ij} = −c_{j,i+n−m}   (4)
Also a_{ij} = b_{ij} + c_{ij}   (5)
and a_{j,i+n−m} = b_{j,i+n−m} + c_{j,i+n−m}   (6)
Solving (3)-(6) we get
b_{ij} = 2^{−1}(a_{ij} + a_{j,i+n−m})   (7)
c_{ij} = 2^{−1}(a_{ij} − a_{j,i+n−m})   (8)
b_{j,i+n−m} = b_{ij} = 2^{−1}(a_{ij} + a_{j,i+n−m})  (by (7))   (9)
c_{j,i+n−m} = −c_{ij} = −2^{−1}(a_{ij} − a_{j,i+n−m})  (by (8))   (10)
If m > n, then similarly we obtain B and C, which gives A_{m×n} = B_{m×n} + C_{m×n}.   (11)
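The half-sum and half-difference formulas above translate into a short construction; this sketch assumes m ≤ n (the case treated first in the proof), uses exact rational arithmetic, and the function name is illustrative.

```python
from fractions import Fraction

def sym_skew_parts(A):
    """Split an m x n matrix (m <= n assumed in this sketch) as A = B + C,
    with B symmetric and C skew-symmetric in the band sense defined above."""
    m, n = len(A), len(A[0])
    B = [[Fraction(0)] * n for _ in range(m)]
    C = [[Fraction(0)] * n for _ in range(m)]
    for i in range(m):                       # band: B copies A, C stays zero
        for j in range(i, i + n - m + 1):
            B[i][j] = Fraction(A[i][j])
    for i in range(1, m):                    # mirror pairs (i,j) <-> (j, i+n-m)
        for j in range(i):
            jm = i + n - m                   # mirror column index
            B[i][j] = B[j][jm] = Fraction(A[i][j] + A[j][jm], 2)
            C[i][j] = Fraction(A[i][j] - A[j][jm], 2)
            C[j][jm] = -C[i][j]
    return B, C

A = [[2, 1, -1, -1, 5], [-6, 3, 2, 5, 0], [4, -1, 3, 2, 6]]
B, C = sym_skew_parts(A)
print(B[1][0], C[1][0])   # -7/2 -5/2
```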
To prove that the expression is not unique, it is sufficient to consider some examples. Consider the real matrix

A = [ 1   1   3   4
      4   2   1   3
     −3   0   4   0
      2   1  −2   0 ].

Then A = 2^{−1}(A + A^T) + 2^{−1}(A − A^T)   (13)

and 2^{−1}(A + A^T) is symmetric and 2^{−1}(A − A^T) is skew-symmetric.

Again, we see that

A = [ 1   1   2          [  0   0   1   4
      5   2   1      +     −1   0   0   3
      1   3   4            −4  −3   0   0 ]
      2   1  −2 ]

(14)

and the first matrix on the right hand side of (14) is symmetric (of order 4×3) and the second one is skew-symmetric (of order 3×4).
Again, consider the real matrix

A = [ 2   1  −1  −1   5
     −6   3   2   5   0
      4  −1   3   2   6 ].

Then

A = [ 2     1     −1   −7/2   9/2          [  0     0    0    5/2   1/2
     −7/2   3      2    5    −1/2     +      −5/2   0    0    0     1/2
      9/2  −1/2    3    2     6 ]            −1/2  −1/2  0    0     0 ]

(15)

and the first matrix on the right hand side of (15) is symmetric and the second one is skew-symmetric.
Again,

A = [ 2   1  −5  −2   5          [  0   0   4   1
     −2   3   2   4   0      +     −4   0   0   1
      5   0   3   2   6 ]          −1  −1   0   0 ]

(16)

and the first matrix on the right hand side of (16) is symmetric (of order 3×5) and the second one is skew-symmetric (of order 3×4).
From the proof of theorem (1.12) we observe that any m×n matrix over a field F can be expressed as the sum of a symmetric and a skew-symmetric matrix of the same order. Therefore a natural question arises: is it always possible to express a matrix A_{r×s} as A_{r×s} = B_{p×q} + C_{u×v}, where p, q, u, v are given positive integers such that r = max{p, u} and s = max{q, v}, B_{p×q} is symmetric, C_{u×v} is skew-symmetric and (p, q) ≠ (u, v)? The answer to this question is given by means of the following proposition.
Again, from theorem (1.12) it is observed that, if we wish to express A_{m×n} as A_{m×n} = B_{m×n} + C_{m×n} only, where B_{m×n} is symmetric and C_{m×n} is skew-symmetric, then this expression is unique.
Let r = max{p, u} and s = max{q, v} with (p, q) ≠ (u, v). Then A_{r×s} can be expressed as the sum of a symmetric matrix of order p×q and a skew-symmetric matrix of order u×v when the four integers p, q, u, v satisfy suitable chains of inequalities (cases (i)-(iv)), and this expression is not possible, in general, in the remaining cases.
For k ∈ ℕ,

(I + A_{m×n})^k = I + (2^k − 1) A_{m×n},

provided A_{m×n}^2 = A_{m×n}.
The trace of a matrix A_{m×n} ∈ M(F), denoted by tr(A_{m×n}), is defined as the sum of all diagonal elements of A_{m×n}.
(i) tr(A_{m×n} + B_{p×q}) ≠ tr(A_{m×n}) + tr(B_{p×q}), in general.
(ii) tr(A_{m×n} B_{p×q}) ≠ tr(B_{p×q} A_{m×n}), in general.
Again, in [4] it is stated that, for a given A_{m×n} ∈ M(F), if there exists B_{p×q} ∈ M(F) such that A_{m×n} B_{p×q} = B_{p×q} A_{m×n} = I, then B_{p×q} is unique; we call B_{p×q} the inverse of A_{m×n} and denote it by A_{m×n}^{−1}. Also in [4], the Sherman-Morrison rank one update formula and the Sherman-Morrison-Woodbury formula [8] of traditional matrix algebra are extended to the extended matrix algebra on M(F).
Let u be a column matrix over F, possibly padded below by zeros, and let v be a row matrix over F. The extended Sherman-Morrison rank one update formula states that

(A_{m×n} + uv)^{−1} = A_{m×n}^{−1} − (A_{m×n}^{−1} u v A_{m×n}^{−1}) / (1 + v A_{m×n}^{−1} u),

provided (A_{m×n} + uv)^{−1} and A_{m×n}^{−1} both exist and 1 + v A_{m×n}^{−1} u ≠ 0; corollary (1.20) records the corresponding statement when m > n and the column u is padded with zeros.

For U ∈ M_{n×k}(F) and V ∈ M_{k×n}(F), the extended Sherman-Morrison-Woodbury formula states that

(A_{m×n} + UV)^{−1} = A_{m×n}^{−1} − A_{m×n}^{−1} U (I_k + V A_{m×n}^{−1} U)^{−1} V A_{m×n}^{−1},

provided (A_{m×n} + UV)^{−1}, A_{m×n}^{−1} and (I_k + V A_{m×n}^{−1} U)^{−1} exist.
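As a numerical sanity check of the rank one update formula in the square case, where the extended and traditional algebras coincide, one can compare the update against a direct inverse; the matrix and vectors below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)   # comfortably invertible
u = rng.standard_normal((4, 1))
v = rng.standard_normal((1, 4))

Ainv = np.linalg.inv(A)
denom = 1.0 + (v @ Ainv @ u).item()
update = Ainv - (Ainv @ u @ v @ Ainv) / denom     # Sherman-Morrison update
direct = np.linalg.inv(A + u @ v)                 # inverse computed from scratch
print(np.allclose(update, direct))
```

The update costs only matrix-vector products once A^{-1} is known, which is the practical point of the formula.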
2. MAIN RESULTS

Theorem (2.1)

(i) tr(A_{m×n} + B_{m×n}) = tr(A_{m×n}) + tr(B_{m×n}).
(ii) tr(A_{m×n} B_{m×n}) ≠ tr(B_{m×n} A_{m×n}), in general.
(iii) tr(A_{m×n} C_{n×m}) = tr(C_{n×m} A_{m×n}).
(iv) tr(A_{m×n} + C_{n×m}) ≠ tr(A_{m×n}) + tr(C_{n×m}), in general.
Proof
Let A_{m×n} = [a_{ij}]_{m×n}, B_{m×n} = [b_{ij}]_{m×n} and C_{n×m} = [c_{ij}]_{n×m}.

(i) We have A_{m×n} + B_{m×n} = [t_{ij}]_{m×n}, where t_{ij} = a_{ij} + b_{ij}   (17)
From (17) it is clear that the diagonal elements of A_{m×n} + B_{m×n} are the sums of the corresponding diagonal elements of A_{m×n} and B_{m×n}.
Hence tr(A_{m×n} + B_{m×n}) = tr(A_{m×n}) + tr(B_{m×n}).
(ii) Consider the real matrices

A = [ 1   0  −1   2
     −2   2   1   0
      3   1   2   4 ]   and   B = [ 2   3   1   0
                                    3   0  −2   2
                                   −1   2   4   3 ].

Then

AB = [ 3   1  −3  −3
       1  −4  −2   7
       7  13   9   8 ]   and   BA = [ −1   7   3   8
                                      −3  −2  −7  −2
                                       7   8  11  14 ],

so that tr(AB) = 3 + 1 − 4 − 2 + 9 + 8 = 15 ≠ 22 = −1 + 7 − 2 − 7 + 11 + 14 = tr(BA).
(iii) We have A_{m×n} C_{n×m} = [e_{ij}]_{m×m}, where e_{ij} = Σ_{k=1}^{n} a_{ik} c_{kj}   (18)
Again C_{n×m} A_{m×n} = [f_{ij}]_{n×n}, where f_{ij} = Σ_{k=1}^{m} c_{ik} a_{kj}   (19)
Then tr(A_{m×n} C_{n×m}) = Σ_{i=1}^{m} Σ_{k=1}^{n} a_{ik} c_{ki} = Σ_{k=1}^{n} Σ_{i=1}^{m} c_{ki} a_{ik} = tr(C_{n×m} A_{m×n}) (by (19)).
Therefore tr(A_{m×n} C_{n×m}) = tr(C_{n×m} A_{m×n}).
(iv) Consider the real matrices

A = [ 1   0  −1   2
     −2   2   1   0
      3   1   2   4 ]   and   C = [ 2   1   3
                                    1   3   4
                                    5   2   1
                                    2   4   3 ].

Then

A + C = [ 3   1   2   2
         −1   5   5   0
          8   3   3   4
          2   4   3   0 ].

Now tr(A) = 1 + 0 + 2 + 1 + 2 + 4 = 10, tr(C) = 2 + 1 + 3 + 2 + 1 + 3 = 12 and tr(A + C) = 3 + 5 + 3 + 0 = 11 ≠ 22 = tr(A) + tr(C).
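The counterexamples in (ii) and (iv) can be reproduced mechanically with a band-diagonal trace; the helper names are illustrative.

```python
def pad(M, r, s):
    return [[M[i][j] if i < len(M) and j < len(M[0]) else 0
             for j in range(s)] for i in range(r)]

def ext_add(A, B):
    r, s = max(len(A), len(B)), max(len(A[0]), len(B[0]))
    PA, PB = pad(A, r, s), pad(B, r, s)
    return [[PA[i][j] + PB[i][j] for j in range(s)] for i in range(r)]

def ext_mul(A, B):
    t = max(len(A[0]), len(B))
    PA, PB = pad(A, len(A), t), pad(B, t, len(B[0]))
    return [[sum(PA[i][k] * PB[k][j] for k in range(t))
             for j in range(len(B[0]))] for i in range(len(A))]

def tr(A):
    """Sum of the band-diagonal elements of A."""
    m, n = len(A), len(A[0])
    if m <= n:
        return sum(A[i][j] for i in range(m) for j in range(i, i + n - m + 1))
    return sum(A[i][j] for j in range(n) for i in range(j, j + m - n + 1))

A = [[1, 0, -1, 2], [-2, 2, 1, 0], [3, 1, 2, 4]]
B = [[2, 3, 1, 0], [3, 0, -2, 2], [-1, 2, 4, 3]]
C = [[2, 1, 3], [1, 3, 4], [5, 2, 1], [2, 4, 3]]
print(tr(ext_mul(A, B)), tr(ext_mul(B, A)))   # 15 22
print(tr(A) + tr(C), tr(ext_add(A, C)))       # 22 11
```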
Definition (2.1)
A matrix A_{m×n} ∈ M(F) is said to be a diagonal matrix if its non-diagonal elements are all zero.
Theorem (2.2)
If A, B ∈ M(F) are of the same order such that A is a diagonal matrix with no two diagonal elements equal, and B commutes with A, then B may not be a diagonal matrix.
Proof

Consider the real matrix

A = [ −4   4  −1   0   0
       0  −3   3   0   0
       0   0   1  −2   2 ].

Clearly, A is a diagonal matrix and all the diagonal elements of A are distinct, and a direct computation exhibits a non-diagonal real matrix B of order 3×5 satisfying AB = BA in the extended algebra, which proves the claim.
We know that any positive integral power of a square symmetric matrix is symmetric; any positive even integral power of a square skew-symmetric matrix is symmetric; and any positive odd integral power of a square skew-symmetric matrix is skew-symmetric. Let us see whether this result holds good in extended matrix algebra.
Theorem (2.3)
(i) A positive integral power of a symmetric matrix in M(F) may not be symmetric.
(ii) A positive even integral power of a skew-symmetric matrix in M(F) may not be symmetric, and a positive odd integral power of a skew-symmetric matrix in M(F) may not be skew-symmetric.
Proof
(i) For example, consider the real symmetric matrix

A = [ 1   2   3   4   5
      4  −1   2   0   3
      5   3   1  −2   4 ].

Then

A² = [ 24   9  10  −2  23
       10  15  12  12  25
       22  10  22  18  38 ],

which is not symmetric.

(ii) Consider the real skew-symmetric matrix

B = [  0   0   3   4   5
      −3   0   0  −6   7
      −4   6   0   0  −8
      −5  −7   8   0   0 ].

Then B², computed in the extended algebra, has (2, 1) entry 30 but (1, 3) entry 32, so B² is not symmetric.
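The failure in part (i) can be verified by computing A² in the extended algebra and testing the band-reflection condition; the helper names are illustrative and the symmetry test covers only the m ≤ n case needed here.

```python
def pad(M, r, s):
    return [[M[i][j] if i < len(M) and j < len(M[0]) else 0
             for j in range(s)] for i in range(r)]

def ext_mul(A, B):
    t = max(len(A[0]), len(B))
    PA, PB = pad(A, len(A), t), pad(B, t, len(B[0]))
    return [[sum(PA[i][k] * PB[k][j] for k in range(t))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_symmetric(A):          # band reflection test, m <= n case only
    m, n = len(A), len(A[0])
    return all(A[i][j] == A[j][i + n - m]
               for i in range(1, m) for j in range(i))

A = [[1, 2, 3, 4, 5], [4, -1, 2, 0, 3], [5, 3, 1, -2, 4]]
A2 = ext_mul(A, A)
print(is_symmetric(A), is_symmetric(A2))   # True False
print(A2[0])                               # [24, 9, 10, -2, 23]
```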
Definition (2.2)

A matrix S ∈ M(F) is said to be involutory if S² = I. In traditional matrix algebra, for any square matrix A over F, the block matrix

S = [ A    I − A²
      I      −A   ]

is involutory. In theorem (2.4) we shall study this property in extended matrix algebra.

Theorem (2.4)

For A_{m×n} ∈ M(F), the block matrix S = [ A_{m×n}  I − A_{m×n}² ; I  −A_{m×n} ], formed in the extended algebra, may not be involutory.
Proof

Take an explicit non-square real matrix A_{m×n}; forming the corresponding block matrix S in the extended algebra and computing S² directly, one finds S² ≠ I, so S is not involutory.
Note (2.1)

We know in traditional matrix algebra that, if a square matrix A over a field F of characteristic zero is an orthogonal matrix, i.e., if A A^T = I, then A A^T = A^T A = I, and the set of row vectors and the set of column vectors of A are orthogonal sets. Here we introduce in definition (2.3) the notions of right orthogonal and left orthogonal matrices, and in theorem (2.5) we study the properties of orthogonal matrices in extended matrix algebra.
Definition (2.3)

A matrix A_{m×n} ∈ M(F) is said to be right orthogonal if A_{m×n} A_{m×n}^T = I_m, and left orthogonal if A_{m×n}^T A_{m×n} = I_n.
Theorem (2.5)
(i) A right orthogonal matrix in M(F) may not be left orthogonal, and a left orthogonal matrix in M(F) may not be right orthogonal.
(ii) If a matrix A_{m×n} ∈ M(F) is right orthogonal, then the set of row vectors of A_{m×n} is an orthogonal set of vectors over the field F; and if A_{m×n} is left orthogonal, then the set of column vectors of A_{m×n} is an orthogonal set of vectors over the field F.
(iv) If a matrix A_{m×n} ∈ M(F) is right orthogonal, then it is a regular element in the weak hemi-ring (M(F), +, ·); and if it is left orthogonal, then it is also a regular element in the weak hemi-ring (M(F), +, ·).
Proof
Trivial.
Note (2.2)

We know in traditional matrix algebra that, given a system of n linear equations in n unknowns such that the coefficient matrix is non-singular, the system is consistent and has a unique solution. Also, we know that, given a system of m linear equations in n unknowns such that the rank of the coefficient matrix is equal to the rank of the augmented matrix, the system is consistent. In theorem (2.6) and corollary (2.6) we shall study this property in extended matrix algebra.
Theorem (2.6)

Let A_{m×n} = [a_{ij}]_{m×n} ∈ M(F), b = [b_1, b_2, …, b_m]^T ∈ M_{m×1}(F), let X = [x_1, x_2, …]^T be an unknown column matrix over the field F, and let char(F) = 0. Consider the system of linear equations

A_{m×n} X = b   (23)

Also, let there exist C_{p×p} = [c_{ij}]_{p×p} ∈ M(F) such that C_{p×p} A_{m×n} = I_{p×p}. Then, according to the relative magnitudes of m, n and p, (23) is in some cases consistent with a unique solution, in other cases consistent with many solutions, and in the remaining cases consistent only when the column C_{p×p} b is of the zero-padded form [d_1, d_2, …, d_r, 0, …, 0]^T, and inconsistent otherwise.
Proof

From (23) we get C_{p×p}(A_{m×n} X) = C_{p×p} b. This implies that (C_{p×p} A_{m×n}) X = d   (24)

where d = [d_1, d_2, …, d_p]^T and, for i = 1, 2, …, p, d_i = Σ_{k=1}^{min(m,p)} c_{ik} b_k.
Now, considering the several cases, the remaining part of the proof is a routine check.
Corollary (2.6): Let A_{m×n} = [a_{ij}]_{m×n} ∈ M(F), b = [b_1, b_2, …, b_m]^T ∈ M_{m×1}(F), let X be an unknown column matrix over the field F, and let char(F) = 0. Then the system of linear equations A_{m×n} X = b is consistent iff rank of A_{m×n} = rank of the augmented matrix Ā = [A_{m×n} | b]_{m×(n+1)}.
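The rank criterion of corollary (2.6) is easy to test numerically; the sketch below treats an ordinary (unpadded) real system, with illustrative data.

```python
import numpy as np

def is_consistent(A, b):
    """Rank test: A x = b is consistent iff rank(A) = rank([A | b])."""
    augmented = np.hstack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

A = np.array([[1.0, 2.0], [2.0, 4.0], [1.0, 1.0]])
b_good = np.array([[3.0], [6.0], [2.0]])   # solved by x = (1, 1)
b_bad = np.array([[3.0], [7.0], [2.0]])    # second equation contradicts the first
print(is_consistent(A, b_good), is_consistent(A, b_bad))  # True False
```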
Definition (2.4)
× ∈ and char(F) = 0. Let – ×> be an unknown vector. If there exists › ∈ such that the system of
equations × − ›@ × – ×> = ×> has a non-zero solution in , then › is called an eigen
Theorem (2.7)
Proof
Trivial.
CONCLUSIONS
REFERENCES
1. Jayanta Biswas, Extension of Matrix Algebra, Oral Presentation, The 9th International IMBIC Conference MSAST
2015, Kolkata, December 21-23, 2015; Abstracts, Volume 4, Page-279, ISBN 978-3-5, 2015.
2. Jayanta Biswas, Some properties of Extended Matrix Algebra, Oral Presentation, National Seminar on Recent
Developments In Mathematics And Its Applications (NSRDMA-2016), Department of Mathematics, University
of Kalyani, January 21-22, 2016; Abstracts, Page-40-41.
3. Jayanta Biswas, Extension of Matrix Algebra and Linear spaces of Linear Transformations, International Journal
of Novelty Research in Physics, Chemistry and Mathematics, ISSN-2394-9651, Vol-3, Issue 1, pp: 29-42, 2016.
4. Jayanta Biswas, Sherman-Morrison Rank One Update Formula and Sherman-Morrison-Woodbury Formula in
Extended Matrix Algebra, Oral Presentation, National Seminar On Recent Advances In Mathematics And Its
Applications (RAMA-2016), Department of Pure Mathematics, University of Calcutta, February 24-25, 2016;
Abstracts, Page 31-32.
5. Jayanta Biswas, Some More properties of Extended Matrix Algebra, Oral Presentation, National Seminar on
Analysis And Applications: Celebrating 100 years of the General Theory of Relativity, Department of
Mathematics, West Bengal State University, March 10-11, 2016; Abstracts, Page- 17-18.
8. Carl D. Meyer, Matrix Analysis and Applied Linear Algebra, Chapter 3: Matrix Algebra, pp 79-158, 2004.
9. E. H. Connell, Elements of Abstract And Linear Algebra, Chapter 4: Matrices and Matrix Rings, pp 53-64, 2004.
10. David C. Lay, Linear Algebra And Its Applications, Chapter 2: Matrix Algebra, pp 91-162, 4th Edition.
11. Robert A. Beezer, A First Course in Linear Algebra, Chapter M: Matrices, pp 197-302, Version 3.40.