
Problem Statement: In this paper we consider the case where the receiver performs two measurements y1(n) and y2(n), n = 0, 1, 2, 3, . . . , for each transmitted symbol s(n), i.e.,

y1(n) = B(q^-1) s(n) + w1(n)
y2(n) = C(q^-1) s(n) + w2(n)        - - - - - - - - eq(1)

for n ≥ 0. Here, q^-1 is the unit delay operator, while B(q^-1) and C(q^-1) are the transfer operators of the two parallel channels, given by

B(q^-1) = b0 + b1 q^-1 + . . . + bL q^-L
C(q^-1) = c0 + c1 q^-1 + . . . + cL q^-L        - - - - - - - - eq(2)

where L ≥ 0 is the channel order, and w1(n) and w2(n) are the respective receiver noise sequences. The objective is to identify the unknown parameters of B(q^-1) and C(q^-1) based only on the noisy measurements y1(n) and y2(n), n ≥ 0. In general, all quantities in y1(n) and y2(n) are complex numbers. The following definition is useful for the characterization of the random sequences {s(n)} and {wk(n)}, k = 1, 2, n ≥ 0.

Definition 1: Consider w(n) = wR(n) + j wI(n), j^2 = -1, where {wR(n)} and {wI(n)} are real sequences. We say that {w(n)} is a complex martingale difference sequence if both {wR(n)} and {wI(n)} are martingale difference sequences, i.e.,

E{wR(n+1) | Fn} = 0
E{wI(n+1) | Fn} = 0        (almost surely) - - - - - - - - eq(3)

where Fn is the σ-algebra generated by {wR(0), wR(1), . . . , wR(n), wI(0), wI(1), . . . , wI(n)}. We then write E{w(n+1) | Fn'} = 0 (almost surely), with Fn' = σ{w(0), w(1), w(2), . . . , w(n)}.

In the sequel, the following is assumed about the system model (1).


Assumption 1:
(i) {s(n)} is a martingale difference sequence, i.e., E{s(n+1) | Fn} = 0 (almost surely), satisfying

E{s(n+1) s(n+1)* | Fn} = σ_s^2        (almost surely) - - - - - - - - eq(4)

and

lim_{N→∞} (1/N) Σ_{n=1}^{N} |s(n)|^2 = σ_s^2        (almost surely) - - - - - - - - eq(5)

where Fn = σ{s(0), . . . , s(n)}.


(ii) {w1(n)} and {w2(n)} are martingale difference sequences satisfying

E{wk(n+1) wk(n+1)* | Fk,n} = σ_w^2        (almost surely) - - - - - - - - eq(6)

and

lim_{N→∞} (1/N) Σ_{n=1}^{N} |wk(n)|^2 = σ_w^2        (almost surely), k = 1, 2 - - - - - - - - eq(7)

where Fk,n = σ{wk(0), . . . , wk(n)}.


(iii) {s(n)}, {w1(n)} and {w2(n)} are mutually independent sequences.

Assumption 2: B(q^-1) and C(q^-1) are coprime polynomials.

Without loss of generality, we assume that all signals s(n), w1(n) and w2(n) are zero for n < 0. Note that Assumption 1 does not require {s(n)} and {wk(n)}, k = 1, 2, to be independent identically distributed sequences. Assumption 2 is standard in the identification literature.
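
As a concrete illustration (not part of the original development), the following Python sketch simulates the measurement model (1)-(2) under Assumption 1, using an i.i.d. QPSK source, which is a special case of a martingale difference sequence, and circular complex Gaussian noise. The channel coefficients, the channel order L = 2 and the noise level are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative channel coefficients for B(q^-1) and C(q^-1), order L = 2.
b = np.array([1.0 + 0.2j, 0.5 - 0.1j, 0.2 + 0.3j])
c = np.array([0.8 - 0.3j, -0.4 + 0.1j, 0.3 + 0.2j])

N = 5000          # number of symbols
sigma_w = 0.1     # receiver noise standard deviation (illustrative)

# i.i.d. QPSK symbols: zero mean, E{|s(n)|^2} = 1, hence a complex
# martingale difference sequence satisfying Assumption 1(i).
s = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)

# Circular complex white noise with E{|wk(n)|^2} = sigma_w^2 (Assumption 1(ii)),
# independent of s and of each other (Assumption 1(iii)).
w1 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
w2 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Noisy two-channel measurements of eq (1); all signals are zero for n < 0.
y1 = np.convolve(b, s)[:N] + w1
y2 = np.convolve(c, s)[:N] + w2
```
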
Observe that from Assumption 1(i) we obtain

E{s(n) s(n+l)*} = E{ E[s(n) s(n+l)* | Fn] }
                = E{ s(n) E[s(n+l)* | Fn] } = 0,   l ≥ 1        . . . . . . eq(8)

since E{s(n+l)* | Fn} = 0 for l ≥ 1 by the martingale difference property; i.e., the samples of {s(n)} are uncorrelated. A similar conclusion can be drawn for {wk(n)}, k = 1, 2, i.e.,

E{wk(n) wk(m)*} = 0        . . . . . . eq(9)

for all n ≠ m.
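
A quick numerical check of (8), again with an illustrative i.i.d. QPSK source rather than anything prescribed by the paper, shows that the sample correlation at a nonzero lag is close to zero for a long record:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Illustrative i.i.d. QPSK source (zero mean, unit variance), as in Assumption 1(i).
s = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)

# Sample estimate of E{s(n) s(n+l)*} for l = 1; by eq (8) it should vanish.
lag = 1
r_hat = np.mean(s[:-lag] * np.conj(s[lag:]))
print(abs(r_hat))   # O(1/sqrt(N)), i.e. close to zero
```
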
RECURSIVE PARAMETER ESTIMATION

From eq(1), the noise-free channel outputs satisfy the cross relation

C(q^-1) x1(n) = B(q^-1) x2(n)        . . . . . . eq(10)

where

x1(n) = y1(n) - w1(n) = B(q^-1) s(n)
x2(n) = y2(n) - w2(n) = C(q^-1) s(n)        . . . . . . eq(11)

Indeed, both sides of (10) equal C(q^-1) B(q^-1) s(n).

Eq(10) can be written in the regression form

x2(n) = θ^T x(n)        . . . . . . eq(12)

where θ and x(n) are defined as

θ^T = (1/b0) [c0, c1, . . . , cL, 0, . . . , 0, b1, b2, . . . , bL, 0, . . . , 0]        . . . . . . eq(13)
x(n)^T = [x1(n), . . . , x1(n-n1), -x2(n-1), . . . , -x2(n-n2)]        . . . . . . eq(14)

Substituting (12) into (11) gives

y2(n) = θ^T x(n) + w2(n)        . . . . . . eq(15)
Now we can obtain θ̂, an estimate of θ, by minimizing the following cost function:

J = E{e(n) e(n)*},   e(n) = y2(n) - θ^T x(n)        . . . . . . eq(16)

Differentiation with respect to θ yields

E{x(n)* y2(n)} = E{x(n)* x(n)^T} θ        . . . . . . eq(17)

Since x(n) is not measurable (the noise-free signals are not available at the receiver), it is replaced with

x(n) = φ(n) - w(n)        . . . . . . eq(18)

where φ(n) is the regressor formed from the noisy measurements, φ(n)^T = [y1(n), . . . , y1(n-n1), -y2(n-1), . . . , -y2(n-n2)], and

w(n)^T = [w1(n), . . . , w1(n-n1), -w2(n-1), . . . , -w2(n-n2)]        . . . . . . eq(19)

Using (18),

E{x(n)* y2(n)} = E{φ(n)* y2(n)} - E{w(n)* y2(n)}
              = E{φ(n)* y2(n)}        . . . . . . eq(20)

where, by Assumption 1, (9) and (15), we have used the fact that

E{w(n)* y2(n)} = E{w(n)* (θ^T x(n) + w2(n))} = 0.

By a similar argument, we also get

E{x(n)* x(n)^T} = E{φ(n)* φ(n)^T} - σ_w^2 I        . . . . . . eq(21)

Thus, substituting (20) and (21) into (17), we obtain

E{φ(n)* y2(n)} = ( E{φ(n)* φ(n)^T} - σ_w^2 I ) θ
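
The compensated normal equations can be checked numerically. The sketch below forms the noisy regressor φ(n), replaces the expectations with sample averages, and compares the bias-compensated solution with the one obtained by ignoring the σ_w^2 I term in (21). All numerical values are illustrative, n1 = n2 = L is assumed, and σ_w^2 is assumed known, as the derivation requires.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative channels (order L = 2) and data generated according to model (1).
b = np.array([1.0 + 0.2j, 0.5 - 0.1j, 0.2 + 0.3j])
c = np.array([0.8 - 0.3j, -0.4 + 0.1j, 0.3 + 0.2j])
L, N, sigma_w = len(b) - 1, 20000, 0.5

s = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)
w1 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
w2 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
y1 = np.convolve(b, s)[:N] + w1
y2 = np.convolve(c, s)[:N] + w2

def phi(n):
    """Noisy regressor phi(n) = x(n) + w(n), built from y1, y2 (n1 = n2 = L)."""
    return np.array([y1[n - i] for i in range(L + 1)]
                    + [-y2[n - i] for i in range(1, L + 1)])

# Sample averages of E{phi(n)* phi(n)^T} and E{phi(n)* y2(n)}.
ns = range(L, N)
R = sum(np.outer(np.conj(phi(n)), phi(n)) for n in ns) / len(ns)
r = sum(np.conj(phi(n)) * y2[n] for n in ns) / len(ns)

theta_true = np.concatenate((c, b[1:])) / b[0]
# Bias-compensated estimate, cf. (20)-(21): subtract sigma_w^2 I before solving.
theta_bc = np.linalg.solve(R - sigma_w**2 * np.eye(2 * L + 1), r)
# Uncompensated estimate: ignores the noise term in (21).
theta_ls = np.linalg.solve(R, r)

print(np.max(np.abs(theta_bc - theta_true)))   # small error
print(np.max(np.abs(theta_ls - theta_true)))   # larger error (noise-induced bias)
```

The uncompensated estimate is shrunk toward zero by the noise contribution to E{φ(n)* φ(n)^T}, which is exactly what the σ_w^2 I correction in (21) removes.
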

Replacing the expectations E{φ(n)* y2(n)} and E{φ(n)* φ(n)^T} with the corresponding sample averages gives

(1/n) Σ_{k=1}^{n} φ(k)* y2(k) = [ (1/n) Σ_{k=1}^{n} φ(k)* φ(k)^T - σ_w^2 I ] θ̂(n)        . . . . . . eq(22)

where θ̂(n) is the estimate of θ based on the observations up to time n. Let

Φ(n) = (1/n) P(n)^{-1} - σ_w^2 I        . . . . . . eq(23)

P(n)^{-1} = Σ_{k=1}^{n} φ(k)* φ(k)^T        . . . . . . eq(24)

Then from (22) it follows that

(1/n) Σ_{k=1}^{n} φ(k)* y2(k) = Φ(n) θ̂(n)

which can be written in the form

Φ(n) θ̂(n) = ((n-1)/n) Φ(n-1) θ̂(n-1) + (1/n) φ(n)* y2(n)        . . . . . . eq(25)
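
A minimal recursive counterpart of (22)-(25) is sketched below, under the same illustrative assumptions as the earlier sketches: the two sample sums appearing in (22) are accumulated one sample at a time, in the spirit of (24)-(25), and θ̂(n) is obtained by solving the compensated system (here only once, at the final time, for brevity; in practice the estimate would be updated at every step).

```python
import numpy as np

rng = np.random.default_rng(3)

# Same illustrative setup as in the earlier sketches.
b = np.array([1.0 + 0.2j, 0.5 - 0.1j, 0.2 + 0.3j])
c = np.array([0.8 - 0.3j, -0.4 + 0.1j, 0.3 + 0.2j])
L, N, sigma_w = len(b) - 1, 20000, 0.5
s = (rng.choice([1, -1], N) + 1j * rng.choice([1, -1], N)) / np.sqrt(2)
w1 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
w2 = sigma_w * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
y1 = np.convolve(b, s)[:N] + w1
y2 = np.convolve(c, s)[:N] + w2

d = 2 * L + 1                       # dimension of theta and phi(n)
Pinv = np.zeros((d, d), complex)    # running sum of phi(k)* phi(k)^T, cf. eq (24)
g = np.zeros(d, complex)            # running sum of phi(k)* y2(k)

for n in range(L, N):
    phi_n = np.array([y1[n - i] for i in range(L + 1)]
                     + [-y2[n - i] for i in range(1, L + 1)])
    # Rank-one updates of the two sums appearing in eq (22)/(25).
    Pinv += np.outer(np.conj(phi_n), phi_n)
    g += np.conj(phi_n) * y2[n]

k = N - L                           # number of samples accumulated
# Solve ((1/k) P(k)^{-1} - sigma_w^2 I) theta_hat = (1/k) sum phi* y2, cf. eq (22).
theta_hat = np.linalg.solve(Pinv / k - sigma_w**2 * np.eye(d), g / k)

theta_true = np.concatenate((c, b[1:])) / b[0]
print(np.max(np.abs(theta_hat - theta_true)))   # small estimation error
```
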
