RANDOM PROCESSES

7.1 INTRODUCTION

Mathematical models for random message signals and noise encountered in communication systems are developed in this chapter. Random signals cannot be explicitly described prior to their occurrence, and noises cannot be described by deterministic functions of time. However, when observed over a long period, a random signal or noise may exhibit certain regularities that can be described in terms of probabilities and statistical averages. Such a model, in the form of a probabilistic description of a collection of functions of time, is called a random process.

7.2 DEFINITIONS AND NOTATIONS OF RANDOM PROCESSES

Consider a random experiment with outcomes λ and a sample space S. If to every outcome λ ∈ S we assign a real-valued time function X(t, λ), we create a random (or stochastic) process. A random process X(t, λ) is therefore a function of two parameters, the time t and the outcome λ. For a specific λ, say λ_i, we have a single time function X(t, λ_i) = x_i(t). This time function is called a sample function. The totality of all sample functions is called an ensemble. For a specific time t_j, X(t_j, λ) = X_j denotes a random variable; for fixed t (= t_j) and fixed λ (= λ_i), X(t_j, λ_i) = x_i(t_j) is a number.

Thus, a random process is sometimes defined as a family of random variables indexed by the parameter t ∈ T, where T is called the index set.

Figure 7-1 illustrates the concepts of the sample space of the random experiment, outcomes of the experiment, associated sample functions, and random variables resulting from taking two measurements of the sample functions.

In the following we use the notation X(t) to represent X(t, λ).

7.3 STATISTICS OF RANDOM PROCESSES

A. Probabilistic Expressions:

Consider a random process X(t). For a particular time t_1, X(t_1) = X_1 is a random variable, and its distribution function F_X(x_1; t_1) is defined as

    F_X(x_1; t_1) = P{X(t_1) ≤ x_1}

C. Band-Limited White Noise:

A WSS process X(t) is called band-limited white noise if its power spectral density S_XX(ω) is constant, equal to η/2, over a finite band |ω| ≤ ω_B and zero elsewhere. Then

    R_XX(τ) = (η ω_B / 2π) (sin ω_B τ)/(ω_B τ)    (7.65)

S_XX(ω) and R_XX(τ) of band-limited white noise are shown in Fig. 7-4. Note that the term white or band-limited white refers to the spectral shape of the process X(t) only; these terms do not imply that the distribution associated with X(t) is gaussian.

Fig. 7-4  Band-limited white noise

D. Narrowband Random Process:

Suppose that X(t) is a WSS process with zero mean and its power spectral density S_XX(ω) is nonzero only in some narrow frequency band of width 2W that is very small compared to a center frequency ω_c, as shown in Fig. 7-5. Then the process X(t) is called a narrowband random process.

In many communication systems, a narrowband process (or noise) is produced when white noise (or broadband noise) is passed through a narrowband linear filter. When a sample function of the narrowband process is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase. For this reason, the narrowband noise X(t) is conveniently represented by the expression

    X(t) = V(t) cos [ω_c t + φ(t)]    (7.66)

Fig. 7-5  Narrowband random process

where V(t) and φ(t) are random processes that are called the envelope function and phase function, respectively.

Equation (7.66) can be rewritten as

    X(t) = V(t) cos φ(t) cos ω_c t − V(t) sin φ(t) sin ω_c t
         = X_c(t) cos ω_c t − X_s(t) sin ω_c t    (7.67)

where

    X_c(t) = V(t) cos φ(t)    (in-phase component)    (7.68a)
    X_s(t) = V(t) sin φ(t)    (quadrature component)    (7.68b)
    V(t) = sqrt[X_c²(t) + X_s²(t)]    (7.69a)
    φ(t) = tan⁻¹ [X_s(t)/X_c(t)]    (7.69b)

Equation (7.67) is called the quadrature representation of X(t). Given X(t), its in-phase component X_c(t) and quadrature component X_s(t) can be obtained by using the arrangement shown in Fig. 7-6, in which X(t) is multiplied by 2 cos ω_c t and by −2 sin ω_c t and each product is passed through a low-pass filter.

Fig. 7-6
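The arrangement of Fig. 7-6 is straightforward to simulate. The following is a minimal sketch in Python (assuming NumPy and SciPy are available; the sampling rate, center frequency, bandwidth, filter orders, and seed are illustrative assumptions, not values from the text). It generates a narrowband noise by band-pass filtering white gaussian noise, recovers X_c(t) and X_s(t) by mixing with 2 cos ω_c t and −2 sin ω_c t followed by low-pass filtering, and checks properties 2 and 3 listed below.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 10_000.0                        # sampling rate, Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)   # 10 s of samples
fc = 1_000.0                         # center (carrier) frequency, Hz (assumed)
W = 50.0                             # half-bandwidth, Hz, so the band is 2W wide
rng = np.random.default_rng(0)

# Narrowband noise: white gaussian noise passed through a narrow band-pass filter.
b_bp, a_bp = butter(4, [(fc - W) / (fs / 2), (fc + W) / (fs / 2)], btype="band")
x = filtfilt(b_bp, a_bp, rng.standard_normal(t.size))

# Quadrature demodulation (Fig. 7-6): multiply by 2cos and -2sin, then low-pass filter.
b_lp, a_lp = butter(4, (2 * W) / (fs / 2))
xc = filtfilt(b_lp, a_lp, x * 2.0 * np.cos(2 * np.pi * fc * t))    # in-phase X_c(t)
xs = filtfilt(b_lp, a_lp, x * -2.0 * np.sin(2 * np.pi * fc * t))   # quadrature X_s(t)

# Properties 2 and 3 below: variances roughly equal, cross-correlation near zero.
print(np.var(x), np.var(xc), np.var(xs))
print(np.mean(xc * xs))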
Properties of X_c(t) and X_s(t):

1. X_c(t) and X_s(t) have identical power spectral densities, related to that of X(t) by

    S_{XcXc}(ω) = S_{XsXs}(ω) = S_XX(ω − ω_c) + S_XX(ω + ω_c)  for |ω| ≤ W, and 0 otherwise    (7.70)

2. X_c(t) and X_s(t) have the same means and variances as X(t):

    μ_{Xc} = μ_{Xs} = μ_X = 0    (7.71)
    σ²_{Xc} = σ²_{Xs} = σ²_X    (7.72)

3. X_c(t) and X_s(t) are uncorrelated with each other:

    E[X_c(t)X_s(t)] = 0    (7.73)

4. If X(t) is gaussian, then X_c(t) and X_s(t) are also gaussian.

5. If X(t) is gaussian, then for fixed but arbitrary t, V(t) is a random variable with Rayleigh distribution and φ(t) is a random variable uniformly distributed over [0, 2π]. (See Prob. 6.39.)

Solved Problems

RANDOM PROCESSES AND STATISTICS OF RANDOM PROCESSES

7.1. Consider a random process X(t) given by

    X(t) = A cos (ωt + Θ)    (7.74)

where A and ω are constants and Θ is a uniform random variable over [−π, π]. Show that X(t) is WSS.

From Eq. (6.105) (Prob. 6.23), we have

    f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 otherwise

Thus,

    μ_X(t) = E[X(t)] = ∫_{−π}^{π} A cos (ωt + θ) (1/2π) dθ = 0    (7.75)

    R_XX(t, t + τ) = E[X(t)X(t + τ)]
    = ∫_{−π}^{π} A² cos (ωt + θ) cos [ω(t + τ) + θ] (1/2π) dθ
    = (A²/2)(1/2π) ∫_{−π}^{π} [cos ωτ + cos (2ωt + ωτ + 2θ)] dθ
    = (A²/2) cos ωτ    (7.76)

Since the mean of X(t) is a constant and the autocorrelation of X(t) is a function of the time difference only, we conclude that X(t) is WSS.

Note that R_XX(τ) is periodic with the period T_0 = 2π/ω. A WSS random process is called periodic if its autocorrelation is periodic.

7.2. Consider a random process X(t) given by

    X(t) = A cos (ωt + θ)    (7.77)

where ω and θ are constants and A is a random variable. Determine whether X(t) is WSS.

    E[X(t)] = E[A cos (ωt + θ)] = cos (ωt + θ) E[A]    (7.78)

which indicates that the mean of X(t) is not constant unless E[A] = 0.

    R_XX(t, t + τ) = E[X(t)X(t + τ)]
    = E{A² cos (ωt + θ) cos [ω(t + τ) + θ]}
    = (1/2)[cos ωτ + cos (2ωt + 2θ + ωτ)] E[A²]    (7.79)

Thus, we see that the autocorrelation of X(t) is not a function of the time difference τ only, and the process X(t) is not WSS.

7.3. Consider a random process Y(t) defined by

    Y(t) = ∫_0^t X(τ) dτ    (7.80)

where X(t) is given by

    X(t) = A cos ωt    (7.81)

where ω is constant and A = N(0; σ²).

(a) Determine the pdf of Y(t) at t = t_k.
(b) Is Y(t) WSS?

(a)

    Y(t_k) = ∫_0^{t_k} A cos ωτ dτ = (sin ωt_k / ω) A    (7.82)

Then from the result of Prob. 6.32, we see that Y(t_k) is a gaussian random variable with

    E[Y(t_k)] = (sin ωt_k / ω) E[A] = 0    (7.83)

and

    σ²_{Y(t_k)} = (sin ωt_k / ω)² σ²_A = (sin² ωt_k / ω²) σ²    (7.84)

Hence, by Eq. (6.91), the pdf of Y(t_k) is

    f_{Y(t_k)}(y) = [1/(√(2π) σ_{Y(t_k)})] e^{−y²/(2σ²_{Y(t_k)})}    (7.85)

(b) From Eqs. (7.83) and (7.84), the mean and variance of Y(t) depend on time t_k, so Y(t) is not WSS.

7.4. Consider a random process X(t) given by

    X(t) = A cos ωt + B sin ωt    (7.86)

where ω is constant and A and B are random variables.

(a) Show that the condition

    E[A] = E[B] = 0    (7.87)

is necessary for X(t) to be stationary.

(b) Show that X(t) is WSS if and only if the random variables A and B are uncorrelated with equal variance, that is,

    E[AB] = 0    (7.88)

and

    E[A²] = E[B²] = σ²    (7.89)

(a)

    μ_X(t) = E[X(t)] = E[A] cos ωt + E[B] sin ωt

must be independent of t for X(t) to be stationary. This is possible only if μ_X(t) = 0, that is,

    E[A] = E[B] = 0

(b) If X(t) is WSS, then from Eq. (7.17)

    E[X²(0)] = E[X²(π/2ω)] = R_XX(0) = σ²_X

but

    X(0) = A  and  X(π/2ω) = B

Thus,

    E[A²] = E[B²] = σ²_X = σ²

Using the preceding result, we obtain

    R_XX(t, t + τ) = E[X(t)X(t + τ)]
    = E{[A cos ωt + B sin ωt][A cos ω(t + τ) + B sin ω(t + τ)]}
    = σ² cos ωτ + E[AB] sin (2ωt + ωτ)    (7.90)

which will be a function of τ only if E[AB] = 0.

Conversely, if E[AB] = 0 and E[A²] = E[B²] = σ², then from the result of part (a) and Eq. (7.90), we have

    μ_X(t) = 0    R_XX(t, t + τ) = σ² cos ωτ = R_XX(τ)

Hence, X(t) is WSS.
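The ensemble averages computed in Probs. 7.1 to 7.4 can be checked by simulation. The following is a minimal Monte Carlo sketch (assuming NumPy; the amplitude, frequency, time, lag, ensemble size, and seed are illustrative assumptions) for the random-phase sinusoid of Prob. 7.1; the sample averages should land close to Eqs. (7.75) and (7.76).

import numpy as np

rng = np.random.default_rng(1)
A, w = 2.0, 2.0 * np.pi                  # amplitude and angular frequency (assumed)
n = 200_000                              # ensemble size
theta = rng.uniform(-np.pi, np.pi, n)    # Theta uniform over [-pi, pi]

t, tau = 0.7, 0.3                        # arbitrary time and lag
x_t = A * np.cos(w * t + theta)
x_t_tau = A * np.cos(w * (t + tau) + theta)

print(np.mean(x_t))                      # ~0, Eq. (7.75)
print(np.mean(x_t * x_t_tau))            # sample autocorrelation at lag tau
print(A**2 / 2 * np.cos(w * tau))        # (A^2/2) cos(w tau), Eq. (7.76)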
7.5. A random process X(t) is said to be covariance-stationary if the covariance of X(t) depends only on the time difference τ = t_2 − t_1, that is,

    C_XX(t, t + τ) = C_XX(τ)    (7.91)

Let X(t) be given by

    X(t) = (A + 1) cos t + B sin t

where A and B are independent random variables for which

    E[A] = E[B] = 0  and  E[A²] = E[B²] = 1

Show that X(t) is not WSS, but it is covariance-stationary.

    E[X(t)] = E[(A + 1) cos t + B sin t] = E[A + 1] cos t + E[B] sin t = cos t

which depends on t. Thus, X(t) cannot be WSS.

    R_XX(t_1, t_2) = E[X(t_1)X(t_2)]
    = E{[(A + 1) cos t_1 + B sin t_1][(A + 1) cos t_2 + B sin t_2]}
    = E[(A + 1)²] cos t_1 cos t_2 + E[B²] sin t_1 sin t_2 + E[(A + 1)B](cos t_1 sin t_2 + sin t_1 cos t_2)

Now

    E[(A + 1)²] = E[A² + 2A + 1] = E[A²] + 2E[A] + 1 = 2
    E[(A + 1)B] = E[AB] + E[B] = E[A]E[B] + E[B] = 0
    E[B²] = 1

Substituting these values into the expression of R_XX(t_1, t_2), we obtain

    R_XX(t_1, t_2) = 2 cos t_1 cos t_2 + sin t_1 sin t_2
    = cos (t_2 − t_1) + cos t_1 cos t_2

From Eq. (7.9), we have

    C_XX(t_1, t_2) = R_XX(t_1, t_2) − μ_X(t_1)μ_X(t_2)
    = cos (t_2 − t_1) + cos t_1 cos t_2 − cos t_1 cos t_2 = cos (t_2 − t_1)

Thus, X(t) is covariance-stationary.

7.6. Show that the process X(t) defined in Eq. (7.74) (Prob. 7.1) is ergodic in both the mean and the autocorrelation.

From Eq. (7.20), we have

    x̄ = ⟨X(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos (ωt + Θ) dt
    = (1/T_0) ∫_{−T_0/2}^{T_0/2} A cos (ωt + Θ) dt = 0    (7.92)

where T_0 = 2π/ω.

From Eq. (7.21), we have

    R̄_XX(τ) = ⟨X(t)X(t + τ)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A² cos (ωt + Θ) cos [ω(t + τ) + Θ] dt
    = (1/T_0) ∫_{−T_0/2}^{T_0/2} A² cos (ωt + Θ) cos [ω(t + τ) + Θ] dt
    = (A²/2) cos ωτ    (7.93)

Thus, we have

    μ_X = E[X(t)] = 0 = ⟨X(t)⟩
    R_XX(τ) = E[X(t)X(t + τ)] = (A²/2) cos ωτ = ⟨X(t)X(t + τ)⟩

Hence, by definitions (7.24) and (7.25), we conclude that X(t) is ergodic in both the mean and the autocorrelation.

CORRELATIONS AND POWER SPECTRA

7.7. Show that if X(t) is WSS, then

    E{[X(t + τ) − X(t)]²} = 2[R_XX(0) − R_XX(τ)]    (7.94)

where R_XX(τ) is the autocorrelation of X(t).

Using the linearity of E (the expectation operator) and Eqs. (7.15) and (7.17), we have

    E{[X(t + τ) − X(t)]²} = E[X²(t + τ) − 2X(t + τ)X(t) + X²(t)]
    = E[X²(t + τ)] − 2E[X(t + τ)X(t)] + E[X²(t)]
    = R_XX(0) − 2R_XX(τ) + R_XX(0)
    = 2[R_XX(0) − R_XX(τ)]

7.8. Let X(t) be a WSS random process. Verify Eqs. (7.26) and (7.27), that is,

    (a) R_XX(−τ) = R_XX(τ)
    (b) |R_XX(τ)| ≤ R_XX(0)

(a) From Eq. (7.15),

    R_XX(τ) = E[X(t)X(t + τ)]

Setting t + τ = t′, we have

    R_XX(τ) = E[X(t′ − τ)X(t′)] = E[X(t′)X(t′ − τ)] = R_XX(−τ)

(b)

    E{[X(t) ± X(t + τ)]²} ≥ 0

or

    E[X²(t)] ± 2E[X(t)X(t + τ)] + E[X²(t + τ)] ≥ 0

or

    2R_XX(0) ± 2R_XX(τ) ≥ 0

Hence,

    R_XX(0) ≥ |R_XX(τ)|

7.9. Show that the power spectrum of a (real) random process X(t) is real, and verify Eq. (7.39), that is, S_XX(−ω) = S_XX(ω).

From Eq. (7.36) and by expanding the exponential, we have

    S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ
    = ∫_{−∞}^{∞} R_XX(τ)(cos ωτ − j sin ωτ) dτ
    = ∫_{−∞}^{∞} R_XX(τ) cos ωτ dτ − j ∫_{−∞}^{∞} R_XX(τ) sin ωτ dτ    (7.95)

Since R_XX(−τ) = R_XX(τ) [Eq. (7.26)], the imaginary term in Eq. (7.95) vanishes and we obtain

    S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) cos ωτ dτ    (7.96)

which indicates that S_XX(ω) is real.

Since the cosine is an even function of its arguments, that is, cos (−ωτ) = cos ωτ, it follows that

    S_XX(−ω) = S_XX(ω)

which indicates that the power spectrum of X(t) is an even function of frequency.
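The conclusions of Prob. 7.9 can also be seen numerically. The sketch below (assuming NumPy; the autocorrelation R(τ) = e^{−2|τ|}, whose Fourier transform is 4/(ω² + 4), is used only as a convenient example) approximates S_XX(ω) by a Riemann sum and confirms that the resulting spectrum is real, even, and matches the known transform.

import numpy as np

dtau = 0.01
tau = np.arange(-2048, 2049) * dtau            # symmetric grid around tau = 0
R = np.exp(-2.0 * np.abs(tau))                 # even autocorrelation: R(-tau) = R(tau)

w = np.linspace(-20.0, 20.0, 401)              # frequencies, rad/s
# S(w) = integral of R(tau) exp(-j w tau) dtau, approximated by a Riemann sum
S = np.array([np.sum(R * np.exp(-1j * wk * tau)) * dtau for wk in w])

print(np.max(np.abs(S.imag)))                  # ~0: the spectrum is real
print(np.allclose(S.real, S.real[::-1]))       # True: S(-w) = S(w)
print(np.allclose(S.real, 4.0 / (w**2 + 4.0), atol=1e-3))   # matches 4/(w^2 + 4)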
7.10. A class of modulated random signals Y(t) is defined by

    Y(t) = AX(t) cos (ω_c t + Θ)    (7.97)

where X(t) is the random message signal and A cos (ω_c t + Θ) is the carrier. The random message signal X(t) is a zero-mean stationary random process with autocorrelation R_XX(τ) and power spectrum S_XX(ω). The carrier amplitude A and the frequency ω_c are constants, and the phase Θ is a random variable uniformly distributed over [0, 2π]. Assuming that X(t) and Θ are independent, find the mean, autocorrelation, and power spectrum of Y(t).

    μ_Y(t) = E[Y(t)] = E[AX(t) cos (ω_c t + Θ)] = A E[X(t)] E[cos (ω_c t + Θ)] = 0

since X(t) and Θ are independent and E[X(t)] = 0.

    R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
    = E{A² X(t)X(t + τ) cos (ω_c t + Θ) cos [ω_c(t + τ) + Θ]}
    = (A²/2) E[X(t)X(t + τ)] E[cos ω_c τ + cos (2ω_c t + ω_c τ + 2Θ)]
    = (A²/2) R_XX(τ) cos ω_c τ = R_YY(τ)    (7.98)

Since the mean of Y(t) is a constant and the autocorrelation of Y(t) depends only on the time difference τ, Y(t) is WSS. Thus,

    S_YY(ω) = F[R_YY(τ)] = (A²/2) F[R_XX(τ) cos ω_c τ]

By Eqs. (7.36) and (1.76),

    F[R_XX(τ)] = S_XX(ω)
    F[cos ω_c τ] = π δ(ω − ω_c) + π δ(ω + ω_c)

Then, using the frequency convolution theorem (1.29) and Eq. (1.36), we obtain

    S_YY(ω) = (A²/2)(1/2π) S_XX(ω) * [π δ(ω − ω_c) + π δ(ω + ω_c)]
    = (A²/4)[S_XX(ω − ω_c) + S_XX(ω + ω_c)]    (7.99)

7.11. Consider a random process X(t) that assumes the values ±A with equal probability. A typical sample function of X(t) is shown in Fig. 7-7.

Fig. 7-7  Telegraph signal

The average number of polarity switches (zero crossings) per unit time is α. The probability of having exactly k crossings in time τ is given by the Poisson distribution [Eq. (6.88)]

    P(Z = k) = [(ατ)^k / k!] e^{−ατ}    (7.100)

where Z is the random variable representing the number of zero crossings. The process X(t) is known as the telegraph signal. Find the autocorrelation and the power spectrum of X(t).

If τ is any positive time interval, then

    R_XX(t, t + τ) = E[X(t)X(t + τ)]
    = A² P[X(t) and X(t + τ) have same signs] + (−A²) P[X(t) and X(t + τ) have different signs]
    = A² P[Z even in (t, t + τ)] − A² P[Z odd in (t, t + τ)]
    = A² Σ_{k even} [(ατ)^k / k!] e^{−ατ} − A² Σ_{k odd} [(ατ)^k / k!] e^{−ατ}
    = A² e^{−ατ} Σ_{k=0}^{∞} (−ατ)^k / k!
    = A² e^{−ατ} e^{−ατ} = A² e^{−2ατ}    (7.101)

which indicates that the autocorrelation depends only on the time difference τ. By Eq. (7.26), the complete solution that includes τ < 0 is given by

    R_XX(τ) = A² e^{−2α|τ|}    (7.102)

which is sketched in Fig. 7-8(a).

Taking the Fourier transform of both sides of Eq. (7.102), we see that the power spectrum of X(t) is [Eq. (1.55)]

    S_XX(ω) = A² [4α/(ω² + 4α²)]    (7.103)

which is sketched in Fig. 7-8(b).

Fig. 7-8

7.12. Consider a random binary process X(t) consisting of a random sequence of binary symbols 1 and 0. A typical sample function of X(t) is shown in Fig. 7-9. It is assumed that

1. The symbols 1 and 0 are represented by pulses of amplitude +A and −A V, respectively, and duration T_b s.
2. The two symbols 1 and 0 are equally likely, and the presence of a 1 or 0 in any one interval is independent of the presence in all other intervals.
3. The pulse sequence is not synchronized, so that the starting time t_d of the first pulse after t = 0 is equally likely to be anywhere between 0 and T_b. That is, t_d is the sample value of a random variable T_d uniformly distributed over [0, T_b].

Fig. 7-9  Random binary signal

Find the autocorrelation and power spectrum of X(t).

The random binary process X(t) can be represented by

    X(t) = Σ_k A_k p(t − kT_b − T_d)    (7.104)

where {A_k} is a sequence of independent random variables with P[A_k = A] = P[A_k = −A] = 1/2, p(t) is a unit amplitude pulse of duration T_b, and T_d is a random variable uniformly distributed over [0, T_b].

    μ_X(t) = E[X(t)] = E[A_k] = A(1/2) + (−A)(1/2) = 0    (7.105)

Let t_2 > t_1. When t_2 − t_1 > T_b, then t_1 and t_2 fall in different pulse intervals [Fig. 7-10(a)] and the random variables X(t_1) and X(t_2) are therefore independent. We thus have

    E[X(t_1)X(t_2)] = E[X(t_1)]E[X(t_2)] = 0    (7.106)

When t_2 − t_1 < T_b, t_1 and t_2 may or may not fall in the same pulse interval, depending on the value of T_d.

From Fig. 7-20(a),

    E[X²(t)] = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω = η(2W)

From Fig. 7-20(b),

    E[X_c²(t)] = E[X_s²(t)] = (1/2π) ∫_{−∞}^{∞} S_{XcXc}(ω) dω = η(2W)

Hence,

    E[X²(t)] = E[X_c²(t)] = E[X_s²(t)] = η(2W)    (7.161)
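The telegraph-signal result of Prob. 7.11 lends itself to a quick simulation check. A minimal sketch (assuming NumPy; the amplitude, switching rate α, time step, record length, lag, and seed are illustrative assumptions): the number of crossings in each small step is drawn from a Poisson distribution, the sign flips when that number is odd, and the sample autocorrelation is compared with A² e^{−2ατ} from Eq. (7.101).

import numpy as np

rng = np.random.default_rng(2)
A, alpha = 1.0, 5.0          # amplitude and average zero-crossing rate (assumed)
dt, n = 1e-3, 200_000        # time step and number of samples (assumed)

flips = rng.poisson(alpha * dt, size=n) % 2        # odd number of crossings flips the sign
sign0 = 1.0 if rng.random() < 0.5 else -1.0        # random initial polarity
x = A * sign0 * np.cumprod(1 - 2 * flips)          # telegraph sample function

tau = 0.1
lag = int(tau / dt)
print(np.mean(x[:-lag] * x[lag:]))                 # sample estimate of R_XX(tau)
print(A**2 * np.exp(-2.0 * alpha * tau))           # A^2 e^{-2 alpha tau}, Eq. (7.101)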
Supplementary Problems

7.36. Consider a random process X(t) defined by

    X(t) = cos Ωt

where Ω is a random variable uniformly distributed over [0, ω_0]. Determine whether X(t) is stationary.

Ans. Nonstationary.
Hint: Examine specific sample functions of X(t) for different frequencies, say, Ω = π/2, π, and 2π.

7.37. Consider the random process X(t) defined by

    X(t) = A cos ωt

where ω is a constant and A is a random variable uniformly distributed over [0, 1]. Find the autocorrelation and autocovariance of X(t).

Ans. R_XX(t_1, t_2) = (1/3) cos ωt_1 cos ωt_2,  C_XX(t_1, t_2) = (1/12) cos ωt_1 cos ωt_2

7.38. Let X(t) be a WSS random process with autocorrelation

    R_XX(τ) = A e^{−a|τ|}

Find the second moment of the random variable Y = X(5) − X(2).

Ans. 2A(1 − e^{−3a})

7.39. Let X(t) be a zero-mean WSS random process with autocorrelation R_XX(τ). A random variable Y is formed by integrating X(t):

    Y = (1/T) ∫_0^T X(t) dt

Find the mean and variance of Y.

Ans. μ_Y = 0,  σ²_Y = (1/T) ∫_{−T}^{T} (1 − |τ|/T) R_XX(τ) dτ

7.40. A sample function of a random telegraph signal X(t) is shown in Fig. 7-21. This signal makes independent random shifts between two equally likely values, A and 0. The number of shifts per unit time is governed by the Poisson distribution with parameter α.

Fig. 7-21

(a) Find the autocorrelation and the power spectrum of X(t).
(b) Find the rms value of X(t).

Ans. (a) R_XX(τ) = (A²/4)(1 + e^{−2α|τ|}),  S_XX(ω) = (πA²/2) δ(ω) + A²α/(ω² + 4α²)
     (b) A/√2

7.41. Suppose that X(t) is a gaussian process with

    μ_X = 2,  R_XX(τ) = 5 e^{−0.2|τ|}

Find the probability that X(4) ≤ 1.

Ans. 0.159

7.42. The output of a filter is given by

    Y(t) = X(t + T) − X(t − T)

where X(t) is a WSS process with power spectrum S_XX(ω) and T is a constant. Find the power spectrum of Y(t).

Ans. S_YY(ω) = 4 sin² (ωT) S_XX(ω)

7.43. Let X̂(t) be the Hilbert transform of a WSS process X(t). Show that

    R_{X̂X}(0) = E[X̂(t)X(t)] = 0

Hint: Use relation (b) of Prob. 7.20 and definition (2.26).

7.44. When a metallic resistor R is at temperature T, random electron motion produces a noise voltage V(t) at the open-circuited terminals. This voltage V(t) is known as the thermal noise. Its power spectrum S_VV(ω) is practically constant for f ≤ 10¹² Hz and is given by

    S_VV(ω) = 2kTR

where  k = Boltzmann constant = 1.37 × 10⁻²³ joules per kelvin (J/K)
       T = absolute temperature, kelvins (K)
       R = resistance, ohms (Ω)

Calculate the thermal noise voltage (rms value) across the simple RC circuit shown in Fig. 7-22, with R = 1 kilohm (kΩ), C = 1 microfarad (μF), at T = 27°C.

Fig. 7-22
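For Prob. 7.44, assuming the noisy resistor drives the capacitor and the output is taken across C (the usual reading of Fig. 7-22), the mean-square noise voltage follows from integrating the output spectrum S_VV(ω)|H(ω)|² = 2kTR/[1 + (ωRC)²] over all ω and dividing by 2π, which gives the standard closed form kT/C. The sketch below (assuming NumPy; the integration grid limits and step are arbitrary choices) evaluates both and should print the same rms value, on the order of 6 × 10⁻⁸ V for the stated R, C, and T.

import numpy as np

k = 1.37e-23        # Boltzmann constant as used in the text, J/K
T = 27.0 + 273.0    # 27 degrees C in kelvins
R = 1e3             # 1 kilohm
C = 1e-6            # 1 microfarad

dw = 10.0
w = np.arange(-1e7, 1e7, dw)                        # frequency grid, rad/s (assumed limits)
S_out = 2.0 * k * T * R / (1.0 + (w * R * C) ** 2)  # spectrum across the capacitor
mean_square = np.sum(S_out) * dw / (2.0 * np.pi)    # E[V^2] by numerical integration

print(np.sqrt(mean_square))     # rms voltage from the integral
print(np.sqrt(k * T / C))       # rms voltage from the closed form kT/C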
