Probability & Random Process Formulas

UNIT-I (RANDOM VARIABLES)

1) Discrete random variable:


A random variable whose set of possible values is either finite or countably
infinite is called discrete random variable.
Eg: (i) Let X represent the sum of the numbers on two dice when they are thrown. Here X takes the values 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 and 12, so X is a discrete random variable.
(ii) The number of transmitted bits received in error, or the number of scratches on a surface.
2) Continuous random variable:
A random variable X is said to be continuous if it can take all possible values between certain limits.
Eg: The length of time during which a vacuum tube installed in a circuit functions is a continuous random variable.
3) Discrete versus continuous random variables:

Sl.No. | Discrete random variable | Continuous random variable
1 | \sum_i p(x_i) = 1 | \int_{-\infty}^{\infty} f(x)\,dx = 1
2 | F(x) = P(X \le x) | F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt
3 | Mean = E[X] = \sum_i x_i p(x_i) | Mean = E[X] = \int_{-\infty}^{\infty} x f(x)\,dx
4 | E[X^2] = \sum_i x_i^2 p(x_i) | E[X^2] = \int_{-\infty}^{\infty} x^2 f(x)\,dx
5 | Var(X) = E[X^2] - (E[X])^2 | Var(X) = E[X^2] - (E[X])^2
6 | r-th moment = E[X^r] = \sum_i x_i^r p(x_i) | r-th moment = E[X^r] = \int_{-\infty}^{\infty} x^r f(x)\,dx
7 | M.G.F.: M_X(t) = E[e^{tX}] = \sum_x e^{tx} p(x) | M.G.F.: M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx

4) E aX  b   aE  X   b
5) Var  aX  b   a 2 Var  X 
6) Var aX  bY   a2 Var  X   b2Var Y 
7) Standard Deviation  Var  X 
8) f ( x)  F ( x)
9) p( X  a)  1  p( X  a)
p  A ∩ B
10) p  A / B   , p B  0
p  B 
11) If A and B are independent, then p  A ∩ B   p  A  p  B  .

12) 1st moment about the origin = E[X] = \left[\frac{d}{dt} M_X(t)\right]_{t=0} (Mean)
2nd moment about the origin = E[X^2] = \left[\frac{d^2}{dt^2} M_X(t)\right]_{t=0}
The coefficient of \frac{t^r}{r!} in M_X(t) = E[X^r] (the r-th moment about the origin)

13) Limitations of the M.G.F.:


i) A random variable X may have no moments although its m.g.f exists.
ii) A random variable X can have its m.g.f and some or all moments, yet the
m.g.f does not generate the moments.
iii) A random variable X can have all or some moments, but m.g.f does not
exist except perhaps at one point.
14) Properties of M.G.F.:
i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
ii) M_{cX}(t) = M_X(ct), where c is a constant.
iii) If X and Y are two independent random variables, then M_{X+Y}(t) = M_X(t) \cdot M_Y(t).
15) P.D.F., M.G.F., mean and variance of the standard distributions:

Sl.No. | Distribution | P.D.F. (P(X = x)) | M.G.F. | Mean | Variance
1 | Binomial | \binom{n}{x} p^x q^{n-x} | (q + pe^t)^n | np | npq
2 | Poisson | \frac{e^{-\lambda} \lambda^x}{x!} | e^{\lambda(e^t - 1)} | \lambda | \lambda
3 | Geometric | q^{x-1} p (or q^x p) | \frac{pe^t}{1 - qe^t} | \frac{1}{p} | \frac{q}{p^2}
4 | Uniform | f(x) = \frac{1}{b-a}, a < x < b; 0 otherwise | \frac{e^{bt} - e^{at}}{(b-a)t} | \frac{a+b}{2} | \frac{(b-a)^2}{12}
5 | Exponential | f(x) = \lambda e^{-\lambda x}, x > 0, \lambda > 0; 0 otherwise | \frac{\lambda}{\lambda - t} | \frac{1}{\lambda} | \frac{1}{\lambda^2}
6 | Gamma | f(x) = \frac{e^{-x} x^{\lambda-1}}{\Gamma(\lambda)}, 0 < x < \infty, \lambda > 0 | \frac{1}{(1-t)^\lambda} | \lambda | \lambda
7 | Normal | f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2} | e^{\mu t + \frac{\sigma^2 t^2}{2}} | \mu | \sigma^2

16) Memoryless property of the exponential distribution:
P(X > s + t / X > s) = P(X > t)
17) Function of a random variable: f_Y(y) = f_X(x) \left|\frac{dx}{dy}\right|
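As an added illustration of formula 17 (not in the original sheet; it assumes NumPy): for Y = X^2 with X ~ Exp(1), the formula gives f_Y(y) = f_X(\sqrt{y})\,|dx/dy| = \frac{e^{-\sqrt{y}}}{2\sqrt{y}}, which a histogram of simulated values should match away from the singularity at y = 0.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(size=500_000)
y = x**2                                   # Y = g(X) = X^2, monotone for x > 0

grid = np.linspace(0.3, 4.0, 40)
f_y = np.exp(-np.sqrt(grid)) / (2 * np.sqrt(grid))   # density from formula 17

hist, edges = np.histogram(y, bins=np.linspace(0, 4, 81), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
# empirical histogram vs. analytic density: the gap should be small
print(np.max(np.abs(np.interp(grid, centers, hist) - f_y)))
```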

UNIT-II (TWO DIMENSIONAL RANDOM VARIABLES)

1) \sum_i \sum_j p_{ij} = 1 (discrete random variables)
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1 (continuous random variables)
P  x, y 
2) Conditional probability function X given Y P  X  xi / Y  yi   .
P( y)
P  x, y 
Conditional probability function Y given X P Y  yi / X  xi   .
P( x)

P  X  a,Y  b
P  X  a / Y  b 
P(Y  b)
f ( x, y)
3) Conditional density function of X given Y, f ( x / y)  .
f ( y)
f ( x, y)
Conditional density function of Y given X, f ( y / x)  .
f ( x)
4) If X and Y are independent random variables, then
f(x, y) = f(x) \cdot f(y) (continuous random variables)
P(X = x, Y = y) = P(X = x) \cdot P(Y = y) (discrete random variables)


5) Joint probability density function: P(a \le X \le b, c \le Y \le d) = \int_c^d \int_a^b f(x, y)\,dx\,dy.
P(X \le a, Y \le b) = \int_{-\infty}^{b} \int_{-\infty}^{a} f(x, y)\,dx\,dy





6) Marginal density function of X: f(x) = f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy
Marginal density function of Y: f(y) = f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx

7) P(X + Y \ge 1) = 1 - P(X + Y < 1)

8) Correlation coefficient (discrete): \rho(x, y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}, where
Cov(X, Y) = \frac{1}{n}\sum XY - \bar{X}\bar{Y}, \quad \sigma_X = \sqrt{\frac{1}{n}\sum X^2 - \bar{X}^2}, \quad \sigma_Y = \sqrt{\frac{1}{n}\sum Y^2 - \bar{Y}^2}
9) Correlation coefficient (continuous): \rho(x, y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}, where
Cov(X, Y) = E[XY] - E[X]\,E[Y], \quad \sigma_X = \sqrt{Var(X)}, \quad \sigma_Y = \sqrt{Var(Y)}

10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.


11) E[X] = \int_{-\infty}^{\infty} x f(x)\,dx, \quad E[Y] = \int_{-\infty}^{\infty} y f(y)\,dy, \quad E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f(x, y)\,dx\,dy

12) Regression for discrete random variables:
Regression line X on Y: x - \bar{x} = b_{xy}(y - \bar{y}), \quad b_{xy} = \frac{\sum(x - \bar{x})(y - \bar{y})}{\sum(y - \bar{y})^2}
Regression line Y on X: y - \bar{y} = b_{yx}(x - \bar{x}), \quad b_{yx} = \frac{\sum(x - \bar{x})(y - \bar{y})}{\sum(x - \bar{x})^2}
Correlation through the regression coefficients: \rho = \pm\sqrt{b_{xy} \cdot b_{yx}}. Note: \rho(x, y) = r(x, y).
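The regression coefficients b_{xy} and b_{yx} and the identity \rho^2 = b_{xy} b_{yx} can be verified on sample data. An added sketch (assumes NumPy; the data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = 1.5 * x + rng.normal(scale=0.5, size=500)

dx, dy = x - x.mean(), y - y.mean()
b_yx = np.sum(dx * dy) / np.sum(dx**2)     # regression coefficient of Y on X
b_xy = np.sum(dx * dy) / np.sum(dy**2)     # regression coefficient of X on Y
rho = np.corrcoef(x, y)[0, 1]
print(rho**2, b_xy * b_yx)                 # rho^2 = b_xy * b_yx
```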


13) Regression for continuous random variables:
Regression line X on Y: x - E(X) = b_{xy}(y - E(Y)), \quad b_{xy} = r\,\frac{\sigma_x}{\sigma_y}
Regression line Y on X: y - E(Y) = b_{yx}(x - E(X)), \quad b_{yx} = r\,\frac{\sigma_y}{\sigma_x}
Regression curve X on Y: x = E(x / y) = \int_{-\infty}^{\infty} x\, f(x / y)\,dx
Regression curve Y on X: y = E(y / x) = \int_{-\infty}^{\infty} y\, f(y / x)\,dy

14) Transformation of random variables:
f_Y(y) = f_X(x) \left|\frac{dx}{dy}\right| (one-dimensional random variable)
f_{UV}(u, v) = f_{XY}(x, y) \left|\frac{\partial(x, y)}{\partial(u, v)}\right| (two-dimensional random variables)

15) Central limit theorem (Liapounoff's form):
If X_1, X_2, ..., X_n is a sequence of independent R.V.s with E[X_i] = \mu_i and Var(X_i) = \sigma_i^2, i = 1, 2, ..., n, and S_n = X_1 + X_2 + ... + X_n, then under certain general conditions S_n follows a normal distribution with mean \mu = \sum_{i=1}^{n} \mu_i and variance \sigma^2 = \sum_{i=1}^{n} \sigma_i^2 as n \to \infty.
16) Central limit theorem (Lindeberg-Levy's form):
If X_1, X_2, ..., X_n is a sequence of independent, identically distributed R.V.s with E[X_i] = \mu and Var(X_i) = \sigma^2, i = 1, 2, ..., n, and S_n = X_1 + X_2 + ... + X_n, then under certain general conditions S_n follows a normal distribution with mean n\mu and variance n\sigma^2 as n \to \infty.
Note: z = \frac{S_n - n\mu}{\sigma\sqrt{n}} (for the sum of n variables), \quad z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} (for the sample mean)
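A minimal simulation of the Lindeberg-Levy statement (an added sketch, assuming NumPy): sums of i.i.d. Uniform(0, 1) variables, standardized as in the note above, should have mean near 0 and variance near 1.

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 50, 100_000
x = rng.uniform(size=(reps, n))          # Uniform(0,1): mu = 1/2, sigma^2 = 1/12
s = x.sum(axis=1)                        # S_n for each repetition
z = (s - n * 0.5) / np.sqrt(n / 12)      # z = (S_n - n mu) / (sigma sqrt(n))
print(z.mean(), z.var())                 # should be close to 0 and 1
```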


UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)

1) Random Process:
A random process is a collection of random variables {X(s, t)} that are functions of a real variable, namely time t, where s ∈ S and t ∈ T.

2) Classification of Random Processes:


We can classify the random process according to the characteristics of time t
and the random variable X. We shall consider only four cases based on t and X
having values in the ranges -∞< t <∞ and -∞ < x < ∞.

Continuous random process

Continuous random sequence

Discrete random process

Discrete random sequence

Continuous random process:


If X and t are continuous, then we call X(t) a Continuous Random Process.
Example: If X(t) represents the maximum temperature at a place in the
interval (0,t), {X(t)} is a Continuous Random Process.
Continuous Random Sequence:
A random process for which X is continuous but time takes only discrete values is
called a Continuous Random Sequence.
Example: If Xn represents the temperature at the end of the nth hour of a day, then
{Xn, 1≤n≤24} is a Continuous Random Sequence.
Discrete Random Process:
If X assumes only discrete values and t is continuous, then we call such random
process {X(t)} as Discrete Random Process.
Example: If X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, ...}.
Discrete Random Sequence:
A random process in which both the random variable and time are discrete is called
Discrete Random Sequence.
Example: If Xn represents the outcome of the nth toss of a fair die, then {Xn : n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, ...} and S = {1, 2, 3, 4, 5, 6}.


3) Condition for a stationary process: E[X(t)] = constant, Var[X(t)] = constant.
If the process is not stationary, it is called evolutionary.

4) Wide Sense Stationary (or) Weak Sense Stationary (or) Covariance Stationary:
A random process is said to be WSS or Covariance Stationary if it satisfies the
following conditions.
i) The mean of the process is constant, i.e. E[X(t)] = constant.
ii) The autocorrelation function depends only on \tau, i.e. R_{XX}(\tau) = E[X(t) \cdot X(t + \tau)].
5) Time average:
The time average of a random process \{X(t)\} is defined as \bar{X}_T = \frac{1}{2T} \int_{-T}^{T} X(t)\,dt.
If the interval is (0, T), then the time average is \bar{X}_T = \frac{1}{T} \int_0^T X(t)\,dt.
6) Ergodic process:
A random process \{X(t)\} is called ergodic if all its ensemble averages are interchangeable with the corresponding time average \bar{X}_T.
7) Mean ergodic:
Let \{X(t)\} be a random process with mean E[X(t)] = \mu and time average \bar{X}_T; then \{X(t)\} is said to be mean ergodic if \bar{X}_T \to \mu as T \to \infty, i.e. E[X(t)] = \lim_{T \to \infty} \bar{X}_T.
Note: \lim_{T \to \infty} Var(\bar{X}_T) = 0 (by the mean ergodic theorem).
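An added sketch of mean ergodicity (assumes NumPy; the integral is approximated by a crude Riemann sum): for X(t) = \mu plus zero-mean noise, the time average \bar{X}_T approaches \mu as T grows.

```python
import numpy as np

rng = np.random.default_rng(6)
mu, dt = 3.0, 0.01
for T in (10, 100, 10_000):
    t = np.arange(0, T, dt)
    xt = mu + rng.normal(size=t.size)    # stationary process with mean mu
    print(T, xt.mean())                  # time average -> mu as T -> infinity
```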
8) Correlation ergodic process:
The stationary process \{X(t)\} is said to be correlation ergodic if the process \{Y(t)\} is mean ergodic, where Y(t) = X(t)\,X(t + \tau), i.e. E[Y(t)] = \lim_{T \to \infty} \bar{Y}_T, where \bar{Y}_T is the time average of Y(t).


9) Autocovariance function:
C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)]\,E[X(t + \tau)]
10) Mean and variance of the time average:
Mean: E[\bar{X}_T] = \frac{1}{T} \int_0^T E[X(t)]\,dt
Variance: Var(\bar{X}_T) = \frac{1}{2T} \int_{-2T}^{2T} C_{XX}(\tau)\left(1 - \frac{|\tau|}{2T}\right) d\tau


11) Markov process:
A random process in which the future value depends only on the present value and not on the past values is called a Markov process. Symbolically,
P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0] = P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n],
where t_0 \le t_1 \le t_2 \le ... \le t_n \le t_{n+1}.
12) Markov chain:
If, for all n, P[X_n = a_n / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0] = P[X_n = a_n / X_{n-1} = a_{n-1}], then the process \{X_n\}, n = 0, 1, 2, ..., is called a Markov chain, and a_0, a_1, a_2, ..., a_n, ... are called the states of the Markov chain.
13) Transition Probability Matrix (tpm):
When the Markov chain is homogeneous, the one-step transition probability is denoted by P_{ij}. The matrix P = \{P_{ij}\} is called the transition probability matrix.
14) Chapman-Kolmogorov theorem:
If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^{(n)} is equal to P^n, i.e. P_{ij}^{(n)} = \left[P^n\right]_{ij}.
15) Markov chain property: If \pi = (\pi_1, \pi_2, \pi_3), then \pi P = \pi and \pi_1 + \pi_2 + \pi_3 = 1.
16) Poisson process:
If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process \{X(t)\} is called a Poisson process, provided the following postulates are satisfied:
(i) P[1 occurrence in (t, t + \Delta t)] = \lambda \Delta t + O(\Delta t)
(ii) P[0 occurrences in (t, t + \Delta t)] = 1 - \lambda \Delta t + O(\Delta t)
(iii) P[2 or more occurrences in (t, t + \Delta t)] = O(\Delta t)
(iv) X(t) is independent of the number of occurrences of the event in any interval.
17) Probability law of the Poisson process: P[X(t) = x] = \frac{e^{-\lambda t} (\lambda t)^x}{x!}, \quad x = 0, 1, 2, ...
Mean E[X(t)] = \lambda t, \quad E[X^2(t)] = \lambda^2 t^2 + \lambda t, \quad Var[X(t)] = \lambda t.

UNIT-IV (CORRELATION AND SPECTRAL DENSITY)

R_{XX}(\tau) - autocorrelation function
S_{XX}(\omega) - power spectral density (or spectral density)
R_{XY}(\tau) - cross-correlation function
S_{XY}(\omega) - cross power spectral density

1) Autocorrelation to power spectral density (spectral density):
S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-i\omega\tau}\, d\tau
2) Power spectral density to autocorrelation:
R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega)\, e^{i\omega\tau}\, d\omega

3) Condition for X(t) and X(t + \tau) to be uncorrelated:
C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)]\,E[X(t + \tau)] = 0

4) Cross power spectrum to cross-correlation:
R_{XY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega)\, e^{i\omega\tau}\, d\omega
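These transform pairs can be sanity-checked numerically. For R_{XX}(\tau) = e^{-a|\tau|}, formula 1 gives S_{XX}(\omega) = \frac{2a}{a^2 + \omega^2}; the added sketch below (assumes NumPy) evaluates the integral by quadrature and compares against the closed form.

```python
import numpy as np

a = 1.5
tau = np.linspace(-40, 40, 200_001)   # wide enough that e^{-a|tau|} ~ 0 at the ends
R = np.exp(-a * np.abs(tau))          # autocorrelation R_XX(tau)

for w in (0.0, 1.0, 3.0):
    # S_XX(w) = integral of R(tau) e^{-i w tau} d tau (trapezoidal quadrature)
    S = np.trapz(R * np.exp(-1j * w * tau), tau)
    print(w, S.real, 2 * a / (a**2 + w**2))   # numeric vs. analytic
```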

5) General formulas:
i) \int e^{ax} \cos bx\, dx = \frac{e^{ax}}{a^2 + b^2} (a \cos bx + b \sin bx)
ii) \int e^{ax} \sin bx\, dx = \frac{e^{ax}}{a^2 + b^2} (a \sin bx - b \cos bx)
iii) x^2 + ax = \left(x + \frac{a}{2}\right)^2 - \frac{a^2}{4}
iv) \sin\theta = \frac{e^{i\theta} - e^{-i\theta}}{2i}
v) \cos\theta = \frac{e^{i\theta} + e^{-i\theta}}{2}


UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)

1) Linear system:
f is called a linear system if it satisfies f[a_1 X_1(t) + a_2 X_2(t)] = a_1 f[X_1(t)] + a_2 f[X_2(t)].

2) Time-invariant system:
Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time-invariant system.
3) Relation between input X(t) and output Y(t):
Y(t) = \int_{-\infty}^{\infty} h(u)\, X(t - u)\, du,
where h(u) is the system weighting function.


4) Relation between the power spectra of the input X(t) and the output Y(t):
S_{YY}(\omega) = S_{XX}(\omega)\, |H(\omega)|^2
If H(\omega) is not given, use H(\omega) = \int_{-\infty}^{\infty} h(t)\, e^{-j\omega t}\, dt.
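A rough numerical check of S_{YY}(\omega) = |H(\omega)|^2 S_{XX}(\omega), added here as an illustration (assumes NumPy and SciPy; the FIR weights are arbitrary): pass white noise through a known filter and compare the ratio of estimated spectra with |H(\omega)|^2.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(8)
x = rng.normal(size=1_000_000)              # white noise input: flat S_XX
h = np.array([0.25, 0.5, 0.25])             # system weighting function h(u)
y = np.convolve(x, h, mode="valid")         # Y(t) = sum_u h(u) X(t - u)

f, Pxx = signal.welch(x, nperseg=4096)      # input spectral density estimate
_, Pyy = signal.welch(y, nperseg=4096)      # output spectral density estimate
_, H = signal.freqz(h, worN=2 * np.pi * f)  # H(omega) at the same frequencies

mask = Pxx > 0
# S_YY / S_XX should track |H(omega)|^2 up to estimation noise
print(np.median(np.abs(Pyy[mask] / Pxx[mask] - np.abs(H[mask])**2)))
```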

5) Contour integral (a useful result):
\int_{-\infty}^{\infty} \frac{e^{imx}}{a^2 + x^2}\, dx = \frac{\pi}{a} e^{-ma}, \quad m > 0
6) F^{-1}\left[\frac{1}{a^2 + \omega^2}\right] = \frac{1}{2a} e^{-a|\tau|} (from the Fourier transform)


UNIT I: PROBABILITY AND RANDOM VARIABLES

PART B QUESTIONS

1. A random variable X has the following probability distribution:

X:     0    1    2     3     4     5      6       7
P(X):  0    k    2k    2k    3k    k^2    2k^2    7k^2 + k

Find (i) the value of k, (ii) P[1.5 < X < 4.5 / X > 2] and (iii) the smallest value of \lambda for which P(X \le \lambda) > 1/2.
2. A bag contains 5 balls and it is not known how many of them are white. Two balls are drawn at random from the bag and are found to be white. What is the chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = \frac{1}{2} e^{-x/2}, x > 0. Find the moment generating function, mean and variance.
4. A die is tossed until a 6 appears. What is the probability that it must be tossed more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He gets Rs. 10 for each white ball and Rs. 5 for each black ball. Find his expectation.
6. In a certain binary communication channel, owing to noise, the probability that a transmitted zero is received as zero is 0.95 and the probability that a transmitted one is received as one is 0.9. If the probability that a zero is transmitted is 0.4, find the probability that (i) a one is received, (ii) a one was transmitted given that a one was received.
7. Find the MGF and the rth moment for the distribution whose PDF is f(x) = k e^{-x}, x > 0. Find also the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. The second bag contains 2 white, 3 red and 5 black balls, and the third bag contains 3 white, 4 red and 2 black balls. One bag is chosen at random and 3 balls are drawn from it. Out of the 3 balls, 2 are white and 1 is red. What are the probabilities that they were taken from the first bag, the second bag and the third bag?
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1. Find (i) P(X < 1/2), (ii) P(1/4 < X < 1/2), (iii) P(X > 3/4 / X > 1/2).
10. If the density function of a continuous random variable X is given by
f(x) = ax,       0 \le x \le 1
     = a,        1 \le x \le 2
     = 3a - ax,  2 \le x \le 3
     = 0,        otherwise
(1) Find a. (2) Find the cdf of X.

11. If the moments of a random variable X are defined by E(X^r) = 0.6, r = 1, 2, ..., show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X \ge 2) = 0.


12. In a continuous distribution, the probability density is given by f(x) = kx(2 - x), 0 < x < 2. Find k, the mean, the variance and the distribution function.

13. The cumulative distribution function of a random variable X is given by
F(x) = 0,                           x < 0
     = x^2,                         0 \le x \le 1/2
     = 1 - \frac{3}{25}(3 - x)^2,   1/2 \le x \le 3
     = 1,                           x \ge 3
Find the pdf of X and evaluate P(|X| \le 1) using both the pdf and the cdf.

14. Find the moment generating function of the geometric random variable with pdf f(x) = p q^{x-1}, x = 1, 2, 3, ..., and hence find its mean and variance.

15. A box contains 5 red and 4 white balls. A ball is taken out of the box at random and kept outside. If once again a ball is drawn from the box, what is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function M_X(t) = \left(\frac{1}{4} + \frac{3}{4} e^t\right)^5. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to decay exponentially at rate \alpha, so the following pdf is proposed: f(x) = C e^{-\alpha|x|}, -\infty < x < \infty. Find C and E(X).

18. Find the MGF of a binomial distribution and hence find the mean and variance. Find the recurrence relation of central moments for a binomial distribution.

19. The number of monthly breakdowns of a computer is a RV having a Poisson distribution with mean equal to 1.8. Find the probability that this computer will function for a month (a) without a breakdown, (b) with only one breakdown, (c) with at least one breakdown.
20. Find the MGF and hence find the mean and variance of the binomial distribution.

21. State and prove the additive property of Poisson random variables.

22. If X and Y are two independent Poisson random variables, show that the probability distribution of X given X + Y follows a binomial distribution.
23. Find the MGF and hence find the mean and variance of a geometric distribution.

24. State and prove the memoryless property of the geometric distribution.

25. Find the mean and variance of a uniform distribution.


UNIT- II TWO DIMENSIONAL RANDOM VARIABLES


Part- B
1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1
          = 0, otherwise,
compute the correlation coefficient between X and Y.

2. The joint p.d.f. of a two-dimensional random variable (X, Y) is given by f(x, y) = (8/9)xy, 1 \le x \le y \le 2. Find the marginal density functions of X and Y. Find also the conditional density functions of Y / X = x and X / Y = y.

3. The joint probability density function of X and Y is given by f(x, y) = (x + y)/3, 0 \le x \le 1, 0 < y < 2. Obtain the regression of Y on X and of X on Y.
4. If the joint p.d.f. of two random variables is given by f(x_1, x_2) = 6 e^{-2x_1 - 3x_2}, x_1 > 0, x_2 > 0, find the probability that the first random variable will take on a value between 1 and 2 and the second random variable will take on a value between 2 and 3. Also find the probability that the first random variable will take on a value less than 2 and the second random variable will take on a value greater than 2.

5. Two random variables have the joint p.d.f. f(x_1, x_2) = (2/3)(x_1 + 2x_2), 0 < x_1 < 1, 0 < x_2 < 1.
6. Find the value of k if f(x, y) = k xy for 0 < x, y < 1 is to be a joint density function. Find P(X + Y < 1). Are X and Y independent?
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y^2), 0 < x < 1, 0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
8. Two random variables X and Y have the joint p.d.f. f(x, y) = x^2 + xy/3, 0 \le x \le 1, 0 \le y \le 2. Prove that X and Y are not independent. Find the conditional density function.

9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^{-(x^2 + y^2)}, x, y \ge 0. Find the p.d.f. of \sqrt{X^2 + Y^2}.

10. Two random variables X and Y have the joint p.d.f. f(x, y) = 2 - x - y, 0 < x < 1, 0 < y < 1. Find the marginal probability density functions of X and Y. Also find the conditional density functions and the covariance between X and Y.

11. Let X and Y be two random variables each taking three values -1, 0 and 1 and having the joint p.d.f.

            X
Y        -1     0      1
-1        0     0.1    0.1
 0        0.2   0.2    0.2
 1        0     0.1    0.1

Prove that X and Y have different expectations. Also prove that X and Y are uncorrelated, and find Var X and Var Y.


12. 20 dice are thrown. Find the approximate probability that the sum obtained is between 65 and 75 using the central limit theorem.

13. Examine whether the variables X and Y are independent whose joint density is f(x, y) = x e^{-xy - x}, 0 < x, y < \infty.
14. Let X and Y be independent standard normal random variables. Find the pdf of Z = X / Y.
15. Let X and Y be independent uniform random variables over (0, 1). Find the PDF of Z = X + Y.

UNIT-III CLASSIFICATION OF RANDOM PROCESS


PART B

1. The process {X(t)} whose probability distribution is given by
P[X(t) = n] = \frac{(at)^{n-1}}{(1 + at)^{n+1}}, \quad n = 1, 2, ...
            = \frac{at}{1 + at},               \quad n = 0
Show that it is not stationary.

2. A raining process is considered as a two-state Markov chain. If it rains, it is considered to be in state 0, and if it does not rain, the chain is in state 1. The transition probability matrix of the Markov chain is defined as
P = | 0.6  0.4 |
    | 0.2  0.8 |
Find the probability that it will rain for three days from today, assuming that it is raining today. Find also the unconditional probability that it will rain after three days, with the initial probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.
3. Let X(t) be a Poisson process with arrival rate \lambda. Find E{[X(t) - X(s)]^2} for t > s.
4. Let {X_n ; n = 1, 2, ...} be a Markov chain on the space S = {1, 2, 3} with one-step transition probability matrix
P = |  0    1    0  |
    | 1/2   0   1/2 |
    |  1    0    0  |
(1) Sketch the transition diagram. (2) Is the chain irreducible? Explain. (3) Is the chain ergodic? Explain.
5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where U and V are independent random variables for which E(U) = E(V) = 0 and E(U^2) = E(V^2) = 1. (1) Find the autocovariance function of X(t). (2) Is X(t) wide sense stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and
variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency \omega_0 with a random phase \theta that is uniformly distributed over (0, 2\pi). The received carrier signal is X(t) = A cos(\omega_0 t + \theta). Show that the process is second-order stationary.
8. Assume that a computer system is in any one of the three states: busy, idle and under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day, we get the transition probability matrix
P = | 0.6  0.2  0.2 |
    | 0.1  0.8  0.1 |
    | 0.6  0    0.4 |
Find out the 3rd step transition probability matrix. Determine the limiting probabilities.

9. Given a random variable \Omega with density f(\omega) and another random variable \Theta uniformly distributed in (-\pi, \pi) and independent of \Omega, and X(t) = a cos(\Omega t + \Theta), prove that {X(t)} is a WSS process.
10. A man either drives a car or catches a train to go to office each day. He never
goes 2 days in a row by train but if he drives one day, then the next day he is just
as likely to drive again as he is to travel by train. Now suppose that on the first
day of the week, the man tossed a fair die and drove to work iff a 6 appeared.
Find (1) the probability that he takes a train on the 3rd day. (2) the probability that
he drives to work in the long run.
11. Show that the process X(t) = A cos \omega t + B sin \omega t (where A and B are random variables) is wide sense stationary if (1) E(A) = E(B) = 0, (2) E(A^2) = E(B^2) and (3) E(AB) = 0.
12. Find the probability distribution of the Poisson process.
13. Prove that the sum of two independent Poisson processes is again a Poisson process.
14. Write the classifications of random processes, with examples.

UNIT-IV CORRELATION AND SPECTRAL DENSITIES
