Laboratory Manual
Manual made by
Dr. V.B.MALODE
Prof. V. A.KULKARNI
MGM’S
Jawaharlal Nehru Engineering College
N-6, CIDCO, Aurangabad
Department of Electronics & Telecommunication
2. To provide various platforms to students for cultivating professional attitude and ethical
values.
3. To create a strong foundation among students that will enable them to pursue the career of
their choice.
Technical Document
Recommended by,
HOD
Approved by,
Principal
Copies:
• Departmental Library
• Laboratory
• HOD
• Principal
FOREWORD
It is my great pleasure to present this laboratory manual to third-year engineering students for
the subject of Information Theory and Coding, prepared keeping in view the broad coverage
required for visualizing the concepts of ITC.
As students, many of you may have questions in mind regarding the subject; this manual
attempts to answer exactly those questions.
Faculty members are also advised to cover these aspects at the initial stage itself. It will greatly
relieve them later, as much of the load will be carried by the enthusiasm of the students
once they are conceptually clear. Students are advised to go through this manual thoroughly
rather than only the topics mentioned in the syllabus, as practical aspects are the key to
understanding and conceptual visualization of the theoretical aspects covered in the books.
H.O.D.
SUBJECT INDEX
I   Pre-Requisite 1
II  Pre-Requisite 2: Introduction to Matlab
III Lab Exercises
    1. Determination of entropy of a given source
    2. Determination of various entropies and mutual information of a given channel (Noise free channel)
    3. Determination of various entropies and mutual information of a given channel (Binary symmetric channel)
    4. Generation and evaluation of variable length source coding using MATLAB (Huffman coding and decoding)
    5. Coding & decoding of Linear block codes
    6. Coding & decoding of Cyclic codes
    7. Coding and decoding of convolutional codes
    8. Coding and decoding of BCH codes
IV  Post-Requisite
    1. Post-requisite exercises (Simulink)
    2. Questions based on the subject
    3. Conduction of viva voce examination
    4. Evaluation and marking scheme
1. Lab work completed during the previous session should be corrected during the
next lab session.
2. Students should be guided and helped whenever they face difficulties.
3. Prompt submission should be encouraged through marking and evaluation
patterns that benefit sincere students.
Course Outcomes:
Students will be able to
1. Demonstrate various entropies and information.
2. Apply source coding techniques.
3. Construct codes using different coding techniques.
4. Explain various coding schemes for text, speech and audio.
OUTPUT: X =
1 2 3
4 5 6
Y=
6 5 4
3 2 1
F=
7 7 7
7 7 7
7) Convert the rows of X into columns using the transpose operator.
COMMAND: Z = X'
OUTPUT: Z =
1 4
2 5
3 6
8) Generate the following matrices.
COMMAND: A=[1,2,3;4,5,6;7,8,9]
C=[1^3,2+sqrt(3),3*sin(1);exp(2),17/3,pi+3;1/3,2-sqrt(3),-7*cos(pi/7)]
x=[1 2 3 4 5 6]'
OUTPUT: A=
1 2 3
4 5 6
7 8 9
C=
1.0000 3.7321 2.5244
7.3891 5.6667 6.1416
0.3333 0.2679 -6.3068
x=
1
2
3
4
5
6
9) Generate 5x3 matrix whose first three rows are rows of A and last two rows are all ones.
COMMAND: w=[A;ones(2,3)]
OUTPUT: w=
1 2 3
4 5 6
7 8 9
1 1 1
1 1 1
F=
1 1 5 10 1 1
2 2 5 10 2 2
5 10 1 1 5 10
5 10 2 2 5 10
G=
1.0000 1.0000 5.0000 10.0000 1.0000 1.0000 0.8147
2.0000 2.0000 5.0000 10.0000 2.0000 2.0000 0.9058
5.0000 10.0000 1.0000 1.0000 5.0000 10.0000 0.1270
5.0000 10.0000 2.0000 2.0000 5.0000 10.0000 0.9134
10) Using matrix A, find: 1. A(1)  2. A(2)  3. A(9)  4. A(1,1)  5. A(2,1)  6. A(1,2)  7. A(1:3, 2)
8. A([1 3 4], [1 3])  9. a = A(:, 2)  10. a = A(:)  11. A(2,end)
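As a quick reference, a minimal sketch of how a few of these indexing forms evaluate for the 3x3 matrix A of exercise 8 (MATLAB uses 1-based, column-major indexing, so a single index runs down the columns; note that A([1 3 4], [1 3]) produces an error for a 3x3 matrix, since row 4 does not exist):
A = [1,2,3; 4,5,6; 7,8,9];
A(1)        % 1   -- first element (column-major linear index)
A(2)        % 4   -- second element, i.e. row 2 of column 1
A(9)        % 9   -- last element
A(1,2)      % 2   -- row 1, column 2
A(1:3,2)    % [2;5;8] -- all rows of column 2
a = A(:);   % 9x1 column vector: columns stacked, [1;4;7;2;5;8;3;6;9]
A(2,end)    % 6   -- row 2, last column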
OUTPUT: Z=
1 2 3 4 5 6
7 8 9 10 11 12
13 14 15 16 17 18
19 20 21 22 23 24
25 26 27 28 29 30
31 32 33 34 35 36
Z=
1 2 6
7 8 12
13 14 18
19 20 24
25 26 30
31 32 36
EXPERIMENT NO. 1
Determination of Entropy
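For a discrete memoryless source emitting symbols with probabilities p(1), ..., p(i), the quantity computed below is the source entropy H(X) = sum over n of p(n)*log2(1/p(n)) bits/symbol. For example, p = [2/3 1/3] gives H(X) = (2/3)log2(3/2) + (1/3)log2(3) = 0.9183 bits/symbol.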
Conclusion:
Program:
% Find the entropy of a given source
clc;
clear;
close all;
i = input('Enter no. of elements=');
p = input('Enter probabilities=');   % must sum to 1
H = 0;
for n = 1:i
    H = H + p(n)*log2(1/p(n));   % accumulate p*log2(1/p) for each symbol
end
disp('H(x): ');
disp(H);
output:
Enter no. of elements=2
Enter probabilities=[2/3,1/3]
H(x):
0.9183
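Since each term is p*log2(1/p), the loop can be cross-checked against MATLAB's vectorized form, which returns the same value:
H = sum(p .* log2(1 ./ p))   % vectorized entropy, e.g. 0.9183 for p = [2/3 1/3]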
Experiment no.2
Determination of various entropies and mutual information of the given channel.
Aim: Write a program for the determination of various entropies and mutual information of a
given channel. Test various types of channels, such as
a) Noise free channel b) Error free channel
c) Binary symmetric channel
Compare the channel capacity of the above channels.
Apparatus: PC, MATLAB/C
Theory:
1. Explain the discrete memoryless channel.
2. Explain various types of channels with their equations and neat diagrams.
3. Explain mutual information in detail with its equation.
4. Explain the relations between the various entropies and the mutual information, giving
their equations.
Algorithm:
I) Entropies:
1. Input the no. of inputs of a channel.
2. Input the no. of outputs of a channel.
3. Input the channel matrix. Test the condition that sum of all the entries in each row
should be equal to 1.
4. Input the channel input probabilities. i.e. P[X].
5. Calculate the entropy of the channel input. i.e. H(X)
6. Calculate output probability matrix P[Y], by multiplying input probability matrix by
channel matrix.
7. Also calculate entropy of channel output. i.e. H(Y).
8. Convert the input probability matrix into diagonal form, i.e. P[X]d.
9. Calculate the joint probability matrix by multiplying the input probability matrix in
diagonal form by the channel matrix.
10. Calculate the joint entropy using H(X,Y) = sum over j,k of p(xj,yk)*log2(1/p(xj,yk)).
11. Calculate the conditional entropies H(Y/X) and H(X/Y).
12. Calculate the mutual information as
I(X;Y) = H(X) - H(X/Y) or
I(X;Y) = H(Y) - H(Y/X)
Conclusion:
Program:
% Program for entropies and mutual information of a noise free channel
clc;
clear;
close all;
i = input('Enter no. of elements=');
q = input('Enter joint probabilities matrix=');
% input probabilities P(x): row sums of the joint matrix
for n = 1:i
    w = 0;
    for m = 1:i
        p(n) = w + q(n,m);
        w = p(n);
    end
end
disp('P(x):');
disp(p);
% entropy H(X)
H = 0;
for n = 1:i
    H = H + p(n)*log2(1/p(n));
end
disp('H(x): ');
disp(H);
% conditional probability matrix P(Y/X): q(n,m)/p(n)
for n = 1:i
    for m = 1:i
        a(n,m) = q(n,m)/p(n);
    end
end
disp('P(Y/X):');
disp(a);
% conditional entropy H(Y/X)
H1 = 0;
for n = 1:i
    for m = 1:i
        if a(n,m) > 0   % skip zero entries (their 0*log term is 0)
            H1 = H1 + q(n,m)*log2(1/a(n,m));
        end
    end
end
disp('H(Y/X):');
disp(H1);
% output probabilities P(Y): column sums of the joint matrix
for n = 1:i
    w = 0;
    for m = 1:i
        s(n) = w + q(m,n);
        w = s(n);
    end
end
disp('P(Y):');
disp(s);
% entropy H(Y)
H2 = 0;
for n = 1:i
    H2 = H2 + s(n)*log2(1/s(n));
end
disp('H(Y): ');
disp(H2);
% mutual information I(X;Y) = H(Y) - H(Y/X)
MI = H2 - H1;
disp('MI=');
disp(MI);
Output:
Enter no. of elements=3
Enter joint probabilities matrix=[.2 0 0;0 .4 0;0 0 .4]
P(x):
0.2000 0.4000 0.4000
H(x):
1.5219
P(Y/X):
1 0 0
0 1 0
0 0 1
H(Y/X):
0
P(Y):
0.2000 0.4000 0.4000
H(Y):
1.5219
MI=
1.5219
Experiment no.3
Determination of various entropies and mutual information of the given BSC channel.
Aim: Write a program for determination of various entropies and mutual information of a given
channel. (Binary symmetric channel).
Apparatus: PC, MATLAB/C
Theory:
1. Explain the BSC in detail with a neat diagram.
2. Find the capacity of a BSC.
Algorithm:
I) Entropies:
1. Input the no. of inputs of a channel.
2. Input the no. of outputs of a channel.
3. Input the channel matrix. Test the condition that sum of all the entries in each row should
be equal to 1.
4. Input the channel input probabilities. i.e. P[X].
5. Calculate the entropy of the channel input. i.e. H(X)
6. Calculate output probability matrix P[Y], by multiplying input probability matrix by
channel matrix.
7. Also calculate entropy of channel output. i.e. H(Y).
8. Convert the input probability matrix into diagonal form, i.e. P[X]d.
9. Calculate the joint probability matrix by multiplying the input probability matrix in diagonal
form by the channel matrix.
10. Calculate the joint entropy using H(X,Y) = sum over j,k of p(xj,yk)*log2(1/p(xj,yk)).
11. Calculate the conditional entropies H(Y/X) and H(X/Y).
12. Calculate the mutual information as
I(X;Y) = H(X) - H(X/Y) or
I(X;Y) = H(Y) - H(Y/X)
Conclusion:
Program:
% Program for entropies and mutual information of a binary symmetric channel
clc;
clear;
close all;
i = input('Enter no. of elements=');
p = input('Enter probability=');
q = input('Enter conditional probabilities matrix=');
% entropy H(X)
H = 0;
for n = 1:i
    H = H + p(n)*log2(1/p(n));
end
disp('H(x): ');
disp(H);
% joint probability matrix P(X,Y): a(n,m) = p(n)*q(n,m)
for n = 1:i
    for m = 1:i
        a(n,m) = p(n)*q(n,m);
    end
end
disp('P(X,Y):');
disp(a);
% conditional entropy H(Y/X)
H1 = 0;
for n = 1:i
    for m = 1:i
        H1 = H1 + a(n,m)*log2(1/q(n,m));
    end
end
disp('H(Y/X):');
disp(H1);
% output probabilities P(Y): column sums of the joint matrix
for n = 1:i
    w = 0;
    for m = 1:i
        s(n) = w + a(m,n);
        w = s(n);
    end
end
disp('P(Y):');
disp(s);
% entropy H(Y)
H2 = 0;
for n = 1:i
    H2 = H2 + s(n)*log2(1/s(n));
end
disp('H(Y): ');
disp(H2);
% mutual information I(X;Y) = H(Y) - H(Y/X)
MI = H2 - H1;
disp('MI=');
disp(MI);
19
ETC 324: Information Theory and coding 2018
Output:
Enter no. of elements=2
Enter probability= [3/4 1/4]
Enter conditional probabilities matrix=[1/3 2/3;2/3 1/3]
H(x):
0.8113
P(X,Y):
0.2500 0.5000
0.1667 0.0833
H(Y/X):
0.9183
P(Y):
0.4167 0.5833
H(Y):
0.9799
MI=
0.0616
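As a cross-check on theory question 2 above, the capacity of a BSC with crossover probability p is C = 1 + p*log2(p) + (1-p)*log2(1-p) bits per channel use, attained with equiprobable inputs. A one-line MATLAB check for the crossover probability 2/3 used here (illustrative):
pe = 2/3;
C = 1 + pe*log2(pe) + (1-pe)*log2(1-pe)   % = 0.0817 bits per channel use
This upper-bounds the mutual information of 0.0616 bits obtained above with the non-uniform input P(X) = [3/4 1/4].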
Experiment no.4
Generation and evaluation of variable length source coding using MATLAB (Huffman coding and decoding)
Apparatus: PC, MATLAB/C
Theory:
1. Explain variable length coding.
2. Explain the Huffman coding technique.
3. Solve the given example theoretically and verify it using a MATLAB program.
4. Explain the commands: 1. huffmandict  2. huffmanenco  3. huffmandeco
Algorithm:
1. Start.
2. Input the total number of probabilities.
3. Arrange the messages in decreasing order of probabilities.
4. Add the last two probabilities.
5. Assign them '0' and '1'.
6. Sort the combined probability together with the remaining probabilities.
7. If the combined result is equal to the probability of a symbol, place it at the top.
8. Repeat from step 4 until the sum reaches 1.
9. To find the code for a particular symbol, trace the path of that symbol's probability and
write the code in reverse order.
10. Find the entropy, average code word length and efficiency.
11. Stop.
Conclusion:
Program:
1. %Write a MATLAB based program for encoding and decoding of Huffman code
%(variable length source coding)
clc;
clear;
close all;
symbol = [1:5];              % Distinct data symbols appearing in sig
p = [0.1 0.1 0.4 .3 .1];     % Probability of each data symbol
[dict,avglen] = huffmandict(symbol,p)
samplecode = dict{5,2}       % Codeword for fifth signal value
dict{1,:}
dict{2,:}
dict{3,:}
dict{4,:}
dict{5,:}
hcode = huffmanenco(symbol,dict);   % Encode the data
dhsig = huffmandeco(hcode,dict);    % Decode the code
disp('encoded msg:');
disp(hcode);
disp('decoded msg:');
disp(dhsig);
code_length = length(hcode)
Hx = 0;                      % initialize the entropy accumulator
for m = 1:5
    Hx = Hx + p(m)*log2(1/p(m));
end
disp('Hx=');
disp(Hx);
Efficiency = (Hx/avglen)*100
Output:
dict =
[1] [1x4 double]
[2] [1x4 double]
[3] [ 1]
[4] [1x2 double]
[5] [1x3 double]
avglen =
2.1000
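As a check, the average length follows from the codeword lengths shown in dict: 0.4(1) + 0.3(2) + 0.1(3) + 0.1(4) + 0.1(4) = 2.1 bits/symbol, while the source entropy is H(X) = 2.0464 bits/symbol, giving an efficiency of about 97.4%.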
Output:
msg =
TEECT
dict =
'T' [1x2 double]
'E' [ 1]
'C' [1x2 double]
avglen =
1.6000
ans =
T
ans =
0 0
ans =
E
ans =
1
ans =
C
ans =
0 1
encoded msg:
0 0 1 1 0 1 0 0
decoded msg:
'T' 'E' 'E' 'C' 'T'
Expt no. 5
Coding and decoding of linear block codes
Algorithm:
1. Start.
2. Accept the size of the linear block code in terms of n and k.
3. Accept the parity matrix P of size k x (n-k).
4. Generate the generator matrix G = [Ik | P] of size k x n, in which Ik is the identity
matrix.
5. Generate the parity check matrix H = [P^T | I(n-k)] of size (n-k) x n, in which P^T is the
transpose of the P matrix.
6. Generate the message vector M.
7. Generate the code vector by the formula C = MG.
8. Display it.
9. Calculate the Hamming weight of each code word by counting the total number
of ones in the code vector. Display it.
10. Calculate the error detecting capability as td = dmin - 1, where dmin is the minimum Hamming distance.
11. Calculate the error correcting capability as tc = (dmin - 1)/2.
12. Display the parity check matrix H.
13. Calculate the syndrome vector for each error pattern E as S = E.H^T.
14. Compare the syndrome of the received vector with the syndrome of each error pattern;
where they match, the error lies in the corresponding bit of that error pattern. Display the
erroneous bit position. If the syndrome of the received vector is all zeros, display that the
received vector is correct.
Conclusion:
Program:
% Encoding of a (6,3) linear block code
clc;
close all;
n = 6;
k = 3;
p = [0 1 1 ; 1 0 1; 1 1 0];   % Parity matrix P
d = input('enter three bit message=');
ik = eye(k);                  % k x k identity matrix
g = cat(2, ik, p);            % generator matrix G = [Ik | P]
disp('Generator Matrix:');
disp(g);
c1 = mtimes(d, g);            % C = MG
c = mod(c1, 2);               % reduce modulo 2
disp('The codeword for given message is:');
disp(c);
OUTPUT:
enter three bit message=[1 0 0]
Generator Matrix:
1 0 0 0 1 1
0 1 0 1 0 1
0 0 1 1 1 0
The codeword for given message is:
1 0 0 0 1 1
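The listing above implements encoding only (steps 1 to 8 of the algorithm). A minimal sketch of the syndrome computation from steps 12 to 14 for the same (6,3) code, assuming a received vector r is available (variable names are illustrative):
h = cat(2, p', eye(n-k));              % parity check matrix H = [P^T | I(n-k)]
r = input('enter received vector=');   % e.g. [0 0 0 0 1 1]
s = mod(r*h', 2);                      % syndrome S = r*H^T
disp('Syndrome:');
disp(s);
% an all-zero syndrome means r is a valid codeword; for a single-bit
% error, the syndrome equals the corresponding column of H
for j = 1:n
    if isequal(s, h(:,j)')
        fprintf('error in bit %d\n', j);
    end
end
For example, receiving [0 0 0 0 1 1] (the codeword above with its first bit flipped) yields the syndrome [0 1 1], which matches column 1 of H.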
Expt. No. 6
Objective:
Error detection and correction using cyclic codes.
Software Requirement: MATLAB/C
THEORY: (1) Explain coding and decoding of cyclic codes in detail.
Cyclic Code:
Cyclic codes are a sub-class of linear block codes. They have the property that a cyclic shift
of one code word produces another code word. Suppose there is an n-bit code vector
X = (x_{n-1}, x_{n-2}, ..., x_1, x_0)
where x_{n-1}, x_{n-2}, ..., x_1, x_0 represent the individual bits of the code vector X.
If the code vector is shifted cyclically, another code vector X1 is obtained:
X1 = (x_{n-2}, x_{n-3}, ..., x_1, x_0, x_{n-1})
Algorithm:
1. Start.
2. Get the values of n and k.
3. Get the generator polynomial g(x), i.e. its coefficients, from the user.
4. Get the message vector.
5. Get the message generator matrix.
6. Multiply the message polynomial d(x) by x^(n-k).
7. Divide this term, i.e. x^(n-k) d(x), by g(x).
8. To get the code word polynomial, add x^(n-k) d(x) to the remainder of the division.
9. Display the code word.
10. Generate the error patterns and the corresponding syndromes, and display them.
11. Enter the received code vector.
12. Divide the received code vector polynomial by the generator polynomial.
13. The remainder of the division is the syndrome polynomial.
14. From the syndrome, detect the corresponding error pattern.
15. Stop.
Conclusion:
Program:
% Encoding for a (7,4) cyclic code
clc;
clear;
% Encoding
n = 7; k = 4;
p = [1 1 0 ; 1 1 1; 0 0 1 ; 1 0 1];   % Parity matrix P
d = [1 1 0 1];                        % Message word
ik = eye(k);
g = cat(2, ik, p);                    % generator matrix G = [Ik | P]
disp('Generator Matrix:');
disp(g);
g1 = cyclpoly(n, k, 'max');           % coefficients of a generator polynomial
disp('g1=');
disp(g1);
gp = poly2sym(g1);                    % symbolic form of the generator polynomial
disp('Generator Polynomial:');
disp(gp);
c1 = mtimes(d, g);                    % C = MG
c = mod(c1, 2);
disp('The codeword for given message is:');
disp(c);
OUTPUT:
Generator Matrix:
1 0 0 0 1 1 0
0 1 0 0 1 1 1
0 0 1 0 0 0 1
0 0 0 1 1 0 1
g1 =
1 1 0 1
Generator Polynomial:
x^3 + x^2 + 1
The codeword for given message is:
1 1 0 1 1 0 0
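Alternatively, assuming the Communications Toolbox is available, the built-in encode and decode functions handle cyclic codes directly; a minimal sketch (not part of the original listing):
genpoly = cyclpoly(7, 4);                                   % pick a (7,4) generator polynomial
code = encode([1 1 0 1], 7, 4, 'cyclic/binary', genpoly)    % systematic cyclic encoding
msg  = decode(code, 7, 4, 'cyclic/binary', genpoly)         % recovers [1 1 0 1]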
Expt no.7
Objective:
Error detection and correction using convolutional codes.
Software Requirement: MATLAB/C
THEORY:
Explain convolutional codes in detail.
Convolutional Code:
Convolutional coding is an alternative to block coding. Convolutional codes differ from block
codes in that the encoder contains memory: the encoder output at any given time depends on
present as well as past inputs.
Convolutional codes are commonly specified by three parameters (n, k, m), where n is the
number of output (coded) bits, k is the number of input (message) bits, and m is the memory order.
Algorithm:
1. Start.
2. Get the values of n and k.
3. Get the generator polynomial, i.e. its coefficients, from the user.
4. Get the message vector.
5. Get the message generator matrix.
Program:
% Systematic cyclic encoding via polynomial division
% (note: despite the experiment title, this listing follows the cyclic-code
% procedure of Expt. 6; a convolutional sketch follows the output)
clc;
clear;
k = input('enter the no. of message bits k=');
q = input('given data q=');          % message coefficients, highest power first
fprintf('data polynomial is')
d = poly2sym(q)
n = input('enter the no. of information bits n=');
w = [1, zeros(1, n-k)];              % coefficients of x^(n-k)
x = poly2sym(w);
e = cyclpoly(n, k);                  % generator polynomial coefficients
fprintf('generator polynomial ')
g = poly2sym(fliplr(e))              % cyclpoly returns ascending-order coefficients
z = conv(w, q);                      % x^(n-k) * d(x)
r = poly2sym(z)
% gfdeconv expects ascending-order coefficients over GF(2)
[m, v] = gfdeconv(fliplr(z), e);
fprintf('remainder polynomial')
p = poly2sym(fliplr(v))
b = r + p;                           % codeword c(x) = x^(n-k) d(x) + remainder
fprintf('codeword is=')
c = sym2poly(b)
OUTPUT:
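The listing above follows the cyclic-code algorithm; for convolutional coding proper, a minimal sketch assuming the Communications Toolbox functions poly2trellis, convenc and vitdec (the parameters are illustrative):
trellis = poly2trellis(3, [7 5]);   % rate-1/2, constraint length 3, octal generators (7,5)
msg = [1 0 1 1 0];                  % message bits
coded = convenc(msg, trellis);      % convolutional encoding
decoded = vitdec(coded, trellis, 5, 'trunc', 'hard');   % Viterbi decoding
isequal(msg, decoded)               % returns 1 if the message is recovered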
Expt. No.8
Title: Write a program for coding and decoding of BCH codes.
Objective:
Error detection and correction using BCH codes.
Software Requirement: MATLAB/C
BCH Code:
BCH codes are among the most powerful and widely used random-error-correcting cyclic codes.
These codes were discovered by Hocquenghem in 1959 and independently by Bose
and Ray-Chaudhuri in 1960.
An (n,k) binary BCH code is specified as:
Block length: n = 2^m - 1
Parity check bits: n - k <= m*tc
Minimum distance: dmin >= 2*tc + 1
where m >= 3 is any integer and tc is the number of errors the code is capable of correcting.
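For example, with m = 4 the block length is n = 2^4 - 1 = 15; choosing k = 5 gives n - k = 10 <= 4*3, so the resulting (15,5) code, which is used in the program below, corrects up to tc = 3 errors.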
Procedure:
1. Given the code length n and the error correcting capability tc
Conclusion:
Program:
% Program for encoding and decoding of a BCH code
clc;
clear;
close all;
m = 4;
n = 2^m - 1;                 % codeword length (15)
k = 5;                       % message length
mvec = input('enter msg of length 5=');
%mvec = [1 1 1 0 1];
msg = gf(mvec);              % message as a GF(2) array
disp('message=');
disp(msg);
% Find t, the error correction capability
[genpoly, t] = bchgenpoly(n, k);
disp('Error correction capability=');
disp(t);
% Encode the message
code = bchenc(msg, n, k);
disp('Encoded message=');
disp(code);
% add between 1 and t random bit errors
noisycode = code + randerr(1, n, 1:t);
disp('received codeword =');
disp(noisycode);
% decode the noisy code
[newmsg, err, ccode] = bchdec(noisycode, n, k);
disp('decoded message=');
disp(newmsg);
if msg == newmsg
    disp('message recovered perfectly')
else
    disp('Error in message recovered');
end
Output:
enter msg of length 5=[1 0 0 0 1]
message=
gf object: 1-by-5
Error correction capability=
3
Encoded message=
gf object: 1-by-15
received codeword =
gf object: 1-by-15
decoded message=
gf object: 1-by-5
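The gf summary lines above hide the underlying bit patterns. To display the binary vectors themselves, the elements of a gf array can be read back through its x field (a minimal sketch, assuming the same variables as above):
disp(code.x)      % underlying binary row vector of the encoded word
disp(newmsg.x)    % underlying bits of the decoded message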
Post Requisite
Create an Integrator model using Simulink (MATLAB)
1. Before creating a model, start MATLAB and then start Simulink (for example, enter
simulink at the MATLAB command prompt).
2. From the Simulink Library Browser menu, select File > New > Model.
A Simulink Editor window opens with an empty canvas in the right-hand pane.
3. Select File > Save As. The Save As dialog box opens.
4. In the File name box, enter a name for your model, for example simple_model, then
click Save.
Simulating this model integrates a sine wave signal (yielding a cosine-shaped signal) and then
displays the result, along with the original signal, in a Scope window.
2 Get detailed information about a block: right-click the block, then select Help for the
<block name>. The Help browser opens with the reference page for the block.
3 View block parameters: right-click the block, then select Block Parameters. The block
parameters dialog box opens.
To build a model, begin by copying blocks from the Simulink Library Browser to the
Simulink Editor.
1 In the Simulink Library Browser, select the Sources library.
3 Drag the Sine Wave block to the Simulink Editor. A copy of the Sine Wave block
appears in your model.
1 From the Simulink Editor menu, select Simulation > Model Configuration Parameters.
The Configuration Parameters dialog box opens to the Solver pane.
2 In the Stop time field, enter 20. In the Max step size field, enter 0.2.
3 Click OK.
Run Simulation
After you define the Model Configuration Parameters, you are ready to simulate your
model.
1 From the Simulink Editor menu bar, select Simulation > Run.
The simulation runs, and then stops when it reaches the stop time specified in the
Model Configuration Parameters dialog box.
The Scope window opens and displays the simulation results. The plot shows the sine wave
signal together with the resulting integrated (cosine-shaped) signal.
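The same model can also be assembled programmatically; a minimal sketch using standard Simulink commands (the model name and block connections are illustrative, and the Scope here receives only the integrated signal):
load_system('simulink');                    % make the standard library available
new_system('simple_model2');
open_system('simple_model2');
add_block('simulink/Sources/Sine Wave', 'simple_model2/Sine Wave');
add_block('simulink/Continuous/Integrator', 'simple_model2/Integrator');
add_block('simulink/Sinks/Scope', 'simple_model2/Scope');
add_line('simple_model2', 'Sine Wave/1', 'Integrator/1');
add_line('simple_model2', 'Integrator/1', 'Scope/1');
set_param('simple_model2', 'StopTime', '20', 'MaxStep', '0.2');
sim('simple_model2');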
3 Change the appearance of the display. For example, select white for the display color
and axes background color (icons with a pitcher).
4 Select black for the ticks, labels, and grid colors (icon with a paintbrush).
5 Change signal line colors for the Sine Wave to blue and the Integrator to red. To see
your changes, click OK or Apply.
13. What is the important property of the conditional probability matrix P(yk/xj)?
Ans: The sum of the elements along each row should be equal to 1.
14. What is prefix coding?
Ans: Prefix coding is a variable length coding algorithm. It assigns binary digits to the messages
according to their probabilities of occurrence. In a prefix code, no codeword is the prefix of
any other codeword.
15. State the channel coding theorem for a discrete memoryless channel.
Ans: Given a source of 'M' equally likely messages, with M >> 1, generating
information at a rate R, and a channel with capacity C: if R <= C, there exists a
coding technique such that the output of the source may be transmitted over the
channel with a probability of error in the received message that may be made
arbitrarily small.
16. Explain channel capacity theorem.
Ans: The channel capacity of a discrete memoryless channel is the maximum
average mutual information, where the maximization is taken with respect to the input
probabilities P(xi). For a band-limited AWGN channel this evaluates to
C = B log2(1 + S/N) bits/sec, where B is the channel bandwidth.
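For example, a telephone-grade channel with B = 3 kHz and S/N = 1000 (30 dB) gives C = 3000 x log2(1001), which is approximately 29.9 kbit/s.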
17. Define mutual information.
Ans: The mutual information of a channel is the average amount of information about the
channel input gained by observing the channel output:
I(X;Y) = H(Y) - H(Y/X) or = H(X) - H(X/Y)
18. Define channel capacity.
Ans: The channel capacity of a discrete memoryless channel is the maximum
value of the mutual information I(X;Y), where the maximization is carried out over all
possible input probability distributions {p(xj)}.
19. What is the use of error control coding?
Ans: The main use of error control coding is to reduce the overall probability of error.
Error control coding is also known as channel coding.
20. What is the difference between systematic code and non-systematic code?
Ans: • If the message bits appear unaltered in the codeword, followed by the parity bits, the
code is said to be systematic.
• If the message bits and parity check bits are arranged in mixed order, the code is said to be
non-systematic.
21. What is a Repetition code?
Ans: A single message bit is encoded into a block of 'n' identical bits, producing an
(n, 1) block code. There are only two code words in the code: the all-zero code word
and the all-one code word.
44. For M equally likely messages, M>>1, if the rate of information R ≤ C, the
probability of error is ____.
Ans: very small (it can be made arbitrarily small)
45. The code rate r, in terms of k information bits and n total bits, is defined as
Ans: r = k/n
46. The information rate R for an analog signal band limited to B Hz, with average
information H = 2.0 bits per sample, is
Ans: R = 2B x 2.0 = 4B bits/s (the signal is sampled at the Nyquist rate of 2B samples/s)
47. The expected information contained in a message is called
Ans: Entropy
48. The capacity of a Gaussian channel is
Ans: C = B log2(1 + S/N) bits/s
49. According to the Shannon-Hartley theorem,
Ans: the channel capacity does not become infinite even with infinite bandwidth, and
there is a tradeoff between bandwidth and signal-to-noise ratio.
50. The negative statement of Shannon's theorem states that
Ans: if R > C, the error probability increases towards unity.