Agenda
Project objectives and motivations
Error Correction Codes
Turbo Codes Technology
Turbo Decoding
Turbo Codes Performance
Turbo Coding Applications
Concluding Remarks
Introduction
Motivation
Can we achieve communication that is as close to error-free as possible? Can we reach the Shannon limit?
Studying channel coding
Understanding channel capacity
Ways to increase data rate
Providing a reliable communication link
Objectives
IEEE International Communications Conference, 1993, Geneva: Berrou and Glavieux, "Near Shannon Limit Error-Correcting Coding: Turbo Codes". Provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible.
Double the data throughput at a given power, or work with half the power. The two men were unknown; most thought their calculations were wrong. It was then realized the results were true. Many companies adopted turbo codes, and new companies started: TurboConcept and iCoding. Within 0.5 dB of the Shannon limit at Pe = 10^-6.
Communication System
Formatting Digitization
Source Coding
Channel Coding
Multiplexing
Modulation
Access techniques
send receive
Channel Coding
Structured sequences
Special case of DMC: discrete input and discrete output, where both input and output are {0, 1}. Memoryless: each symbol is affected independently. Hard-decision decoding. The crossover probability p is related to the bit energy. (Transition diagram: 0 -> 0 and 1 -> 1 with probability 1 - p; 0 -> 1 and 1 -> 0 with probability p.)
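The binary symmetric channel described above can be sketched in a few lines: each bit is flipped independently with crossover probability p (the function name `bsc` is my own, used only for illustration).

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Binary symmetric channel: flip each bit independently
    with crossover probability p (memoryless)."""
    return [b ^ (rng.random() < p) for b in bits]

sent = [0, 1, 1, 0, 1, 0, 0, 1]
received = bsc(sent, p=0.1)
errors = sum(s != r for s, r in zip(sent, received))
```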
Gaussian Channel
Discrete inputs with a continuous output. Noise is added to the signals passing through the channel. The noise is a Gaussian random variable with zero mean and variance σ². The resulting pdf is

p(z | u_k) = 1/(σ√(2π)) · exp( −(z − u_k)² / (2σ²) )
Likelihood of uk
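The Gaussian likelihood above translates directly into code; this is a minimal sketch (function name `likelihood` is my own):

```python
import math

def likelihood(z, u, sigma):
    """Gaussian channel likelihood p(z | u): probability density of
    receiving z given that level u was sent, noise variance sigma**2."""
    return math.exp(-(z - u)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# the density is symmetric around the transmitted level u
p_close = likelihood(0.9, 1.0, 0.5)
p_far = likelihood(-1.0, 1.0, 0.5)
```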
Reduction in Eb/N0
Higher rate
Shannon Theory
Shannon started information theory and stated the maximum data rate of a channel.
He did not say how to achieve it! Clue: large data words (in number of bits) with good distance properties.
Block Codes
Convolutional code
Turbo codes
Structured Redundancy
Channel encoder
codeword
Redundancy = n − k
Code rate = k/n
Code sequence
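The redundancy and code-rate definitions above can be checked with a toy helper (the function name `code_parameters` and the (7, 4) Hamming example are mine, for illustration only):

```python
def code_parameters(n, k):
    """For an (n, k) block code: redundancy = n - k parity bits,
    code rate = k / n."""
    return n - k, k / n

# e.g. a (7, 4) Hamming code: 3 parity bits, rate 4/7
redundancy, rate = code_parameters(7, 4)
```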
Coding advantages
(Figure: bit error probability Pb vs Eb/N0 in dB for coded and uncoded transmission; the horizontal separation between the curves, e.g. between Pb = 10^-3 and 10^-8, is the coding gain.)
Coding disadvantages
Error Correction
Codewords are points in hyperspace. Noise can alter some bits, displacing the point. If two codewords are close to each other and an error displaces one onto the other, a decoding error occurs. Keep large differences between codewords. But watch decoder complexity!
Hamming distance: the number of positions in which two words differ. Distance 0 means the same word.
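A minimal sketch of the Hamming distance between two equal-length codewords (function name mine):

```python
def hamming_distance(a, b):
    """Number of bit positions in which two equal-length codewords
    differ; 0 means the two words are identical."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))
```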
Good Codes
Shannon's clue points to random codes. But with 1000 bits per word there are 2^1000 ≈ 10^301 possible words, an astronomical number. No way to decode with conventional coding schemes.
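The "astronomical number" claim is easy to verify: 2^1000 has about 301 decimal digits.

```python
# 1000-bit words give 2**1000 possible words; counting decimal digits
# confirms the 10**301 order of magnitude quoted above.
n_words = 2 ** 1000
digits = len(str(n_words))   # 302 digits, i.e. n_words is about 10**301
```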
Turbo codes
Berrou et al., 1993
Turbo Encoder
(Encoder diagram: the input block X is sent systematically; one RSC encoder produces parity Y1 from X; an interleaver permutes X before a second RSC encoder, which produces parity Y2.)
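The parallel structure in the diagram above can be sketched as follows. This is a toy example, not a specific standard: I assume a memory-2 RSC with octal generators (1, 5/7), and the function names are mine.

```python
def rsc_parity(bits):
    """Parity stream of a memory-2 recursive systematic convolutional
    (RSC) encoder with generators (1, 5/7) in octal -- a toy choice."""
    s1 = s2 = 0
    out = []
    for b in bits:
        fb = b ^ s1 ^ s2          # feedback polynomial 1 + D + D^2
        out.append(fb ^ s2)       # feedforward polynomial 1 + D^2
        s1, s2 = fb, s1
    return out

def turbo_encode(bits, perm):
    """Parallel concatenation: systematic X, parity Y1 from the RSC,
    parity Y2 from the same RSC fed the interleaved block."""
    x = list(bits)
    y1 = rsc_parity(x)
    y2 = rsc_parity([x[i] for i in perm])   # perm = interleaver order
    return x, y1, y2
```

With the systematic stream and both parity streams transmitted, the overall rate is 1/3 before any puncturing.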
Turbo codes
Parallel concatenated
The k-bit block is encoded N times with different versions (orders) of the sequence. The probability that the sequence remains RTZ (return to zero) in all encoders is 1/2^(Nν). With 2 encoders this randomness gives an error probability of about 10^-5. The permutations are designed to fix d_min.
A non-recursive encoder's state goes to zero after ν input zeros. An RSC encoder goes to zero with probability P = 1/2^ν. If one wants to transform the convolutional code into a block code, this is automatically built in: the initial state i will repeat after encoding the k bits.
Convolutional Encoders
Modulo-2 adder
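A convolutional encoder is just a shift register feeding modulo-2 adders (XOR gates). A minimal sketch, assuming a rate-1/2 non-recursive encoder with the common octal generators (7, 5); the function name is mine:

```python
def conv_encode(bits):
    """Rate-1/2 non-recursive convolutional encoder, generators
    (7, 5) octal: each output bit is a modulo-2 sum (XOR) of taps
    on a 2-stage shift register."""
    sr = [0, 0]                          # shift-register flip-flops
    out = []
    for b in bits:
        out.append(b ^ sr[0] ^ sr[1])    # g1 = 1 + D + D^2 (octal 7)
        out.append(b ^ sr[1])            # g2 = 1 + D^2     (octal 5)
        sr = [b, sr[0]]
    return out

# impulse response of this encoder: 11 10 11
response = conv_encode([1, 0, 0])
```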
Turbo Decoding
Criterion
For n probabilistic processors working together to estimate common symbols, all of them should ultimately agree on the symbols with the same probabilities as a single (global) decoder would.
Turbo Decoder
The inputs to the decoders are the log-likelihood ratios (LLRs) of the individual symbols d. The LLR of a symbol d is defined (Berrou) as the logarithm of the ratio of the probability that d = 1 to the probability that d = 0.
Turbo Decoder
The SISO decoder re-evaluates the LLR, utilizing the local redundancies Y1 and Y2 to improve the confidence. The value z is the extrinsic value determined by the same decoder: it is positive if d is 1 and negative if d is 0. The updated LLR is fed into the other decoder, which calculates its own z and updates the LLR. After several iterations, both decoders converge to a value for that symbol.
Turbo Decoding
Assume
Ui: modulating bit {0,1}. Yi: received value, output of a correlator; it can take any value (soft). The turbo decoder input is the log-likelihood ratio.
For each data bit, calculate the LLR given that a sequence of bit were sent
R(ui) = log [ P(Yi | Ui = 1) / P(Yi | Ui = 0) ]. For BPSK, R(ui) = 2Yi/σ².
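The BPSK simplification above can be verified numerically. A minimal sketch, assuming the mapping bit 0 -> −1, bit 1 -> +1 over an AWGN channel (function names mine):

```python
import math

def llr_bpsk(y, sigma):
    """Channel LLR for BPSK over AWGN with levels -1 (bit 0) and
    +1 (bit 1): R(u) = log p(y|u=1)/p(y|u=0) = 2*y / sigma**2."""
    return 2 * y / sigma**2

def llr_from_pdfs(y, sigma):
    """The same ratio computed directly from the two Gaussian
    likelihoods centred at +1 and -1."""
    pdf = lambda z, mean: math.exp(-(z - mean)**2 / (2 * sigma**2))
    return math.log(pdf(y, +1) / pdf(y, -1))

# hard decision (HD): decide 1 if the LLR is positive, else 0
decision = 1 if llr_bpsk(0.7, 0.9) > 0 else 0
```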
Turbo Decoding
Compare the LLR output to see whether the estimate leans towards 0 or 1, then take the hard decision (HD).
At the heart of the decoder is the trellis, which represents all possible states of the encoder. The number of states at a particular clock tick is 2^n, where n = number of flip-flops in the shift register. The trellis shows:
SISO
Label all branches with a branch metric (a function of the processor inputs). Obtain the LLR for each data bit by traversing the trellis. Two algorithms:
Log MAP
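The trellis the SISO algorithms traverse can be enumerated mechanically; this sketch (function name mine) lists every (state, input) branch of a feed-forward shift register, giving the 2^n states mentioned above:

```python
from itertools import product

def build_trellis(memory=2):
    """Trellis of a feed-forward shift register with 2**memory
    states: maps each (state, input bit) branch to the next state.
    States are tuples of flip-flop contents, newest bit first."""
    trellis = {}
    for state in product([0, 1], repeat=memory):
        for b in (0, 1):
            trellis[(state, b)] = (b,) + state[:-1]
    return trellis

trellis = build_trellis(2)   # 2**2 = 4 states, 2 branches each
```

Branch metrics would then be attached to these (state, input) pairs before running Log-MAP over the structure.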
Turbo codes achieved the theoretical limits with a small gap. They gave rise to new codes: Low-Density Parity-Check (LDPC) codes. Need