
Turbo Codes

Sadeeq ali Mohammad

Agenda

Project objectives and motivation
Error correction codes
Turbo codes technology
Turbo decoding
Turbo codes performance
Turbo coding applications
Concluding remarks

Introduction

Motivation

Can we have error-free communication, or come as close as possible? Can we reach the Shannon limit?

Objectives

Studying channel coding
Understanding channel capacity
Ways to increase the data rate
Providing a reliable communication link

Turbo Codes History


IEEE International Communications Conference, Geneva, 1993: Berrou and Glavieux present "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes". The scheme provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible.

Turbo Codes History

Double the data throughput at a given power, or operate at half the power. The two authors were unknown; most experts assumed their calculations were wrong. The results turned out to be true. Many companies adopted turbo codes, and new companies were founded: TurboConcept and iCoding. Performance: 0.5 dB from the Shannon limit at Pe = 10^-6.

Communication System

A structured, modular approach: various components with defined functions.

Formatting / Digitization

Source Coding

Channel Coding

Multiplexing

Modulation

Access techniques

Send / Receive

Channel Coding

Accounts for the channel. Can be categorized into:

Waveform signal design: better-detectable signals.

Structured sequences: added redundancy. Objective: provide coded signals with better distance properties.

Binary Symmetric Channel

A special case of the DMC: discrete input and discrete output, both from {0,1}. Memoryless: each symbol is affected independently. Hard-decision decoding. The crossover probability p is related to the bit energy: a bit is received correctly with probability 1-p and flipped with probability p. [Figure: BSC transition diagram]
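The BSC described above can be simulated in a few lines. This is a minimal sketch: the crossover probability p = 0.1 and the block length are arbitrary illustration values.

```python
import random

def bsc(bits, p, seed=0):
    """Binary symmetric channel: each bit is flipped independently
    with crossover probability p (memoryless, hard decisions)."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

tx = [0, 1, 1, 0, 1, 0, 0, 1] * 1000
rx = bsc(tx, p=0.1)
errors = sum(a != b for a, b in zip(tx, rx))
print(errors / len(tx))  # empirical flip rate, close to 0.1
```

The empirical error rate converges to p as the block grows, which is exactly the "each symbol is affected independently" property.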

Gaussian Channel

Discrete inputs with a continuous output. Noise is added to the signals passing through the channel. The noise is a Gaussian random variable with zero mean and variance σ². The resulting pdf is

p(z | u_k) = 1/(σ√(2π)) · exp( -(z - u_k)² / (2σ²) )

the likelihood of u_k.
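The likelihood above is a plain Gaussian density centered on the transmitted level. A small sketch (the received value z = 0.4 and σ = 1 are illustration values, with BPSK levels ±1 assumed):

```python
import math

def gaussian_likelihood(z, u, sigma):
    """p(z | u): likelihood of receiving z when level u was sent over
    AWGN with zero-mean Gaussian noise of variance sigma^2."""
    return math.exp(-(z - u) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# A soft decoder compares the likelihoods of the candidate levels:
z, sigma = 0.4, 1.0
print(gaussian_likelihood(z, +1.0, sigma) > gaussian_likelihood(z, -1.0, sigma))  # True
```

A positive received value is more likely under u = +1 than u = -1, which is the information a soft-decision decoder exploits.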

Why use ECC

Consider the following trade-offs:

Error performance vs. bandwidth: high redundancy consumes bandwidth.

Power vs. bandwidth: coding allows a reduction in Eb/N0.

Data rate vs. bandwidth: coding can support a higher data rate.

Shannon Theory

Shannon founded information theory and stated the maximum data rate of a channel for a given error rate and power. He did not say how to reach it! The clue: large codewords, in terms of number of bits, with good distance properties.
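The maximum data rate Shannon stated is the Shannon-Hartley capacity, C = B·log2(1 + S/N). A quick sketch (the 1 MHz bandwidth and 15 dB SNR are illustration values):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 1 MHz channel at 15 dB SNR:
snr = 10 ** (15 / 10)          # convert dB to a linear ratio
print(shannon_capacity(1e6, snr))  # about 5.03 Mbit/s
```

Capacity is the limit the rest of the talk measures turbo codes against; the theorem guarantees arbitrarily low error rates below C but gives no construction.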

Error Correction Mechanisms

Backward error correction

Error detection capability only. Communication cost (retransmissions). Problematic for real-time traffic.

Forward error correction

Detection and correction of errors. More complex receivers. DSP cost.


Forward Error Correction

Block codes

Data split into blocks. Checks are within the block.

Convolutional codes

Bit-streamed data. Involves memory.

Turbo codes

Use convolutional codes. Special properties.

Structured Redundency

Input word (k bits) → channel encoder → output word (n bits): the codeword (code sequence).

Redundancy = (n - k)
Code rate = k/n
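The parameters above, plus the simplest possible example of structured redundancy (a rate-1/3 repetition encoder, chosen purely for illustration), can be sketched as:

```python
def code_parameters(k, n):
    """Redundancy and code rate for an (n, k) code."""
    return {"redundancy": n - k, "rate": k / n}

def repeat3_encode(bits):
    """Toy rate-1/3 repetition encoder: each data bit is sent 3 times."""
    return [b for b in bits for _ in range(3)]

print(code_parameters(4, 7))   # redundancy 3, rate 4/7 (e.g. a (7,4) block code)
print(repeat3_encode([1, 0]))  # [1, 1, 1, 0, 0, 0]
```

Repetition is a valid but weak structure; the rest of the talk is about structures (convolutional, turbo) that buy far more distance per redundant bit.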

Coding advantages
[Figure: bit error probability vs. Eb/N0 (dB), with and without coding; the horizontal gap between the curves at a given error probability (e.g. 10^-3 or 10^-8) is the coding gain.]

Coding disadvantages

More bandwidth due to redundancy. Processing delay. Design complexity.

Error Correction

Codewords are points in hyperspace. Noise can alter some bits: a displacement of the point. If two codewords are close to each other and an error moves one onto the other, a decoding error occurs. Keep large distances between codewords, but beware decoder complexity!

Hyperspace and Codewords

[Figure: codewords as points in hyperspace; the Hamming distance between two words; zero distance means the same word.]
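The distance notion behind these pictures is Hamming distance, the number of bit positions in which two equal-length words differ:

```python
def hamming_distance(a, b):
    """Number of positions where two equal-length codewords differ;
    distance 0 means they are the same word."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance([1, 0, 1, 1], [1, 1, 1, 0]))  # 2
print(hamming_distance([1, 0, 1], [1, 0, 1]))        # 0 (same word)
```

A code whose minimum pairwise distance is d_min can correct up to floor((d_min - 1)/2) bit errors, which is why large distances matter.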

Good Codes

Good codes look random. If we set 1000 bits per word, there are 2^1000 ≈ 10^301 possible words, an astronomical number. No way to handle this with conventional coding schemes.

Turbo codes

30 years earlier, Forney: nonsystematic, nonrecursive combinations of convolutional encoders.

Berrou et al., 1993: recursive systematic encoders based on pseudo-random interleaving. Works better at high rates or high levels of noise. Return-to-zero sequences.

Turbo Encoder
Input → X (systematic part: a copy of the data)
Input → RSC encoder 1 → parity Y1
Input → pseudo-random interleaver → RSC encoder 2 → parity Y2
The codeword is (X, Y1, Y2): a systematic code.
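The encoder structure on this slide can be sketched in a few lines. This is a minimal illustration, not the parameters of the 1993 design: the RSC generator polynomials (7, 5 in octal: feedback 1+D+D², feedforward 1+D²) and the fixed permutation are assumptions, and trellis termination is omitted.

```python
def rsc_encode(bits):
    """Rate-1/2 recursive systematic convolutional encoder, memory 2.
    Returns (systematic copy, parity). Feedback 1+D+D^2, feedforward 1+D^2."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2   # recursive (feedback) bit
        p = a ^ s2        # parity from the feedforward taps
        parity.append(p)
        s2, s1 = s1, a    # shift register update
    return bits[:], parity

def turbo_encode(bits, perm):
    """Parallel concatenation: systematic X, parity Y1 from the natural
    order, parity Y2 from the interleaved order."""
    x, y1 = rsc_encode(bits)
    _, y2 = rsc_encode([bits[i] for i in perm])
    return x, y1, y2

x, y1, y2 = turbo_encode([1, 0, 1, 1], perm=[2, 0, 3, 1])
print(x, y1, y2)
```

Without puncturing, transmitting (X, Y1, Y2) gives an overall rate of 1/3; puncturing the parity streams raises the rate.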

Turbo codes

Parallel concatenated

The k-bit block is encoded N times with different orderings (permutations). The probability that the sequence remains RTZ is 1/2^(Nv). The randomness introduced with 2 encoders gives an error probability around 10^-5. The permutations are designed to improve d_min.

Recursive Systematic Coders


Systematic: a copy of the data in natural order.

Recursive: the parity bits are calculated from a shift register (stages S1, S2, S3) with feedback from the data stream.

Return to zero sequences

A non-recursive encoder's state goes to zero after v zeros. An RSC encoder returns to zero with probability 1/2^v. If one wants to transform the convolutional code into a block code, this behaviour is automatically built in: the initial state i will repeat after encoding k bits.

Convolutional Encoders

[Figure: convolutional encoder. The input stream feeds a 4-stage shift register; two modulo-2 adders tap selected stages; the adder outputs are serialized into the output stream.]
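The shift-register-and-adders picture can be sketched as follows. The two tap polynomials are illustration values, not taken from the slide; only the 4-stage register matches the figure.

```python
def conv_encode(bits, gens=(0b1111, 0b1101)):
    """Feedforward (non-recursive) convolutional encoder: a 4-stage shift
    register tapped by two modulo-2 adders, outputs serialized per input bit."""
    K = 4                      # number of shift-register stages (from the figure)
    reg = 0
    out = []
    for b in bits:
        reg = ((reg << 1) | b) & ((1 << K) - 1)  # shift the new bit in
        for g in gens:                            # one output per adder
            out.append(bin(reg & g).count("1") % 2)
    return out

print(conv_encode([1, 0, 1]))  # [1, 1, 1, 0, 0, 0]
```

Each input bit produces len(gens) output bits, so this sketch is a rate-1/2 code; the choice of tap polynomials determines the code's distance properties.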

Turbo Decoding

Turbo Decoding

Criterion

For n probabilistic processors working together to estimate common symbols, all of them should agree on the symbols, with the same probabilities that a single (optimal) decoder would produce.

Turbo Decoder

Turbo Decoder

The inputs to the decoders are log-likelihood ratios (LLRs) for the individual symbols d. The LLR of a symbol d is defined (Berrou) as the logarithm of the ratio of the probability that d = 1 to the probability that d = 0.

Turbo Decoder

The SISO decoder re-evaluates the LLR, utilizing the local redundancies Y1 and Y2 to improve the confidence. The value z is the extrinsic value determined by the same decoder: it is negative if d is 0 and positive if d is 1. The updated LLR is fed into the other decoder, which calculates its own z and updates the LLR; this repeats for several iterations, after which both decoders converge to a value for that symbol.
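The exchange of extrinsic values can be illustrated with a deliberately simplified toy: here the two RSC constituent codes are replaced by trivial repetition "parities", so each decoder's extrinsic value z is just the channel LLR of its own parity observation. Real turbo decoding computes z with a SISO/MAP pass over a trellis and iterates; this sketch only shows how adding the other decoder's extrinsic information sharpens the decision.

```python
import math, random

def awgn(bits, sigma, rng):
    """BPSK-modulate (0 -> -1, 1 -> +1) and add Gaussian noise."""
    return [(2 * b - 1) + rng.gauss(0.0, sigma) for b in bits]

def llr(y, sigma):
    return 2.0 * y / sigma ** 2  # channel LLR for BPSK over AWGN

rng = random.Random(1)
sigma = 1.0
u = [rng.randrange(2) for _ in range(2000)]
perm = list(range(len(u)))
rng.shuffle(perm)                              # toy pseudo-random interleaver

x  = awgn(u, sigma, rng)                       # systematic observations
y1 = awgn(u, sigma, rng)                       # decoder 1's "parity" (repetition)
y2 = awgn([u[i] for i in perm], sigma, rng)    # decoder 2's "parity", interleaved

ext1 = [llr(v, sigma) for v in y1]             # extrinsic values from decoder 1
ext2 = [0.0] * len(u)                          # decoder 2's, de-interleaved
for k, i in enumerate(perm):
    ext2[i] = llr(y2[k], sigma)

# Final LLR: channel knowledge plus both extrinsic contributions, then a
# hard decision at the zero threshold.
decisions = [1 if llr(x[i], sigma) + ext1[i] + ext2[i] > 0 else 0
             for i in range(len(u))]
ber_coded   = sum(d != b for d, b in zip(decisions, u)) / len(u)
ber_uncoded = sum((x[i] > 0) != u[i] for i in range(len(u))) / len(u)
print(ber_coded < ber_uncoded)  # combining extrinsic information lowers the BER
```

With repetition parities the exchange converges in one pass; the iterative behaviour of real turbo decoding comes from the trellis-based extrinsics depending on the other decoder's output.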

Turbo Decoding

Assume

Ui : modulating bit {0,1} Yi : received bit, output of a correlator. Can take any value (soft). Turbo Decoder input is the log likelihood ratio
For each data bit, calculate the LLR given that a sequence of bit were sent
R(ui) = log [ P(Yi|Ui=1)/(P(Yi|Ui=0)] For BPSK, R(ui) =2 Yi/ (var)2
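The BPSK closed form follows directly from the two Gaussian likelihoods; a quick sketch that also cross-checks it against the explicit log-ratio (the values of y and σ are arbitrary):

```python
def channel_llr(y, sigma):
    """R(u) = log[p(y|u=1)/p(y|u=0)] for BPSK (bit 1 -> +1, bit 0 -> -1)
    over AWGN; the ratio of Gaussian pdfs reduces to 2y/sigma^2."""
    return 2.0 * y / sigma ** 2

# Cross-check against the explicit log-ratio of the two Gaussian exponents:
y, sigma = 0.7, 0.8
direct = (-(y - 1) ** 2 + (y + 1) ** 2) / (2 * sigma ** 2)
print(abs(channel_llr(y, sigma) - direct) < 1e-12)  # True
```

The sign of the LLR carries the tentative decision and its magnitude the confidence, which is exactly what the SISO decoders exchange.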

Turbo Decoding

Compare the LLR output to see whether the estimate leans towards 0 or 1, then take a hard decision.

Soft in/ Soft out processor

At the heart of the decoder. Represents all possible states of an encoder as a trellis. The number of states at a particular clock tick is 2^n, where n is the number of flip-flops in the shift register. The trellis shows:

the current state, and the possible paths leading to this state.

SISO

Label all branches with a branch metric, a function of the processor inputs. Obtain the LLR for each data bit by traversing the trellis. Two algorithms:

Soft-Output Viterbi Algorithm (SOVA)
Maximum A Posteriori (MAP), often implemented as Log-MAP

How Do They Work (IEEE Spectrum)


Turbo Codes Performance

Turbo Codes Applications


Deep-space exploration

The European SMART-1 probe. JPL-equipped Pathfinder, 1997.

Mobile 3G systems

In use in Japan: UMTS, NTT DoCoMo. Turbo codes for pictures/video/mail; convolutional codes for voice.

Conclusion : End of Search

Turbo codes approach the theoretical limits with only a small gap. They gave rise to renewed interest in related codes: Low-Density Parity-Check (LDPC) codes. Still needed:

improvements in decoding delay.
