
Page 1: Turbo Codes

Turbo Codes

Sadeeq Ali Mohammad

Page 2: Turbo Codes

Agenda

Project objectives and motivations
Error Correction Codes
Turbo Codes Technology
Turbo Decoding
Turbo Codes Performance
Turbo Coding Applications
Concluding Remarks

Page 3: Turbo Codes

Introduction

Motivation
Can we have communication that is as error-free as possible?
Can we reach the Shannon limit?

Objectives
Studying channel coding
Understanding channel capacity
Ways to increase the data rate
Providing a reliable communication link

Page 4: Turbo Codes

Turbo Codes History

IEEE International Conference on Communications, 1993, Geneva

Berrou, Glavieux, and Thitimajshima: "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo-Codes"

Provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible

Page 5: Turbo Codes

Turbo Codes History…

Double the data throughput at a given power, or work with half the power
The two men were unknown; "most thought they had made an error in their calculations"
It turned out to be true
Many companies adopted the codes; new companies started: TurboConcept and iCoding
Within 0.5 dB of the Shannon limit at Pe = 10^-6

Page 6: Turbo Codes

Communication System

Structured, modular approach: various components with defined functions

[Block diagram of a communication system, from send to receive: formatting/digitization, source coding, channel coding, multiplexing, modulation, and access techniques]

Page 7: Turbo Codes

Channel Coding

Accounting for the channel. Can be categorized into:

Waveform/signal design: better detectable signals
Structured sequences: added redundancy

Objective: provide coded signals with better distance properties

Page 8: Turbo Codes

Binary Symmetric Channel

Special case of the DMC: discrete input and discrete output, where both the input and the output alphabet are {0, 1}

Memoryless: each symbol is affected independently

Hard-decision decoding; p is related to the bit energy

[Transition diagram: input 0 is received as 0 with probability 1 - p and as 1 with probability p; input 1 is received as 1 with probability 1 - p and as 0 with probability p]
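
As a minimal sketch of this channel model in Python (the function name and the example value of p are illustrative, not from the slides):

import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently
    with crossover probability p."""
    return [b ^ (1 if random.random() < p else 0) for b in bits]

sent = [0, 1, 1, 0, 1, 0, 0, 1]
received = bsc(sent, p=0.1)  # on average, about 10% of the bits are flipped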

Page 9: Turbo Codes

Gaussian Channel

Discrete inputs with a continuous output

Noise gets added to the signals passing through the channel

The noise is a Gaussian random variable with zero mean and variance σ²

The resulting pdf, the likelihood of u_k, is

p(z \mid u_k) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(z - u_k)^2}{2\sigma^2}\right)
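
A direct transcription of this density (the function name is mine):

import math

def likelihood(z, u_k, sigma):
    """Gaussian likelihood p(z | u_k) for an AWGN channel with
    zero-mean noise of variance sigma**2."""
    return math.exp(-((z - u_k) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))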

Page 10: Turbo Codes

Why use ECC

Consider the following trade-offs:

Error performance vs. bandwidth: high redundancy consumes bandwidth
Power vs. bandwidth: coding allows a reduction in Eb/N0
Data rate vs. bandwidth: coding allows a higher data rate

Page 11: Turbo Codes

Shannon Theory

Shannon started information theory
Stated the maximum data rate of a channel for a given error rate and power
Did not say how to achieve it!
Clue: large data words, in terms of number of bits, with good distance properties
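
For reference, the capacity limit the slide refers to is the Shannon-Hartley formula (B is the channel bandwidth, S/N the signal-to-noise ratio):

C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/s}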

Page 12: Turbo Codes

Error Correction Mechanisms

Backward error correction: error-detection capability only; communication cost of retransmissions; problematic for real-time traffic

Forward error correction: detection and correction of errors; more complex receivers; DSP cost

Page 13: Turbo Codes

Forward Error Correction

Block codes: data split into blocks; the checks are contained within the block

Convolutional codes: bit-streamed data; involve memory

Turbo codes: use convolutional codes; special properties

Page 14: Turbo Codes

Structured Redundancy

[Diagram: a channel encoder maps a k-bit input word to an n-bit output codeword; a stream of codewords forms the code sequence]

Redundancy = n - k
Code rate = k/n

For example, a (7, 4) block code carries 3 redundant bits per codeword and has rate 4/7.

Page 15: Turbo Codes

Coding advantages

[Plot: error probability Pb vs. Eb/N0 (dB), with an uncoded and a coded curve shown between 10^-3 and 10^-8; the horizontal distance between the two curves at a given error probability is the coding gain]

Page 16: Turbo Codes

Coding disadvantages

More bandwidth due to the redundant bits
Processing delay
Design complexity

Page 17: Turbo Codes

Error Correction

Codewords are points in hyperspace
Noise can alter some bits: a displacement of the point
If two codewords are close to each other and an error displaces one onto the other, a decoding error occurs
Keep large distances between codewords, but mind the decoder complexity!

Page 18: Turbo Codes

Hyperspace and Codewords

[Diagram: codewords as points in hyperspace, separated by their Hamming distance; a received point within the sphere around a codeword decodes back to the same word]
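
A minimal illustration of the distance measure (the function name is mine):

def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

hamming_distance([1, 0, 1, 1], [1, 1, 1, 0])  # -> 2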

Page 19: Turbo Codes

Good Codes

Shannon's clue points to random codes
With 1000 bits per word there are about 10^301 possible words, an astronomical number
No way to handle this with conventional coding schemes

Page 20: Turbo Codes

Turbo codes

30 years earlier, Forney: nonsystematic, nonrecursive combinations of convolutional encoders

Berrou et al., 1993:
Recursive systematic encoders
Based on pseudo-random interleaving
Work better at high rates or high levels of noise
Return-to-zero sequences

Page 21: Turbo Codes

Turbo Encoder

[Diagram: the input feeds one RSC encoder directly and a second RSC encoder through a pseudo-random interleaver; the systematic codeword consists of the input X plus the two parity streams Y1 and Y2]
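
A minimal sketch of this structure, assuming a memory-2 RSC with feedback polynomial 1 + D + D^2 and feedforward polynomial 1 + D^2 (the slides do not specify the generators):

import random

def rsc_parity(bits):
    """Memory-2 recursive systematic convolutional encoder:
    feedback 1 + D + D^2, feedforward 1 + D^2 (assumed generators)."""
    s1 = s2 = 0
    parity = []
    for b in bits:
        fb = b ^ s1 ^ s2           # recursive feedback bit
        parity.append(fb ^ s2)     # feedforward taps 1 + D^2
        s1, s2 = fb, s1
    return parity

def turbo_encode(bits, perm):
    x = list(bits)                         # systematic part X
    y1 = rsc_parity(x)                     # parity Y1, natural order
    y2 = rsc_parity([x[i] for i in perm])  # parity Y2, interleaved order
    return x, y1, y2

msg = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(msg)))
random.seed(0)
random.shuffle(perm)                 # pseudo-random interleaver
x, y1, y2 = turbo_encode(msg, perm)  # rate-1/3 output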

Page 22: Turbo Codes

Turbo codes

Parallel concatenation: the k-bit block is encoded N times, with different versions (orders)
The probability that the sequence remains RTZ is 1/2^(Nv)
The randomness introduced with 2 encoders gives an error probability around 10^-5
The permutations are there to fix d_min

Page 23: Turbo Codes

Recursive Systematic Coders

[Diagram: the data stream is copied out in natural order (the systematic part), while a recursive shift register with states S1, S2, S3 calculates the parity bits]

Page 24: Turbo Codes

Return to zero sequences

A non-recursive encoder's state goes to zero after v consecutive '0' inputs.

An RSC goes to zero with probability P = 1/2^v.

If one wants to transform a convolutional code into a block code, this is automatically built in.

The initial state i will repeat after encoding k bits.

Page 25: Turbo Codes

Convolutional Encoders

Input stream

Modulo-2 adder

Modulo-2 adder

Output serialized

stream4 stage Shift
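
A minimal sketch of such a feedforward encoder; the tap positions are illustrative, since the diagram does not specify them:

def conv_encode(bits, taps_a=(0, 1, 2, 3), taps_b=(0, 3)):
    """Nonrecursive convolutional encoder: a 4-stage shift register
    with two modulo-2 adders, outputs serialized (rate 1/2)."""
    reg = [0, 0, 0, 0]
    out = []
    for b in bits:
        reg = [b] + reg[:-1]   # shift the new bit in
        va = 0
        for t in taps_a:
            va ^= reg[t]       # first modulo-2 adder
        vb = 0
        for t in taps_b:
            vb ^= reg[t]       # second modulo-2 adder
        out += [va, vb]        # serialize the two adder outputs
    return out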

Page 26: Turbo Codes

Turbo Decoding

Page 27: Turbo Codes

Turbo Decoding

Criterion: for n probabilistic processors working together to estimate common symbols, all of them should agree on the symbols, with the same probabilities a single (optimal) decoder would produce

Page 28: Turbo Codes

Turbo Decoder

Page 29: Turbo Codes

Turbo Decoder

The inputs to the decoders are the log-likelihood ratios (LLRs) of the individual symbols d.

The LLR of the symbol d is defined (Berrou) as L(d) = log [ P(d = 1 | observation) / P(d = 0 | observation) ].

Page 30: Turbo Codes

Turbo Decoder

The SISO decoder re-evaluates the LLR, exploiting the local Y1 or Y2 redundancy to improve its confidence.

The value z is the extrinsic value determined by the same decoder; it is negative if d is 0 and positive if d is 1.

The updated LLR is fed into the other decoder, which calculates its own z and updates the LLR, and so on for several iterations.

After several iterations, both decoders converge to a value for that symbol.

Page 31: Turbo Codes

Turbo Decoding

Assume Ui is a modulating bit in {0, 1}, and Yi is the received value, the output of a correlator; it can take any value (soft).

The turbo decoder input is the log-likelihood ratio R(ui) = log [ P(Yi | Ui = 1) / P(Yi | Ui = 0) ]. For BPSK, R(ui) = 2 Yi / σ².

For each data bit, calculate the LLR given that a sequence of bits was sent.

Page 32: Turbo Codes

Turbo Decoding

Compare the LLR output to see whether the estimate leans towards 0 or 1, then take the hard decision (HD).
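
A minimal end-to-end sketch of these two steps, assuming the BPSK mapping 0 → -1, 1 → +1 and the channel LLR formula above (names and parameter values are mine):

import random

def bpsk_llr(y, sigma):
    """Channel LLR of a BPSK symbol over AWGN: R(u) = 2*y / sigma^2."""
    return 2.0 * y / sigma ** 2

random.seed(1)
sigma = 0.8
bits = [1, 0, 1, 1, 0]
received = [(1.0 if b else -1.0) + random.gauss(0.0, sigma) for b in bits]
llrs = [bpsk_llr(y, sigma) for y in received]
hard = [1 if l > 0 else 0 for l in llrs]  # positive LLR leans towards 1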

Page 33: Turbo Codes

Soft in/ Soft out processor

At the heart of the decoder. It represents all possible states of an encoder (a trellis).

The number of states at a particular clock is 2^n, where n is the number of flip-flops in the shift register.

The trellis shows the current state and the possible paths leading to this state.

Page 34: Turbo Codes

SISO

Label all branches with a branch metric, a function of the processor inputs. Obtain the LLR for each data bit by traversing the trellis.

Two algorithms:
Soft-Output Viterbi Algorithm (SOVA)
Maximum A Posteriori (MAP), and its Log-MAP variant

Page 35: Turbo Codes

How Do They Work (© IEEE Spectrum)

Page 36: Turbo Codes

How Do They Work (© IEEE Spectrum)

Page 37: Turbo Codes

Turbo Codes Performance

Page 38: Turbo Codes

Turbo Codes Applications

Deep-space exploration:
ESA's SMART-1 probe
JPL equipped Pathfinder, 1997

Mobile 3G systems:
In use in Japan: UMTS, NTT DoCoMo
Turbo codes for pictures/video/mail; convolutional codes for voice

Page 39: Turbo Codes

Conclusion: End of the Search

Turbo codes approach the theoretical limit with only a small gap

They gave rise to new codes: Low-Density Parity-Check (LDPC) codes

Decoding delay still needs improvement