Convolutional Codes: Representation and Encoding
Many known codes can be modified by adding an extra code symbol or by
deleting a symbol
* Can create codes of almost any desired rate
* Can create codes with slightly improved performance
The resulting code can usually be decoded with only a slight
modification to the decoder algorithm.
Sometimes the modification process can be applied multiple times in
succession.
Error Control Coding, © Brian D. Woerner, reproduced by: Erhan A. INCE
Modification to Known Codes
1. Puncturing: delete a parity symbol; (n,k) code → (n-1,k) code
2. Shortening: delete a message symbol; (n,k) code → (n-1,k-1) code
3. Expurgating: delete some subset of codewords; (n,k) code → (n,k-1) code
4. Extending: add an additional parity symbol; (n,k) code → (n+1,k) code
Modification to Known Codes (continued)
5. Lengthening: add an additional message symbol; (n,k) code → (n+1,k+1) code
6. Augmenting: add a subset of additional codewords; (n,k) code → (n,k+1) code
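The six modifications only change the (n,k) parameters of the code; as a quick sanity check, a minimal sketch (function names are illustrative, not from any library):

```python
# Sketch: how each modification changes the (n, k) parameters of a code.
def puncture(n, k):   return (n - 1, k)      # delete a parity symbol
def shorten(n, k):    return (n - 1, k - 1)  # delete a message symbol
def expurgate(n, k):  return (n, k - 1)      # delete a subset of codewords
def extend(n, k):     return (n + 1, k)      # add a parity symbol
def lengthen(n, k):   return (n + 1, k + 1)  # add a message symbol
def augment(n, k):    return (n, k + 1)      # add extra codewords

# Modifications can be applied in succession, e.g. shorten then extend:
print(extend(*shorten(32, 28)))  # (32, 28) -> (31, 27) -> (32, 27)
```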
Interleaving
We have assumed so far that bit errors are independent from one bit to the next.
In mobile radio, fading makes bursts of errors likely. Interleaving is used to
make these errors approximately independent again.
Depth Of Interleaving
[Figure: block interleaver example with 35 bits, showing "Order Bits
Transmitted" against "Order Bits Received". Bits are written into a 7 x 5
array by rows and read out by columns, so bits 1, 6, 11, 16, 21, 26, 31 are
transmitted in positions 1-7, while bits 5, 10, ..., 30, 35 are transmitted
in positions 29-35.]
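The read-out order in the figure can be reproduced with a simple block interleaver sketch (array dimensions assumed from the figure):

```python
def block_interleave(bits, rows, cols):
    """Write bits into a rows x cols array by rows, read out by columns."""
    assert len(bits) == rows * cols
    grid = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    return [grid[r][c] for c in range(cols) for r in range(rows)]

# 35 bits in a 7 x 5 array: bits 1, 6, 11, 16, 21, 26, 31 go out first,
# so a burst of channel errors is spread across distant data positions.
sent = block_interleave(list(range(1, 36)), rows=7, cols=5)
print(sent[:7])  # [1, 6, 11, 16, 21, 26, 31]
```

De-interleaving at the receiver is the same operation with rows and cols swapped.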
Concatenated Codes
Two levels of coding
Achieves the performance of very long codes while maintaining
lower decoding complexity
The overall rate is the product of the individual code rates
A codeword error occurs only if both codes fail.
The error probability is found by first evaluating the error probability of
the "inner" decoder and then evaluating the error probability of the "outer"
decoder.
Interleaving is always used with concatenated coding
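Since the overall rate is the product of the component rates, the claim is easy to check numerically; the example rates below are those of the CD and Galileo systems described later in these notes:

```python
from math import prod

def overall_rate(rates):
    """Overall rate of a concatenated code = product of component rates."""
    return prod(rates)

# CD: (28,24) outer RS code and (32,28) inner RS code
print(round(overall_rate([24/28, 28/32]), 3))    # 0.75
# Galileo: rate-1/2 convolutional inner code, (255,223) RS outer code
print(round(overall_rate([1/2, 223/255]), 3))    # 0.437
```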
Block Diagram of Concatenated Coding Systems
Data Bits → Outer Encoder → Interleaver → Inner Encoder → Modulator →
Channel → De-Modulator → Inner Decoder → De-Interleaver → Outer Decoder →
Data Out
Practical Application: Coding for CD
•Each channel is sampled at 44,100 samples/second
•Each sample is quantized with 16 bits
Uses a concatenated RS code:
Both codes are constructed over GF(256) (8 bits/symbol)
The outer code is a (28,24) shortened RS code
The inner code is a (32,28) extended RS code
Between the coders is a (28,4) cross-interleaver
The overall code rate is r = (24/28)(28/32) = 0.75
Most commercial CD players don't exploit the full power of the error correction code
Practical Application: Galileo Deep Space Probe
Uses concatenated coding
Inner code: rate 1/2, constraint length 7 convolutional encoder
Outer code: (255,223) RS code over GF(256) – corrects burst errors left by the
convolutional decoder
Overall code rate is r = (1/2)(223/255) ≈ 0.437
A block interleaver held 2 RS codewords
The deep space channel is severely energy limited but not bandwidth limited
IS-95 CDMA
The IS-95 standard employs the (64,6) orthogonal (Walsh) code on the reverse link
The inner Walsh Code is concatenated with a rate 1/3, constraint length 9 convolutional code
Data Transmission in a 3rd Generation PCS
Proposed ETSI standard employs RS Codes concatenated with
convolutional codes for data communication
Requirements: BER on the order of 10^-6;
moderate latency is acceptable
CDMA2000 uses turbo codes for data transmission
ETSI has optional provisions for Turbo Coding
A Common Theme from Coding Theory
The real issue is the complexity of the decoder.
For a binary code, we must match 2^n possible received sequences with codewords
Only a few practical decoding algorithms have been found:
Berlekamp-Massey algorithm for block codes
Viterbi algorithm (and similar techniques) for
convolutional codes
Code designers have focused on finding new codes that work with known algorithms
Block Versus Convolutional Codes
Block codes take k input bits and produce n output bits, where k and n are large
there is no data dependency between blocks
useful for data communications
Convolutional codes take a small number of input bits and produce a
small number of output bits each time period
data passes through convolutional codes in a continuous stream
useful for low-latency communications
Convolutional Codes
k bits are input, n bits are output
Now k & n are very small (usually k=1-3, n=2-6)
The output depends not only on the current set of k input bits, but also on
past inputs.
The number of input bits on which the output depends is called the "constraint
length" K.
Frequently, we will see that k=1
Example of Convolutional Code
k=1, n=2, K=3 convolutional code
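The encoder figure is not reproduced here, but a generic feed-forward encoder for a k=1 code can be sketched as follows. The generators are assumed to be 5 and 7 in octal, taken in that order; other conventions differ only in the ordering of the output bits:

```python
def conv_encode(bits, gens=(0o5, 0o7), K=3):
    """Rate-1/n feed-forward convolutional encoder.

    Each generator is a K-bit tap mask over the shift register;
    the newest input bit occupies the register's LSB.
    """
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)    # shift in new bit
        for g in gens:
            out.append(bin(state & g).count("1") % 2)  # XOR of tapped bits
    return out

print(conv_encode([0, 1, 0, 1, 1, 0, 0]))
# [0,0, 1,1, 0,1, 0,0, 1,0, 1,0, 1,1]
```

Because both of these generators read the same forwards and backwards, the choice of whether the newest bit sits at the LSB or MSB does not change the outputs.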
Example of Convolutional Code
k=2, n=3, K=2 convolutional code
Representations of Convolutional Codes
Encoder Block Diagram (shown above)
Generator Representation
Trellis Representation
State Diagram Representation
Convolutional Code Generators
One generator vector for each of the n output bits:
The length of the generator vector for a rate r=k/n
code with constraint length K is K
The bits in the generator from left to right represent the
connections in the encoder circuit. A “1” represents a link from
the shift register. A “0” represents no link.
Generator vectors are often given in octal representation
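For example, assuming the K=3 generator vectors 111 and 101, the octal forms are 7 and 5, so the code is usually quoted as (7,5):

```python
g1, g2 = 0b111, 0b101    # tap vectors, read left to right
print(oct(g1), oct(g2))  # 0o7 0o5  -> the code is quoted as (7,5)
```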
Example of Convolutional Code
k=1, n=2, K=3 convolutional code
Example of Convolutional Code
k=2, n=3, K=2 convolutional code
State Diagram Representation
Contents of shift registers make up "state" of code:
Most recent input is most significant bit of state.
Oldest input is least significant bit of state.
(this convention is sometimes reversed)
Arcs connecting states represent allowable transitions
Arcs are labeled with output bits transmitted during transition
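The state diagram of a small code can be tabulated directly. A sketch for a k=1, K=3 code, with generators assumed to be (5,7) in octal and the slide's convention of the most recent input in the state's MSB:

```python
def transitions(gens=(0o5, 0o7), K=3):
    """Map (state, input) -> (next_state, output bits) for a k=1 encoder."""
    table = {}
    for s in range(1 << (K - 1)):        # state = previous K-1 inputs
        for b in (0, 1):
            reg = (b << (K - 1)) | s     # full K-bit register, newest at MSB
            out = tuple(bin(reg & g).count("1") % 2 for g in gens)
            table[(s, b)] = (reg >> 1, out)  # oldest bit falls off the end
    return table

for (s, b), (ns, out) in sorted(transitions().items()):
    print(f"state {s:02b} --in {b} / out {out[0]}{out[1]}--> state {ns:02b}")
```

Each state has exactly two outgoing arcs (input 0 and input 1), which is what the state diagram draws.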
Example of State Diagram Representation
Of Convolutional Codes
k=1, n=2, K=3 convolutional code
Trellis Representation of Convolutional Code
The state diagram is "unfolded" as a function of time
Time is indicated by movement towards the right
Contents of shift registers make up the "state" of the code:
Most recent input is most significant bit of state.
Oldest input is least significant bit of state.
Allowable transitions are denoted by connections between
states
Transitions may be labeled with transmitted bits
Example of Trellis Diagram
k=1, n=2, K=3 convolutional code
Encoding Example Using Trellis Representation
k=1, n=2, K=3 convolutional code
We begin in state 00:
Input Data: 0 1 0 1 1 0 0
Output: 00 11 01 00 10 10 11
Distance Structure of a Convolutional Code
The Hamming distance between any two distinct code sequences
$\bar{c}_1$ and $\bar{c}_2$ is the number of bits in which they differ:

$$d_H(\bar{c}_1, \bar{c}_2) = \sum_i \left( c_{1,i} \oplus c_{2,i} \right)$$

The minimum free Hamming distance $d_{free}$ of a convolutional code is the
smallest Hamming distance separating any two distinct code sequences:

$$d_{free} = \min_{i \neq j} d_H(\bar{c}_i, \bar{c}_j)$$
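Because a convolutional code is linear, d_free equals the minimum Hamming weight over all nonzero code sequences, which can be found by brute force for short inputs. A sketch, with the encoder and (5,7)-octal generators assumed as in the earlier example:

```python
from itertools import product

def conv_encode(bits, gens=(0o5, 0o7), K=3):
    """Rate-1/n feed-forward encoder; newest input bit at the register LSB."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        for g in gens:
            out.append(bin(state & g).count("1") % 2)
    return out

def free_distance(max_len=8, K=3):
    """d_free = min weight of a nonzero terminated code sequence (linearity)."""
    best = float("inf")
    for L in range(1, max_len + 1):
        for bits in product((0, 1), repeat=L):
            if bits[0] == 0:                  # canonical form: start with a 1
                continue
            cw = conv_encode(list(bits) + [0] * (K - 1))  # flush to state 0
            best = min(best, sum(cw))
    return best

print(free_distance())  # 5 for the (7,5) code
```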
Search for good codes
We would like convolutional codes with large free distance
must avoid “catastrophic codes”
Generators for best convolutional codes are generally found via computer search
search is constrained to codes with regular structure
search is simplified because any permutation of identical
generators is equivalent
search is simplified because of linearity.
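For rate-1/n feed-forward encoders, a code is catastrophic exactly when its generator polynomials share a common factor over GF(2), which is cheap to test with bitmask polynomial arithmetic (a sketch; the example generators are my own choices):

```python
def gf2_mod(a, b):
    """Remainder of polynomial a modulo b, coefficients in GF(2)."""
    while b and a.bit_length() >= b.bit_length():
        a ^= b << (a.bit_length() - b.bit_length())
    return a

def gf2_gcd(a, b):
    """Euclidean algorithm on GF(2) polynomials packed into integers."""
    while b:
        a, b = b, gf2_mod(a, b)
    return a

def is_catastrophic(gens):
    """Generators with a common polynomial factor give a catastrophic code."""
    g = gens[0]
    for h in gens[1:]:
        g = gf2_gcd(g, h)
    return g != 1

print(is_catastrophic([0o7, 0o5]))  # False: the good (7,5) code
print(is_catastrophic([0o6, 0o3]))  # True: both share the factor x+1
```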
Best Rate 1/2 Codes
Best Rate 1/3 Codes
Best Rate 2/3 Codes
Summary of Convolutional Codes
Convolutional Codes are useful for real-time applications because
they can be continuously encoded and decoded
We can represent convolutional codes as generators, block
diagrams, state diagrams, and trellis diagrams
We want to design convolutional codes that maximize free distance
while remaining non-catastrophic