Comm. 502: Communication Theory
Lecture 9
Introduction to Channel Coding
Digital Communication Systems

[Block diagram] Source of Information → Source Encoder → Channel Encoder → Modulator → Channel → De-Modulator → Channel Decoder → Source Decoder → User of Information
Motivation for Channel Coding
• Pr{B*≠B}=p • For a relatively noisy channel, p (i.e., probability of error) may have a value
of 10-2 • For many applications, this is not acceptable
– Examples: • Speech Requirement: Pr{B*≠B}<10-3
• Data Requirement: Pr{B*≠B}<10-6
• Channel coding can help to achieve such a high level of performance
B B*
0 0
11
Physical Channel
p
p
(1-p)
(1-p)
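A minimal Python sketch of this channel model (a binary symmetric channel simulation; the function name `bsc` and the random seed are illustrative choices, not from the lecture):

```python
# Binary symmetric channel: each bit is flipped independently with
# crossover probability p, so Pr{B* != B} = p.
import random

def bsc(bits, p, rng):
    """Pass bits through a BSC with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
tx = [0] * 100_000
rx = bsc(tx, p=1e-2, rng=rng)
print(sum(rx) / len(rx))  # empirical bit-error rate, close to p = 1e-2
```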
Example: Repetition Code (r = 1/3)
• Channel Encoder: 0 → 000, 1 → 111
• Majority-Rule Channel Decoder: e.g., 010 → 0, 110 → 1
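The encoder and majority-rule decoder above can be sketched in Python (a minimal illustration; the helper names are mine, not from the lecture):

```python
# Rate-1/3 repetition code: each bit is sent n = 3 times, and the
# decoder takes a majority vote over each block of n received bits.

def repetition_encode(bits, n=3):
    """Repeat each information bit n times."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(coded, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(coded[i:i + n]) > n // 2)
            for i in range(0, len(coded), n)]

print(repetition_encode([0, 1, 0]))  # [0, 0, 0, 1, 1, 1, 0, 0, 0]
# Received blocks 010 and 110 decode to 0 and 1, as in the slide:
print(repetition_decode([0, 1, 0, 1, 1, 0]))  # [0, 1]
```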
Channel Coding

B_1 B_2 .. B_k → Channel Encoder → W_1 W_2 .. W_n → Physical Channel → W*_1 W*_2 .. W*_n → Channel Decoder → B*_1 B*_2 .. B*_k

• Channel Encoder: mapping of k information bits into an n-bit code word.
• Channel Decoder: inverse mapping of the n received code bits back to k information bits.
• Code Rate: r = k/n, with r < 1 (less than one).
• Information Bit Error: B*_i ≠ B_i
• Coded Bit Error: W*_i ≠ W_i
• The physical channel is the binary symmetric channel with crossover probability p.
Linear Block Codes (Binary)
• A code is said to be linear if any two code words in the code can be added in modulo-2 addition to produce a third code word in the code.
• A binary block code generates a block of n coded bits from k information bits. This is called an (n,k) binary block code.
• Linear block codes are generated using the generator matrix G, defined as:
  G = [I_k | P_{k,n-k}]
  where I_k is the k×k identity matrix and P is the coefficient matrix.
Example: Hamming Code
• Hamming distance (d): the Hamming distance between two code words is the number of elements in which they differ.
• Example: C1 = [00101] and C2 = [10011], d = 3.
• The weight of a code word is defined as the number of 1-bits in the code word, so C1 has weight 2.
• The minimum distance is defined as the smallest Hamming distance between any pair of code words in the code. For a linear code, it also equals the smallest Hamming weight of the non-zero code words.
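These two definitions are easy to compute; a short sketch using the example code words above:

```python
# Hamming distance and weight for the example C1 = [00101], C2 = [10011].

def hamming_distance(c1, c2):
    """Number of positions in which two equal-length code words differ."""
    return sum(a != b for a, b in zip(c1, c2))

def weight(c):
    """Number of 1-bits in a code word."""
    return sum(c)

c1 = [0, 0, 1, 0, 1]
c2 = [1, 0, 0, 1, 1]
print(hamming_distance(c1, c2))  # 3
print(weight(c1))                # 2
```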
Example: Hamming Code
• The minimum distance of a linear block code is an important parameter of the code: it determines the error-correcting capability of the code.
• An (n,k) linear block code of minimum distance d_min can correct up to t errors iff
  t ≤ ⌊(d_min − 1)/2⌋
  where ⌊·⌋ denotes the largest integer less than or equal to the enclosed quantity.
• Hamming codes satisfy the previous equation with equality (e.g., d_min = 3 gives t = 1).
Example: (6,3) Hamming Code

Channel Encoder: W = B·G (n = 6, k = 3)

Generator Matrix:
G = [I_3 | P] = [1 0 0 | 1 1 0]
                [0 1 0 | 0 1 1]
                [0 0 1 | 1 0 1]

Information bits → Coded words:
000 → 000000
001 → 001101
010 → 010011
011 → 011110
100 → 100110
101 → 101011
110 → 110101
111 → 111000

• In general, w_i = b_i·G, i.e., w_0 = b_0·G, w_1 = b_1·G, ...
• Example: b_5 = [1 0 1] gives w_5 = b_5·G = [1 0 1 0 1 1] (all additions modulo 2).
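The systematic encoding w = b·G for this (6,3) code can be sketched in Python (the `encode` helper is mine; the matrix G is the one above):

```python
# Systematic encoding w = b.G (mod 2) for the (6,3) code.
import itertools

# Generator matrix G = [I_3 | P] from the example.
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]

def encode(b, G):
    """Code word w = b.G over GF(2)."""
    n = len(G[0])
    return [sum(b[i] * G[i][j] for i in range(len(b))) % 2
            for j in range(n)]

# Reproduce the code-word table for all 8 information words.
for b in itertools.product([0, 1], repeat=3):
    print(''.join(map(str, b)), '->',
          ''.join(map(str, encode(list(b), G))))

print(encode([1, 0, 1], G))  # w5 = [1, 0, 1, 0, 1, 1]
```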
Hamming Code Decoder

Parity Check Matrix: H = [P^T | I_{n-k}]

H = [1 0 1 1 0 0]
    [1 1 0 0 1 0]
    [0 1 1 0 0 1]

• Channel Decoder: compute the syndrome X = W*·H^T for each received word (x_1 = w*_1·H^T, x_2 = w*_2·H^T, ...).
• Valid code word: W·H^T = 0
• Error correction is needed when W*·H^T ≠ 0
• X is called the syndrome.
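The syndrome computation can be sketched directly from the definition (the `syndrome` helper is mine; H is the parity-check matrix above):

```python
# Syndrome X = W.H^T (mod 2) for the (6,3) code's parity-check matrix.

H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]

def syndrome(w, H):
    """X = W.H^T over GF(2); the all-zero syndrome means a valid code word."""
    return [sum(wj * row[j] for j, wj in enumerate(w)) % 2 for row in H]

print(syndrome([1, 0, 1, 0, 1, 1], H))  # valid code word w5 -> [0, 0, 0]
print(syndrome([1, 0, 1, 0, 0, 1], H))  # bit 5 flipped      -> [0, 1, 0]
```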
Example
• If no error occurs, W* = W and the syndrome is zero:

H^T = [1 1 0]
      [0 1 1]
      [1 0 1]
      [1 0 0]
      [0 1 0]
      [0 0 1]

x = w_5·H^T = [1 0 1 0 1 1]·H^T = [0 0 0]
Example: Error Detection
• Transmitted: W = [1 0 1 0 1 1]; Received: W* = [1 0 1 0 0 1] (the fifth bit is flipped)
• If an error occurs, the syndrome is non-zero:
  X = W*·H^T = [1 0 1 0 0 1]·H^T = [0 1 0] ≠ 0 ⇒ an error has occurred (W*·H^T ≠ 0).
Correcting Single Bit Errors

Syndrome (X)   Error Pattern (E)
000            000000
110            100000
011            010000
101            001000
100            000100
010            000010
001            000001

Since W* = W ⊕ E:
X = W*·H^T = W·H^T ⊕ E·H^T = E·H^T

This table is constructed by evaluating X = E·H^T for each row of E, moving the single 1 in E one place at a time.
Correcting Single Bit Errors
• Which error pattern generates this syndrome?
• W = [1 0 1 0 1 1], W* = [1 0 1 0 0 1]
• X = W*·H^T = [0 1 0], which the table maps to E = [0 0 0 0 1 0]
• Correction: W = W* ⊕ E, i.e., change the fifth bit (from 0 to 1).
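The whole single-error decoder can be sketched by combining the syndrome with the lookup table (a minimal illustration; the helper names are mine, and the table is built from the columns of H, since E·H^T for a single-bit error E is just the corresponding column of H):

```python
# Syndrome decoder for the (6,3) code: compute X = W*.H^T, look up the
# single-bit error pattern, and flip that bit.

H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]

def syndrome(w, H):
    return tuple(sum(wj * row[j] for j, wj in enumerate(w)) % 2
                 for row in H)

# Syndrome of the single-bit error in position j is column j of H.
table = {tuple(row[j] for row in H): j for j in range(6)}

def correct(w_rx, H):
    """Correct up to one bit error; return the corrected code word."""
    x = syndrome(w_rx, H)
    if x == (0, 0, 0):
        return list(w_rx)    # valid code word, nothing to do
    w = list(w_rx)
    w[table[x]] ^= 1         # flip the bit the syndrome points to
    return w

print(correct([1, 0, 1, 0, 0, 1], H))  # -> [1, 0, 1, 0, 1, 1]
```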
Random Errors and Burst Errors
• Random Errors
  – The probability of error in consecutive bits is independent.
• Burst Errors
  – There is a correlation in the error probability of consecutive bits.
• Illustrative Example
  – Assume the scenario where 3 bit errors occur within three transmitted code words.
  – Assume the code can correct only a single error per code word.
• In the case of random errors, the errors are more likely to be distributed over multiple code words. If each code word contains at most one error, the decoder corrects all three code words.
• In the case of burst errors, the errors are more likely to be packed within the same code word. In that scenario the decoder will not be able to correct the affected (e.g., middle) code word.
Interleaving

B → Channel Encoder → W → Interleaver → Physical Channel → De-Interleaver → W* → Channel Decoder → B*

• The interleaver reorders the transmitted bits so that a burst of channel errors is distributed over multiple code words after de-interleaving.
Example: Repetition Code
• The rate-1/3 repetition encoder repeats each bit n times.
• For n = 3: the information bits 0 1 0 are encoded as 000 111 000.
Illustrative Example
• 1/3 Repetition Encoder output (bit indices 1–9):
  0 0 0 | 1 1 1 | 0 0 0
• The interleaver transmits the bits in the order 1 4 7 2 5 8 3 6 9, giving the sequence 0 1 0 0 1 0 0 1 0.
• A burst error corrupts three consecutive transmitted bits, so the received sequence is 0 1 0 1 0 1 0 1 0.
• The de-interleaver restores the original bit order 1–9, giving 0 1 0 | 1 0 1 | 0 1 0. The burst has been spread out so that each code word contains only a single error.
• The 1/3 repetition decoder then uses error correction to fix the one error per word and outputs 0 1 0.
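The 3×3 block interleaver used in this example can be sketched as follows (a minimal illustration; the helper names are mine, and the read-out order 1 4 7 2 5 8 3 6 9 is the one from the slide):

```python
# 3x3 block interleaver: bits are written row-wise into a 3x3 array and
# read out column-wise, i.e., in the index order 1 4 7 2 5 8 3 6 9.

def interleave(bits, rows=3, cols=3):
    """Write row-wise, read column-wise."""
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows=3, cols=3):
    """Inverse permutation: write column-wise, read row-wise."""
    out = [0] * (rows * cols)
    i = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = bits[i]
            i += 1
    return out

encoded = [0, 0, 0, 1, 1, 1, 0, 0, 0]  # three repetition code words
tx = interleave(encoded)
print(tx)                               # [0, 1, 0, 0, 1, 0, 0, 1, 0]

rx = tx[:]
for i in (3, 4, 5):                     # burst hits 3 consecutive bits
    rx[i] ^= 1
print(deinterleave(rx))                 # [0, 1, 0, 1, 0, 1, 0, 1, 0]
```

After de-interleaving, the three errors land one per code word (010, 101, 010), so the majority-rule decoder recovers 0 1 0.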
Overall Digital Communications Block Diagram