Basic Concepts of Encoding
Codes and Error Correction
Encoding
• Encoding is a transformation procedure applied to the input signal before it enters the communication channel. This procedure adapts the input signal to the communication system and improves its efficiency.
Encoding
• In other words, encoding is a procedure that associates words constructed from a finite alphabet of one language (e.g. a natural language) with words of another language (the encoding language) in a one-to-one manner.
• Decoding is the inverse operation: the restoration of words of the initial language.
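A minimal sketch of such a one-to-one correspondence in Python (the code table and function names are illustrative, not from the original text):

```python
# A tiny one-to-one code: words of a source language mapped to words of an
# encoding language. The table itself is a made-up example.
ENCODE = {"yes": "111", "no": "000"}
DECODE = {code: word for word, code in ENCODE.items()}  # inverse mapping

def encode(word: str) -> str:
    """Encoding: source word -> code word."""
    return ENCODE[word]

def decode(code: str) -> str:
    """Decoding: restore the source word from its code word."""
    return DECODE[code]

print(decode(encode("yes")))  # yes
```

Because the mapping is one-to-one, the inverse dictionary is well defined and decoding always restores the original word.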
Codes
• Let A = {a1, a2, ..., an} be the alphabet; its cardinality is |A| = n (n ≥ 2).
• Any finite sequence of letters from this alphabet forms a word over it. Let S be the set of all possible words over A.
• Some words x ∈ S may be meaningful and some may not, but in any case we will use only a subset V ⊆ S to encode the information.
Codes
• A subset V ⊆ S that is used to represent information in the communication system is commonly referred to as a code.
• If all words from V have the same length n, the code V is called a uniform code.
• If words from V may have different lengths, the code V is called a non-uniform code.
Digital communications
• Let us consider a digital communication channel.
• Hence A = {0, 1} = Z2.
• We will consider uniform codes of length n. Thus, the words over Z2 are n-dimensional binary vectors: S = {X = (x1, x2, ..., xn), xi ∈ Z2}.
• The vectors X ∈ V ⊆ S form the set of "encoding" words.
Distance between binary vectors
• The distance ρ (the Hamming distance) between two n-dimensional binary vectors X = (x1, ..., xn) and Y = (y1, ..., yn) is the number of components in which the vectors differ under component-wise comparison.
• To find the distance between two binary vectors, add them component-wise mod 2 and count the number of "1"s in the vector sum.
Distance between binary vectors
• For example:
X = (0, 1, 1, 1, 1, 0), Y = (0, 1, 1, 0, 1, 1);
X ⊕ Y = (0, 0, 0, 1, 0, 1);
ρ(X, Y) = 2.
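The procedure above (component-wise addition mod 2, then counting the "1"s) can be sketched in Python; the function name is ours, and the vectors are the ones from the example:

```python
def hamming_distance(x, y):
    """Distance between two equal-length binary vectors: add them
    component-wise mod 2 (XOR) and count the 1s in the vector sum."""
    assert len(x) == len(y)
    return sum(a ^ b for a, b in zip(x, y))

X = (0, 1, 1, 1, 1, 0)
Y = (0, 1, 1, 0, 1, 1)
vector_sum = tuple(a ^ b for a, b in zip(X, Y))
print(vector_sum)              # (0, 0, 0, 1, 0, 1)
print(hamming_distance(X, Y))  # 2
```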
Distance between binary vectors
• The distance ρ satisfies all the metric axioms:
1) ρ(X, X) = 0;
2) ρ(X, Y) = ρ(Y, X);
3) ρ(X, Y) ≤ ρ(X, Z) + ρ(Z, Y).
• The Hamming norm of a binary vector is the number of "1"s in that vector.
Errors
• Replacement of one letter of a word by another is commonly referred to as an error.
• Assume that the recipient (the receiver) of the information knows the code.
• "Detection of an error" means detecting the fact that an error has occurred, without determining exactly where.
• "Correction of an error" means the complete restoration of the word that was originally sent and then distorted.
Errors
• Suppose the word X ∈ V was transmitted and some bits of X were inverted, so the receiver receives Y ≠ X.
• If Y ∈ V, then the error cannot be detected or corrected without analyzing the sense of the whole message.
• If Y ∈ S but Y ∉ V, then the error can be detected and, under certain conditions, corrected.
Maximum likelihood decoding
• Let X be transmitted and Y received, with Y ∉ V.
• To correct the error (errors) and decode the corresponding word, we have to find Z ∈ V such that ρ(Z, Y) = min{ρ(Z′, Y) : Z′ ∈ V}.
• This method is called maximum likelihood decoding.
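A minimal sketch of this decoding rule in Python (function names are ours; ties, if any, are broken arbitrarily by `min`; the code V = {000, 111} anticipates the examples on the later slides):

```python
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

def ml_decode(y, code):
    """Maximum likelihood decoding: pick Z in V at minimum distance
    from the received word y."""
    return min(code, key=lambda z: hamming_distance(z, y))

V = ["000", "111"]
print(ml_decode("100", V))  # 000  (the single error is corrected)
```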
Minimum encoding distance
• d = min{ρ(X, Y) : X, Y ∈ V, X ≠ Y} is called the minimum encoding distance of the code V.
• In other words, the minimum encoding distance equals the minimum distance between the encoding vectors.
• If the distance between an encoding vector X ∈ V and a vector Y ∈ S, Y ≠ X, is less than d, then Y ∉ V: ρ(X, Y) < d, X ∈ V ⇒ Y ∉ V.
Minimum encoding distance
• For example, let n = 3. Then S = {000, 001, 010, 011, 100, 101, 110, 111}. Let V = {000, 111}. Then d = 3. Indeed, if X ∈ V, Y ∈ S and 0 < ρ(X, Y) < 3, then Y ∉ V.
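The minimum encoding distance of this example code can be checked by brute force over all pairs of codewords (a sketch; the function names are ours):

```python
from itertools import combinations

def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

def min_encoding_distance(code):
    """d = minimum of rho(X, Y) over all distinct pairs X, Y in V."""
    return min(hamming_distance(x, y) for x, y in combinations(code, 2))

print(min_encoding_distance(["000", "111"]))  # 3
```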
Criterion of Error Detection
• Theorem. The uniform code V detects at most t errors if, and only if, d = t + 1.
• Proof. Let d = t + 1. Let X ∈ V, X ≠ Y and ρ(X, Y) ≤ t. This means that Y ∉ V, and t errors can be detected.
Conversely, let the code detect t errors. Then ρ(X, Y) ≥ t + 1 for all X, Y ∈ V, X ≠ Y, i.e. d = t + 1. Otherwise, if X ∈ V, Y ∈ V and ρ(X, Y) ≤ t, this contradicts the ability to detect t errors.
Example of Error Detection
• For example, let n = 3. Then S = {000, 001, 010, 011, 100, 101, 110, 111}. Let V = {000, 111}. Then d = 3, and we can detect (not correct, only detect!) up to 2 errors.
• Indeed, if any 1 or 2 of the 3 bits of an encoding vector are inverted, we obtain a vector that does not belong to V. If all 3 bits are inverted, we obtain the other encoding vector and cannot detect the errors.
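The detection argument for V = {000, 111} can be verified exhaustively (a sketch; `flip` is a helper of ours):

```python
from itertools import combinations

V = {"000", "111"}

def flip(word, positions):
    """Invert the bits of `word` at the given index positions."""
    return "".join(str(int(b) ^ 1) if i in positions else b
                   for i, b in enumerate(word))

# Inverting any 1 or 2 of the 3 bits of a codeword never yields a codeword,
# so up to 2 errors are detected...
for x in V:
    for k in (1, 2):
        for pos in combinations(range(3), k):
            assert flip(x, set(pos)) not in V
# ...but inverting all 3 bits gives the other codeword: 3 errors go unnoticed.
assert flip("000", {0, 1, 2}) == "111"
print("all checks passed")
```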
Criterion of Error Correction
• Theorem. The uniform code V can correct at most t errors if, and only if, d = 2t + 1.
• Proof. Necessity. The code corrects at most t errors. We have to prove that ρ(X, Y) ≥ 2t + 1 for all X, Y ∈ V, X ≠ Y. Suppose this is not true, which means there exist X, Y ∈ V with ρ(X, Y) ≤ 2t.
• First, let ρ(X, Y) = 2t. Then there exists Z ∉ V with ρ(X, Z) = ρ(Y, Z) = t. If X ∈ V was transmitted and Z was received, we are unable to decode, because X and Y are equidistant from Z. This contradicts the initial condition that the code corrects up to t errors.
Criterion of Error Correction
• Now let ρ(X, Y) < 2t. Then there exists Z with ρ(X, Z) ≤ t and ρ(Y, Z) < ρ(X, Z). If X ∈ V was transmitted and Z was received, the decoder outputs the closer codeword Y, which means that Y was "transmitted". This again contradicts the initial condition that the code corrects up to t errors.
• Therefore it cannot be that ρ(X, Y) ≤ 2t for some X, Y ∈ V; hence ρ(X, Y) ≥ 2t + 1 for all X, Y ∈ V, X ≠ Y, and therefore d = 2t + 1.
Criterion of Error Correction
• Proof. Sufficiency. Let the minimum encoding distance be d = 2t + 1. We have to prove that the code can correct up to t errors. Let X ∈ V be transmitted and Y received, with ρ(X, Y) ≤ t. For any other codeword Z ∈ V, Z ≠ X, the metric axioms give:
2t + 1 ≤ ρ(X, Z) ≤ ρ(X, Y) + ρ(Y, Z) ≤ t + ρ(Y, Z), hence ρ(Z, Y) ≥ t + 1 > ρ(X, Y).
• If exactly t errors occurred, then X will be decoded. If fewer than t errors occurred then, a fortiori, X will be decoded.
Example of Error Correction
• For example, let n = 3. Then S = {000, 001, 010, 011, 100, 101, 110, 111}. Let V = {000, 111}. Then d = 3 = 2·1 + 1, and we can correct 1 error.
• Indeed, if 1 of the 3 bits of an encoding vector is inverted, we obtain a vector that does not belong to V, and we can always determine the unique vector of V whose distance to the distorted vector is exactly 1.
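The correction argument can be checked exhaustively as well (a sketch; function names are ours):

```python
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

V = ["000", "111"]

def nearest_codewords(y):
    """All codewords of V at minimum distance from y."""
    d = min(hamming_distance(z, y) for z in V)
    return [z for z in V if hamming_distance(z, y) == d]

# Each single-bit distortion of a codeword has a unique nearest codeword,
# so one error is always corrected.
for y in ("100", "010", "001"):   # one bit of 000 inverted
    assert nearest_codewords(y) == ["000"]
for y in ("011", "101", "110"):   # one bit of 111 inverted
    assert nearest_codewords(y) == ["111"]
print("single errors corrected")
```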
Example of Error Correction
• S = {000, 001, 010, 011, 100, 101, 110, 111}; V = {000, 111}; X1 = (000), X2 = (111).
• Let X1 = (000) be transmitted and Y = (100) received. Then ρ(X1, Y) = 1 < ρ(X2, Y) = 2, and we definitely decode X1.
Example of Error Correction
• S = {000, 001, 010, 011, 100, 101, 110, 111}; V = {000, 111}; X1 = (000), X2 = (111).
• If 2 of the 3 bits of an encoding vector are inverted, we again obtain a vector that does not belong to V. We can detect that errors occurred, but we cannot correct them, because the distorted vector is closer to the other encoding vector than to the transmitted one.
• Let X1 = (000) be transmitted and Y = (101) received. Then ρ(X1, Y) = 2 and ρ(X2, Y) = 1, so maximum likelihood decoding outputs X2 ≠ X1: there is no way to correct the errors.
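The failed decoding can be reproduced directly (a sketch; the function name is ours):

```python
def hamming_distance(x, y):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(x, y))

# X1 = 000 was transmitted; two bits were inverted, so Y = 101 was received.
Y = "101"
print(hamming_distance("000", Y))  # 2
print(hamming_distance("111", Y))  # 1
# Minimum-distance decoding would output 111, not the transmitted 000:
# two errors exceed the t = 1 correction capability of this code.
```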