SHANNON INFORMATION THEORY
TUTORIAL 7
ENG. SALLY NAFIE
DISCRETE MEMORYLESS CHANNEL:
A channel with input X = {x0, x1, …, xJ-1} and output Y = {y0, y1, …, yK-1}.
DISCRETE: Finite input alphabet X = {x0, x1, …, xJ-1} and finite output alphabet Y = {y0, y1, …, yK-1}.
MEMORYLESS: The current output symbol yk depends only on the current input symbol xj.
NOISELESS CHANNEL:
x0 = 0
x1 = 1
y0 = 0
y1 = 1
P(y0/x0) = P(0/0) = 1
P(y1/x1) = P(1/1) = 1
P(y0/x1) = P(0/1) = 0
P(y1/x0) = P(1/0) = 0
CONDITIONAL PROBABILITY:
The conditional probability P(yk/xj) is the probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.
Ex: In a noiseless channel:
The probability of receiving a 0 given that a 0 was transmitted = P(0/0) = 1
The probability of receiving a 0 given that a 1 was transmitted = P(0/1) = 0
The probability of receiving a 1 given that a 0 was transmitted = P(1/0) = 0
The probability of receiving a 1 given that a 1 was transmitted = P(1/1) = 1
NOISY CHANNEL:
x0 = 0
x1 = 1
y0 = 0
y1 = 1
P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe
CHANNEL (TRANSITION) MATRIX:
For the binary channel above, the conditional probabilities
P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe
are collected into a matrix, with each row corresponding to a fixed input and each column to a fixed output:
P = [ 1-Pe    Pe  ]
    [  Pe    1-Pe ]
In general, for a channel with input X = {x0, x1, …, xJ-1} and output Y = {y0, y1, …, yK-1}, the channel matrix is the J × K matrix of conditional probabilities P(yk/xj), with each row corresponding to a fixed input and each column to a fixed output. Since some output must be received for every input, every row sums to 1.
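The row-sum property of a channel matrix can be checked numerically. This is a minimal sketch for the binary symmetric channel of the previous slide; the value Pe = 0.1 is assumed purely for illustration.

```python
# Build the 2x2 BSC transition matrix P, with rows indexed by the input
# x_j and columns by the output y_k, then confirm that each row
# (fixed input) sums to 1, as any conditional distribution must.
Pe = 0.1  # assumed crossover probability for the example

P = [[1 - Pe, Pe],    # row for x0: P(y0/x0), P(y1/x0)
     [Pe, 1 - Pe]]    # row for x1: P(y0/x1), P(y1/x1)

row_sums = [sum(row) for row in P]
print(row_sums)  # each entry should be 1.0
```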
PROBABILITY TERMS:
A. PRIOR PROBABILITY:
The probability of each symbol emitted from the source at the transmitter side.
P(xj) = P(X = xj)
B. CONDITIONAL PROBABILITY:
The probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.
P(yk/xj) = P(Y = yk / X = xj)
PROBABILITY TERMS:
C. JOINT PROBABILITY:
The probability of sending a certain symbol xj and receiving a certain symbol yk.
P(xj , yk) = P(X = xj, Y = yk)
           = P(Y = yk / X = xj) P(X = xj)
           = P(yk/xj) P(xj)    (conditional probability × prior probability)
D. MARGINAL PROBABILITY:
The probability of receiving a certain symbol yk, summed over all possible transmitted symbols:
P(yk) = P(Y = yk) = Σj P(yk/xj) P(xj),  j = 0, 1, …, J-1
BAYES’ RULE:
P(xj , yk) = P(yk , xj)
P(yk/xj) P(xj) = P(xj/yk) P(yk)
P(xj/yk) = P(yk/xj) P(xj) / P(yk)
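Bayes’ rule lets the receiver invert the channel: from the known P(yk/xj) and the prior P(xj), it computes the posterior P(xj/yk). A minimal sketch for a binary symmetric channel, with Pe and the prior P0 assumed for illustration:

```python
# Posterior P(x0/y0) for a BSC via Bayes' rule.
Pe, p0 = 0.1, 0.5     # assumed crossover probability and prior P(x0)
p1 = 1 - p0

# marginal: P(y0) = P(y0/x0) P(x0) + P(y0/x1) P(x1)
p_y0 = (1 - Pe) * p0 + Pe * p1
# Bayes: P(x0/y0) = P(y0/x0) P(x0) / P(y0)
post = (1 - Pe) * p0 / p_y0
print(post)
```

With equiprobable inputs the marginal P(y0) is 0.5, so the posterior reduces to P(y0/x0) = 1 - Pe = 0.9.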
CLASSICAL CHANNELS: A. BINARY SYMMETRIC:
x0 = 0
x1 = 1
y0 = 0
y1 = 1
P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe
CHANNEL MATRIX:
P = [ 1-Pe    Pe  ]
    [  Pe    1-Pe ]
CLASSICAL CHANNELS: B. ERASURE CHANNEL:
x0 = 0
x1 = 1
y0 = 0
y1 = 1
y2 = e
P(y0/x0) = P(0/0) = 1 - q
P(y1/x1) = P(1/1) = 1 - q
P(y2/x0) = P(e/0) = q
P(y2/x1) = P(e/1) = q
CHANNEL MATRIX:
P = [ 1-q    0    q ]
    [  0    1-q   q ]
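The marginal probability formula P(yk) = Σj P(yk/xj) P(xj) applies row by row to the erasure channel’s matrix. A sketch with an assumed erasure probability q and equiprobable inputs:

```python
# Output marginals of the binary erasure channel.
q, p0 = 0.25, 0.5          # assumed erasure probability and prior P(x0)
p1 = 1 - p0
P = [[1 - q, 0.0, q],      # row x0: P(0/0), P(1/0), P(e/0)
     [0.0, 1 - q, q]]      # row x1: P(0/1), P(1/1), P(e/1)

# P(yk) = sum over j of P(yk/xj) P(xj)
p_y = [P[0][k] * p0 + P[1][k] * p1 for k in range(3)]
print(p_y)  # [P(y=0), P(y=1), P(y=e)]
```

Note that the erasure symbol is received with probability exactly q regardless of the input distribution.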
SOURCE ENTROPY:
The average information transmitted over the channel per symbol:
H(X) = - Σj P(xj) log2 P(xj)
CONDITIONAL ENTROPY:
The average information lost due to the channel per symbol, given that a certain symbol yk is received:
H(X/yk) = - Σj P(xj/yk) log2 P(xj/yk)
Taking the mean of this entropy over all the received symbols:
H(X/Y) = Σk P(yk) H(X/yk)
       = - Σj Σk P(yk) P(xj/yk) log2 P(xj/yk)
       = - Σj Σk P(xj , yk) log2 P(xj/yk)
H(X/Y) is called the equivocation of X with respect to Y.
SIMILARLY:
H(Y/xj) = - Σk P(yk/xj) log2 P(yk/xj)
H(Y/X) = Σj P(xj) H(Y/xj)
       = - Σj Σk P(yk , xj) log2 P(yk/xj)
H(Y/X) is the equivocation of Y with respect to X.
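These definitions translate directly into code. A minimal sketch computing H(X) and the equivocation H(X/Y) for a binary symmetric channel, with Pe and equiprobable inputs assumed for illustration:

```python
from math import log2

# Source entropy H(X) and equivocation H(X/Y) for a BSC.
Pe, p_x = 0.1, [0.5, 0.5]                 # assumed values
P = [[1 - Pe, Pe], [Pe, 1 - Pe]]          # P[j][k] = P(yk/xj)

# joint P(xj, yk) = P(yk/xj) P(xj) and marginal P(yk)
p_xy = [[P[j][k] * p_x[j] for k in range(2)] for j in range(2)]
p_y = [p_xy[0][k] + p_xy[1][k] for k in range(2)]

H_X = -sum(p * log2(p) for p in p_x)
# H(X/Y) = - sum_j sum_k P(xj, yk) log2 P(xj/yk), with P(xj/yk) = P(xj,yk)/P(yk)
H_X_given_Y = -sum(p_xy[j][k] * log2(p_xy[j][k] / p_y[k])
                   for j in range(2) for k in range(2))
print(H_X, H_X_given_Y)
```

With equiprobable inputs H(X) = 1 bit, while the equivocation works out to the binary entropy of Pe (about 0.469 bits at Pe = 0.1).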
MUTUAL INFORMATION:
The average information the receiver receives per symbol:
I(X,Y) = H(X) - H(X/Y)
RECEIVED INFORMATION = TRANSMITTED INFORMATION - LOST INFORMATION
Expanding the definition shows that mutual information is symmetric:
I(X,Y) = - Σj P(xj) log2 P(xj) + Σj Σk P(xj , yk) log2 P(xj/yk)
       = Σj Σk P(xj , yk) log2 [ P(xj/yk) / P(xj) ]
       = Σj Σk P(xj , yk) log2 [ P(yk/xj) / P(yk) ]    (by Bayes’ rule)
       = H(Y) - H(Y/X) = I(Y,X)
Hence:
I(X,Y) = H(X) - H(X/Y)
I(Y,X) = H(Y) - H(Y/X)
I(X,Y) = I(Y,X) = H(X) + H(Y) - H(X,Y)
where the joint entropy is
H(X,Y) = - Σj Σk P(xj , yk) log2 P(xj , yk)
PROPERTIES OF MUTUAL INFORMATION:
[Venn diagram: the circles H(X) and H(Y) overlap in I(X,Y); the parts outside the overlap are H(X/Y) and H(Y/X), and the union of both circles is H(X,Y).]
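The three expressions for mutual information can be checked against each other numerically. A sketch for a binary symmetric channel with an assumed Pe and a deliberately non-uniform input distribution:

```python
from math import log2

# Verify I(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X) = H(X) + H(Y) - H(X,Y).
Pe, p_x = 0.2, [0.3, 0.7]                 # assumed channel and prior
P = [[1 - Pe, Pe], [Pe, 1 - Pe]]
p_xy = [[P[j][k] * p_x[j] for k in range(2)] for j in range(2)]
p_y = [p_xy[0][k] + p_xy[1][k] for k in range(2)]

H = lambda dist: -sum(p * log2(p) for p in dist if p > 0)
H_X, H_Y = H(p_x), H(p_y)
H_joint = H([p_xy[j][k] for j in range(2) for k in range(2)])
H_X_given_Y = H_joint - H_Y               # chain rule H(X,Y) = H(Y) + H(X/Y)
H_Y_given_X = H_joint - H_X

I1 = H_X - H_X_given_Y
I2 = H_Y - H_Y_given_X
I3 = H_X + H_Y - H_joint
print(I1, I2, I3)  # all three agree
```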
CHANNEL CAPACITY:
The channel capacity of a discrete memoryless channel is defined as the maximum rate at which information can be transmitted through the channel. It is the maximum of the mutual information over all possible distributions of the input probabilities P(xj):
C = max I(X,Y) over all input distributions {P(xj)}
CAPACITY OF BINARY SYMMETRIC CHANNEL:
x0 = 0, x1 = 1; y0 = 0, y1 = 1
P(x0) = P0, P(x1) = P0" = 1 - P0
Channel: P(0/0) = P(1/1) = Pe" = 1 - Pe, P(0/1) = P(1/0) = Pe

I(X,Y) = Σj Σk P(xj , yk) log2 [ P(yk/xj) / P(yk) ]

Output probabilities:
P(y0) = Pe" P0 + Pe P0"
P(y1) = Pe P0 + Pe" P0"

Expanding the four terms:
k=0, j=0:  Pe" P0 log2 [ Pe" / P(y0) ]
k=0, j=1:  Pe P0" log2 [ Pe / P(y0) ]
k=1, j=0:  Pe P0 log2 [ Pe / P(y1) ]
k=1, j=1:  Pe" P0" log2 [ Pe" / P(y1) ]

Collecting terms (using P0 + P0" = 1):
I(X,Y) = Pe" log2 Pe" + Pe log2 Pe - P(y0) log2 P(y0) - P(y1) log2 P(y1)
       = H(Y) - H(Pe)
where H(Pe) = - Pe log2 Pe - Pe" log2 Pe" is the binary entropy of the crossover probability.

C = max I(X,Y) over P0
I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5, so P0" = 1 - P0 = 0.5.
AT P0 = 0.5: P(y0) = P(y1) = 0.5, so H(Y) = 1 and
C = 1 + Pe log2 Pe + Pe" log2 Pe"
C = 1 - H(Pe)
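A brute-force search over the input prior P0 confirms that I(X,Y) for the BSC peaks at equiprobable inputs. This sketch uses the equivalent form I(X,Y) = H(Y) - H(Pe) and an assumed Pe:

```python
from math import log2

# Search over P0 to confirm C = 1 - H(Pe) at P0 = 0.5 for a BSC.
Pe = 0.1  # assumed crossover probability

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_info(p0):
    p_y0 = (1 - Pe) * p0 + Pe * (1 - p0)   # output marginal P(y0)
    return Hb(p_y0) - Hb(Pe)               # I(X,Y) = H(Y) - H(Pe)

best_p0 = max((i / 1000 for i in range(1001)), key=mutual_info)
C = mutual_info(best_p0)
print(best_p0, C)  # best_p0 = 0.5, C = 1 - H(Pe)
```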
CAPACITY OF ERASURE CHANNEL:
x0 = 0, x1 = 1; y0 = 0, y1 = 1, y2 = e
P(x0) = P0, P(x1) = P0" = 1 - P0
Channel: P(0/0) = P(1/1) = q" = 1 - q, P(e/0) = P(e/1) = q

I(X,Y) = Σj Σk P(xj , yk) log2 [ P(yk/xj) / P(yk) ]

Output probabilities:
P(y0) = q" P0, P(y1) = q" P0", P(y2) = q P0 + q P0" = q

Expanding the six terms:
k=0, j=0:  q" P0 log2 [ q" / (q" P0) ] = - q" P0 log2 P0
k=0, j=1:  0    (since P(y0/x1) = 0)
k=1, j=0:  0    (since P(y1/x0) = 0)
k=1, j=1:  q" P0" log2 [ q" / (q" P0") ] = - q" P0" log2 P0"
k=2, j=0:  q P0 log2 [ q / q ] = 0
k=2, j=1:  q P0" log2 [ q / q ] = 0

I(X,Y) = - q" [ P0 log2 P0 + P0" log2 P0" ] = (1 - q) H(X)

C = max I(X,Y) over P0
I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5, so P0" = 1 - P0 = 0.5. Then H(X) = 1 and
C = 1 - q
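The erasure-channel result can be checked by computing I(X,Y) directly from the joint distribution via I(X,Y) = H(X) + H(Y) - H(X,Y). A sketch with an assumed q and equiprobable inputs:

```python
from math import log2

# Mutual information of the binary erasure channel at equiprobable inputs;
# the result should equal the capacity C = 1 - q.
q, p0 = 0.25, 0.5                          # assumed values
p_x = [p0, 1 - p0]
P = [[1 - q, 0.0, q], [0.0, 1 - q, q]]     # P[j][k] = P(yk/xj)

p_xy = [[P[j][k] * p_x[j] for k in range(3)] for j in range(2)]
p_y = [p_xy[0][k] + p_xy[1][k] for k in range(3)]

H = lambda d: -sum(p * log2(p) for p in d if p > 0)
I = H(p_x) + H(p_y) - H([p for row in p_xy for p in row])
print(I)  # equals 1 - q
```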
ANOTHER METHOD TO OBTAIN CHANNEL CAPACITY:
Given J input symbols and K output symbols, the channel capacity of a symmetric discrete memoryless channel is given by:
C = log2 K + Σk P(yk/xj) log2 P(yk/xj)
where the sum is taken over any one row of the channel matrix, i.e. C = log2 K minus the entropy of a row.
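As a sanity check, the row-entropy shortcut applied to a binary symmetric channel should reproduce C = 1 - H(Pe) from the direct maximization. A sketch with an assumed Pe:

```python
from math import log2

# Row-entropy shortcut for a symmetric DMC: C = log2 K - H(row),
# applied to the BSC (K = 2 outputs).
Pe = 0.1                                  # assumed crossover probability
row = [1 - Pe, Pe]                        # any row of the BSC channel matrix

H_row = -sum(p * log2(p) for p in row)
C = log2(len(row)) - H_row
print(C)  # matches 1 - H(Pe)
```

Note this shortcut relies on the channel being symmetric (the rows of the matrix are permutations of one another, and likewise the columns), which holds for the BSC.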