
Page 1:

SHANNON INFORMATION THEORY

TUTORIAL 7

ENG. SALLY NAFIE

Page 2:

DISCRETE MEMORYLESS CHANNEL:

X = {x0, x1, …, xJ-1} → Channel → Y = {y0, y1, …, yK-1}

DISCRETE: Finite input alphabet (X = {x0, x1, …, xJ-1}) and finite output alphabet (Y = {y0, y1, …, yK-1}).

MEMORYLESS: The current output symbol yk depends only on the current input symbol xj, not on any earlier symbols.
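As a concrete illustration of the memoryless property, here is a minimal Python sketch (numpy assumed; the helper name dmc_transmit and the value Pe = 0.1 are illustrative, not from the tutorial): each output symbol is drawn from the transition-matrix row of the current input alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def dmc_transmit(xs, P):
    """Pass input symbol indices xs through a DMC with J x K row-stochastic matrix P."""
    # Memoryless: each output depends only on the current input's row of P.
    return np.array([rng.choice(P.shape[1], p=P[x]) for x in xs])

Pe = 0.1  # assumed crossover probability for the example
P = np.array([[1 - Pe, Pe],
              [Pe, 1 - Pe]])
xs = rng.integers(0, 2, size=10)  # random input bits
ys = dmc_transmit(xs, P)
print("sent:", xs, "received:", ys)
```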

Page 3:

NOISELESS CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1

CONDITIONAL PROBABILITY:

P(y0/x0) = P(0/0) = 1
P(y1/x1) = P(1/1) = 1
P(y0/x1) = P(0/1) = 0
P(y1/x0) = P(1/0) = 0

Page 4:

CONDITIONAL PROBABILITY:

The conditional probability P(yk/xj) is the probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.

Ex: In a noiseless channel:

The probability of receiving a 0 given that a 0 was transmitted = P(0/0) = 1
The probability of receiving a 0 given that a 1 was transmitted = P(0/1) = 0
The probability of receiving a 1 given that a 0 was transmitted = P(1/0) = 0
The probability of receiving a 1 given that a 1 was transmitted = P(1/1) = 1

Page 5:

NOISY CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1

CONDITIONAL PROBABILITY:

P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe

Page 6:

CHANNEL (TRANSITION) MATRIX:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1

P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe

These probabilities are arranged in a matrix with a fixed input per row and a fixed output per column.

Page 7:

CHANNEL (TRANSITION) MATRIX:

X = {x0, x1, …, xJ-1} → Channel → Y = {y0, y1, …, yK-1}

The channel matrix collects all J x K conditional probabilities, with a fixed input per row and a fixed output per column:

$P = \begin{bmatrix} P(y_0/x_0) & P(y_1/x_0) & \cdots & P(y_{K-1}/x_0) \\ P(y_0/x_1) & P(y_1/x_1) & \cdots & P(y_{K-1}/x_1) \\ \vdots & \vdots & & \vdots \\ P(y_0/x_{J-1}) & P(y_1/x_{J-1}) & \cdots & P(y_{K-1}/x_{J-1}) \end{bmatrix}$
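A small sketch of the same idea in Python (numpy assumed; Pe = 0.1 is an illustrative value): the channel matrix is stored with one row per input and one column per output, and every row must sum to 1 because some output symbol is always received.

```python
import numpy as np

Pe = 0.1  # illustrative crossover probability
# Row j holds P(y0/xj), P(y1/xj): fixed input per row, fixed output per column.
P = np.array([[1 - Pe, Pe],
              [Pe, 1 - Pe]])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
```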

Page 8:

PROBABILITY TERMS:

A. PRIOR PROBABILITY:

The probability of each symbol emitted from the source at the transmitter side.

P(xj) = P(X = xj)

B. CONDITIONAL PROBABILITY:

The probability of receiving a certain symbol yk given that a certain symbol xj was transmitted.

P(yk/xj) = P(Y = yk / X = xj)

Page 9:

PROBABILITY TERMS:

C. JOINT PROBABILITY:

The probability of sending a certain symbol xj and receiving a certain symbol yk.

P(xj, yk) = P(X = xj, Y = yk)
          = P(Y = yk / X = xj) P(X = xj)
          = P(yk/xj) P(xj)

i.e. the conditional probability times the prior probability.

Page 10:

PROBABILITY TERMS:

D. MARGINAL PROBABILITY:

The probability of receiving a certain symbol yk, whichever symbol was transmitted.

$P(y_k) = P(Y = y_k) = \sum_{j=0}^{J-1} P(y_k/x_j)\,P(x_j)$

OR

$P(y_k) = \sum_{j=0}^{J-1} P(x_j, y_k)$
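The joint and marginal probabilities translate directly into array operations; a sketch assuming numpy, with Pe = 0.1 and the prior P(x) = (0.7, 0.3) as illustrative values:

```python
import numpy as np

Pe = 0.1
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])  # channel matrix P(yk/xj)
px = np.array([0.7, 0.3])                   # assumed prior P(xj)

joint = px[:, None] * P   # joint[j, k] = P(xj, yk) = P(yk/xj) P(xj)
py = joint.sum(axis=0)    # marginal P(yk) = sum_j P(xj, yk)
print(py, px @ P)         # the two forms of P(yk) agree
```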

Page 11:

BAYES' RULE:

P(xj, yk) = P(yk, xj)

P(yk/xj) P(xj) = P(xj/yk) P(yk)

$P(x_j/y_k) = \dfrac{P(y_k/x_j)\,P(x_j)}{P(y_k)} = \dfrac{P(y_k/x_j)\,P(x_j)}{\sum_{j=0}^{J-1} P(y_k/x_j)\,P(x_j)}$
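Bayes' rule in the same array form (a sketch with the same assumed Pe and prior as above): dividing the joint matrix by the marginal gives the posterior P(xj/yk) for every pair at once.

```python
import numpy as np

Pe, px = 0.1, np.array([0.7, 0.3])          # assumed channel and prior
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])
joint = px[:, None] * P
py = joint.sum(axis=0)

posterior = joint / py                           # posterior[j, k] = P(xj/yk)
assert np.allclose(posterior.sum(axis=0), 1.0)   # each column sums to 1
print(posterior)
```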

Page 12:

CLASSICAL CHANNELS:

A. BINARY SYMMETRIC CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1

P(y0/x0) = P(0/0) = 1 - Pe
P(y1/x1) = P(1/1) = 1 - Pe
P(y0/x1) = P(0/1) = Pe
P(y1/x0) = P(1/0) = Pe

CHANNEL MATRIX:

$P = \begin{bmatrix} 1-P_e & P_e \\ P_e & 1-P_e \end{bmatrix}$

Page 13:

CLASSICAL CHANNELS:

B. ERASURE CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1, y2 = e

P(y0/x0) = P(0/0) = 1 - q
P(y2/x0) = P(e/0) = q
P(y1/x1) = P(1/1) = 1 - q
P(y2/x1) = P(e/1) = q

CHANNEL MATRIX:

$P = \begin{bmatrix} 1-q & 0 & q \\ 0 & 1-q & q \end{bmatrix}$
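The erasure channel's 2 x 3 matrix in the same representation (a sketch; q = 0.2 is an assumed value): a bit is either received intact with probability 1 - q or erased to e with probability q, but never flipped.

```python
import numpy as np

q = 0.2  # illustrative erasure probability
# Columns: y0 = 0, y1 = 1, y2 = e; rows: x0, x1.
P_erasure = np.array([[1 - q, 0.0, q],
                      [0.0, 1 - q, q]])
assert np.allclose(P_erasure.sum(axis=1), 1.0)
```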

Page 14:

SOURCE ENTROPY:

The average information transmitted over the channel per symbol.

$H(X) = \sum_{j=0}^{J-1} P(x_j) \log_2 \dfrac{1}{P(x_j)}$

CONDITIONAL ENTROPY:

The average information lost due to the channel per symbol, given that a certain symbol yk is received.

$H(X/y_k) = \sum_{j=0}^{J-1} P(x_j/y_k) \log_2 \dfrac{1}{P(x_j/y_k)}$

Page 15:

CONDITIONAL ENTROPY:

The mean of the entropy over all the received symbols:

$H(X/Y) = \sum_{k=0}^{K-1} P(y_k)\,H(X/y_k) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(y_k)\,P(x_j/y_k) \log_2 \dfrac{1}{P(x_j/y_k)}$

Since $P(y_k)\,P(x_j/y_k) = P(x_j, y_k)$:

$H(X/Y) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{1}{P(x_j/y_k)}$

H(X/Y) is called the equivocation of X with respect to Y.

Page 16:

SIMILARLY:

$H(Y/x_j) = \sum_{k=0}^{K-1} P(y_k/x_j) \log_2 \dfrac{1}{P(y_k/x_j)}$

$H(Y/X) = \sum_{j=0}^{J-1} P(x_j)\,H(Y/x_j)$

Since $P(x_j)\,P(y_k/x_j) = P(y_k, x_j)$:

$H(Y/X) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} P(x_j, y_k) \log_2 \dfrac{1}{P(y_k/x_j)}$

H(Y/X) is called the equivocation of Y with respect to X.
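A sketch of the equivocation H(X/Y) as a function of the joint matrix (numpy assumed; the helper name equivocation is illustrative): each term is P(xj, yk) log2(1/P(xj/yk)), with zero-probability pairs contributing nothing.

```python
import numpy as np

def equivocation(joint):
    """H(X/Y) = sum_jk P(xj,yk) log2(1 / P(xj/yk)) from the joint matrix."""
    py = joint.sum(axis=0)                                   # marginal P(yk)
    cond = np.divide(joint, py, out=np.zeros_like(joint),
                     where=py > 0)                           # P(xj/yk)
    mask = joint > 0                                         # skip zero terms
    return float(-(joint[mask] * np.log2(cond[mask])).sum())

Pe, px = 0.1, np.array([0.5, 0.5])                           # assumed values
joint = px[:, None] * np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])
print(equivocation(joint))                                   # ~0.469 bits
```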

Page 17:

MUTUAL INFORMATION:

The average information the receiver receives per symbol:

I(X,Y) = H(X) - H(X/Y)

RECEIVED INFORMATION = TRANSMITTED INFORMATION - LOST INFORMATION

Page 18:

$I(X,Y) = H(X) - H(X/Y)$

$= \sum_{j=0}^{J-1} P(x_j) \log_2 \dfrac{1}{P(x_j)} - \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{1}{P(x_j/y_k)}$

Using $\sum_{k=0}^{K-1} P(x_j, y_k) = P(x_j)$ in the first sum:

$= \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{1}{P(x_j)} - \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{1}{P(x_j/y_k)}$

$= \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{P(x_j/y_k)}{P(x_j)}$

$= \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} P(x_j, y_k) \log_2 \dfrac{P(x_j, y_k)}{P(x_j)\,P(y_k)}$

Page 19:

PROPERTIES OF MUTUAL INFORMATION:

I(X,Y) = H(X) - H(X/Y)
I(Y,X) = H(Y) - H(Y/X)
I(X,Y) = I(Y,X)
I(X,Y) = I(Y,X) = H(X) + H(Y) - H(X,Y)

where the joint entropy is

$H(X,Y) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} P(x_j, y_k) \log_2 \dfrac{1}{P(x_j, y_k)}$

[Venn diagram: H(X,Y) is the union of the circles H(X) and H(Y); I(X,Y) is their overlap; H(X/Y) and H(Y/X) are the parts of H(X) and H(Y) outside the overlap.]
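A numeric check of the identity I(X,Y) = H(X) + H(Y) - H(X,Y) (a sketch assuming numpy; Pe = 0.1 and the uniform prior are illustrative):

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability array; zero entries contribute nothing."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

Pe, px = 0.1, np.array([0.5, 0.5])
joint = px[:, None] * np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])
py = joint.sum(axis=0)

I = H(px) + H(py) - H(joint.ravel())  # H(X) + H(Y) - H(X,Y)
print(I)                              # ~0.531 = H(X) - H(X/Y) as well
```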

Page 20:

CHANNEL CAPACITY:

The channel capacity of a discrete memoryless channel is defined as the maximum rate at which information can be transmitted through the channel. It is the maximum mutual information over all possible distributions of the input probabilities P(xj):

$C = \max_{\{P(x_j)\}} I(X,Y)$ (bits per symbol)
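The maximization can be sketched by a brute-force search over the input distribution (numpy assumed; mutual_info is an illustrative helper, and the binary symmetric channel with Pe = 0.1 is the assumed example):

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_info(px, P):
    """I(X,Y) = H(X) + H(Y) - H(X,Y) for prior px and channel matrix P."""
    joint = px[:, None] * P
    return H(px) + H(joint.sum(axis=0)) - H(joint.ravel())

Pe = 0.1
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])
grid = np.linspace(1e-6, 1 - 1e-6, 1001)     # candidate values of P(x0)
C = max(mutual_info(np.array([p0, 1 - p0]), P) for p0 in grid)
print(C)  # ~0.531; the maximum sits at P(x0) = 0.5 for this channel
```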

Page 21:

CAPACITY OF BINARY SYMMETRIC CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1, with crossover probability Pe and Pe" = 1 - Pe.

Let P(x0) = P0 and P(x1) = P0" = 1 - P0.

$I(X,Y) = \sum_{k=0}^{1} \sum_{j=0}^{1} P(x_j, y_k) \log_2 \dfrac{P(y_k/x_j)}{P(y_k)}$

Writing out the four terms:

k=0, j=0: $P_0 P_e'' \log_2 \dfrac{P_e''}{P(y_0)}$

k=0, j=1: $P_0'' P_e \log_2 \dfrac{P_e}{P(y_0)}$

k=1, j=0: $P_0 P_e \log_2 \dfrac{P_e}{P(y_1)}$

k=1, j=1: $P_0'' P_e'' \log_2 \dfrac{P_e''}{P(y_1)}$

where $P(y_0) = P_0 P_e'' + P_0'' P_e$ and $P(y_1) = P_0 P_e + P_0'' P_e''$.

Page 22:

Expanding $\log_2 \dfrac{P(y_k/x_j)}{P(y_k)} = \log_2 \dfrac{1}{P(y_k)} - \log_2 \dfrac{1}{P(y_k/x_j)}$ and collecting terms (the conditional part is the same for each input, since $P_0 + P_0'' = 1$):

$I(X,Y) = \left[ P(y_0) \log_2 \dfrac{1}{P(y_0)} + P(y_1) \log_2 \dfrac{1}{P(y_1)} \right] - \left[ P_e'' \log_2 \dfrac{1}{P_e''} + P_e \log_2 \dfrac{1}{P_e} \right]$

i.e.

$I(X,Y) = H(Y) + P_e \log_2 P_e + P_e'' \log_2 P_e''$

Page 23:

$I(X,Y) = H(Y) + P_e \log_2 P_e + P_e'' \log_2 P_e''$

$C = \max_{P_0} I(X,Y)$

I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5 and P0" = 1 - P0 = 0.5. Then P(y0) = P(y1) = 0.5, so H(Y) = 1.

AT P0 = 0.5:

$C = I(X,Y) = 1 + P_e \log_2 P_e + P_e'' \log_2 P_e''$ (bits per symbol)

For a noiseless channel (Pe = 0) this gives C = 1.
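A quick numeric check of the closed form (Pe = 0.1 assumed): it matches the brute-force search sketched on Page 20.

```python
import numpy as np

Pe = 0.1
C = 1 + Pe * np.log2(Pe) + (1 - Pe) * np.log2(1 - Pe)  # 1 - H(Pe)
print(C)  # ~0.531 bits/symbol, matching the grid search
```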

Page 24:

CAPACITY OF ERASURE CHANNEL:

Transmitted: x0 = 0, x1 = 1   Received: y0 = 0, y1 = 1, y2 = e, with erasure probability q and q" = 1 - q.

Let P(x0) = P0 and P(x1) = P0" = 1 - P0.

$I(X,Y) = \sum_{k=0}^{2} \sum_{j=0}^{1} P(x_j, y_k) \log_2 \dfrac{P(y_k/x_j)}{P(y_k)}$

Writing out the six terms, with $P(y_0) = P_0 q''$, $P(y_1) = P_0'' q''$, and $P(y_2) = q$:

k=0, j=0: $P_0 q'' \log_2 \dfrac{q''}{P_0 q''} = P_0 q'' \log_2 \dfrac{1}{P_0}$

k=0, j=1: 0 (since P(y0/x1) = 0)

k=1, j=0: 0 (since P(y1/x0) = 0)

k=1, j=1: $P_0'' q'' \log_2 \dfrac{q''}{P_0'' q''} = P_0'' q'' \log_2 \dfrac{1}{P_0''}$

k=2, j=0: $P_0 q \log_2 \dfrac{q}{q} = 0$

k=2, j=1: $P_0'' q \log_2 \dfrac{q}{q} = 0$

Page 25:

The erasure terms vanish, so

$I(X,Y) = q'' \left[ P_0 \log_2 \dfrac{1}{P_0} + P_0'' \log_2 \dfrac{1}{P_0''} \right] = (1-q)\,H(X)$

$C = \max_{P_0} I(X,Y)$

I(X,Y) is maximum when all the transmitted symbols are equiprobable, i.e. P(x0) = P(x1) = 0.5 and P0" = 1 - P0 = 0.5, so H(X) = 1.

AT P0 = 0.5:

C = I(X,Y) = 1 - q (bits per symbol)
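The same brute-force search applied to the erasure channel (numpy assumed, q = 0.2 illustrative) confirms C = 1 - q:

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_info(px, P):
    joint = px[:, None] * P
    return H(px) + H(joint.sum(axis=0)) - H(joint.ravel())

q = 0.2
P = np.array([[1 - q, 0.0, q], [0.0, 1 - q, q]])  # erasure channel matrix
grid = np.linspace(1e-6, 1 - 1e-6, 1001)
C = max(mutual_info(np.array([p0, 1 - p0]), P) for p0 in grid)
print(C, 1 - q)  # both ~0.8 bits/symbol
```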

Page 26:

ANOTHER METHOD TO OBTAIN CHANNEL CAPACITY:

Given J input symbols and K output symbols, the channel capacity of a symmetric discrete memoryless channel is given by:

$C = \log_2 K + \sum_{k=0}^{K-1} P(y_k/x_j) \log_2 P(y_k/x_j)$

where the sum runs over any one row of the channel matrix (in a symmetric channel all rows are permutations of each other, so every row gives the same value).
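A sketch of this shortcut for the binary symmetric channel (numpy assumed, Pe = 0.1 illustrative). Note that it applies to symmetric channels such as the BSC, but not to the erasure channel, whose columns are not permutations of each other.

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

Pe = 0.1
P = np.array([[1 - Pe, Pe], [Pe, 1 - Pe]])  # binary symmetric channel
K = P.shape[1]
C = np.log2(K) - H(P[0])  # log2(K) minus the entropy of any one row
print(C)                  # ~0.531, agreeing with the earlier BSC result
```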