
§2 Discrete memoryless channels and their capacity function

§2.1 Channel capacity

§2.2 The channel coding theorem


§2.1 Channel capacity

1. Definition

Review:

R = I(X;Y) = H(X) − H(X|Y)   (bit/sig)

R_t = I(X;Y)/t   (bit/second)

Definition: For discrete random variables X and Y, the channel capacity is defined as

C = max_{P(x)} I(X;Y)   (bit/sig)

C_t = (1/t) max_{P(x)} I(X;Y)   (bit/second)

[Figure: BSC — inputs {0, 1}, outputs {0, 1}; 0→0 and 1→1 with probability 1-p, crossovers 0→1 and 1→0 with probability p]

§2.1 Channel capacity

1. Definition

Example 2.1.1: Compute the capacity of the BSC.

Solution: C = 1 − H(p)
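As a quick numerical check of Example 2.1.1, the sketch below (Python, added by the editor, not part of the original slides) brute-forces I(X;Y) over the input distribution of a BSC and compares the maximum with 1 − H(p). The names `h2` and `bsc_I` are illustrative.

```python
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_I(q, p):
    """I(X;Y) for a BSC with crossover p and input P(X=1)=q:
    I = H(Y) - H(Y|X), where P(Y=1) = q(1-p) + (1-q)p and H(Y|X) = H(p)."""
    return h2(q * (1 - p) + (1 - q) * p) - h2(p)

p = 0.1  # illustrative crossover probability
C = max(bsc_I(i / 1000, p) for i in range(1001))
print(C, 1 - h2(p))  # the maximum, at q = 1/2, equals 1 - H(p)
```

The maximum sits at the uniform input q = 1/2, which is why the closed form C = 1 − H(p) drops out.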

2. Simple discrete channel

One-to-one channel: a1 → b1, a2 → b2, a3 → b3

P =
[1 0 0]
[0 1 0]
[0 0 1]

C = max_{P(x)} I(X;Y) = max_{P(x)} H(X) = log r   (bit/sig)

§2.1 Channel capacity

2. Simple discrete channel

Lossless channel: a1 → {b1, b2}, a2 → {b3, b4, b5}, a3 → {b6}

P =
[1/2  1/2  0     0     0    0]
[0    0    3/10  3/10  2/5  0]
[0    0    0     0     0    1]

C = max_{P(x)} I(X;Y) = max_{P(x)} H(X) = log r   (bit/sig)

§2.1 Channel capacity

2. Simple discrete channel

Noiseless channel: a1, a2 → b1; a3, a4, a5 → b2; a6 → b3

P =
[1 0 0]
[1 0 0]
[0 1 0]
[0 1 0]
[0 1 0]
[0 0 1]

C = max_{P(x)} I(X;Y) = max_{P(x)} H(Y) = log s   (bit/sig)
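The three simple channels above can be checked with a generic mutual-information routine (Python sketch added by the editor; the lossless matrix uses the fractions as reconstructed on the slide, but the conclusion I = log r depends only on which entries are nonzero, not on their exact values).

```python
import numpy as np

def mutual_info(px, P):
    """I(X;Y) in bits for input distribution px and channel matrix P (row i = P(Y|X=a_i))."""
    px = np.asarray(px, float)
    P = np.asarray(P, float)
    py = px @ P  # output distribution
    I = 0.0
    for i in range(len(px)):
        for j in range(P.shape[1]):
            if px[i] > 0 and P[i, j] > 0:
                I += px[i] * P[i, j] * np.log2(P[i, j] / py[j])
    return I

one_to_one = np.eye(3)                       # C = log r, uniform input
lossless = np.array([[1/2, 1/2, 0, 0, 0, 0],  # H(X|Y) = 0, so I = H(X)
                     [0, 0, 3/10, 3/10, 2/5, 0],
                     [0, 0, 0, 0, 0, 1]])
noiseless = np.array([[1, 0, 0], [1, 0, 0], [0, 1, 0],
                      [0, 1, 0], [0, 1, 0], [0, 0, 1]], float)

u3 = np.ones(3) / 3
print(mutual_info(u3, one_to_one))   # log2(3)
print(mutual_info(u3, lossless))     # log2(3)
px = [1/6, 1/6, 1/9, 1/9, 1/9, 1/3]  # input chosen to make Y uniform
print(mutual_info(px, noiseless))    # log2(3) = log s
```

For the noiseless channel the uniform input is not optimal; the input above spreads mass so that each output b_j receives probability 1/3, attaining H(Y) = log s.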

§2.1 Channel capacity

3. Symmetric channel

P =
[1/3  1/3  1/6  1/6]
[1/6  1/6  1/3  1/3]

§2.1 Channel capacity

Definition: the channel is symmetric if its transition matrix P satisfies:

1) every row of P is a permutation of the first row;

2) every column of P is a permutation of the first column.

Each row is a permutation of {p1’, p2’, …, ps’}; each column is a permutation of {q1’, q2’, …, qr’}.

3. Symmetric channel

§2.1 Channel capacity

Properties:

1) H(Y|X) = H(Y|X = a_i) = H(p1’, p2’, …, ps’), i = 1, 2, …, r;

2) If the input random variable X has equal probabilities, then the output random variable Y has equal probabilities.

C = log s − H(p1’, p2’, …, ps’)
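For the 2×4 symmetric matrix shown above, the formula C = log s − H(row) can be verified against a brute-force maximization of I(X;Y) over the binary input distribution (Python sketch added by the editor; `H` and `I` are illustrative names).

```python
from math import log2

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(q * log2(q) for q in dist if q > 0)

P = [[1/3, 1/3, 1/6, 1/6],   # the symmetric-channel example above
     [1/6, 1/6, 1/3, 1/3]]

def I(q):
    """I(X;Y) for input distribution (q, 1-q)."""
    px = [q, 1 - q]
    py = [px[0] * P[0][j] + px[1] * P[1][j] for j in range(4)]
    return H(py) - px[0] * H(P[0]) - px[1] * H(P[1])

C_formula = log2(4) - H(P[0])                  # log s - H(row)
C_brute = max(I(i / 1000) for i in range(1001))
print(C_formula, C_brute)  # both ≈ 0.0817 bit/sig
```

The brute-force maximum occurs at the uniform input q = 1/2, consistent with property 2).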

§2.1 Channel capacity

3. Symmetric channel

Strongly symmetric channel

Example 2.1.2

P =
[1-p      p/(r-1)  ...  p/(r-1)]
[p/(r-1)  1-p      ...  p/(r-1)]
[...      ...      ...  ...    ]
[p/(r-1)  p/(r-1)  ...  1-p    ]

C = log r − H(p) − p log(r−1)
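A numerical sanity check of the strongly symmetric formula (Python sketch added by the editor): for r = 2 it reduces to the BSC result C = 1 − H(p), and for general r it matches I(X;Y) at the uniform input, which achieves capacity for symmetric channels.

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def C_strong(r, p):
    """C = log r - H(p) - p*log(r-1) for the r-ary strongly symmetric channel."""
    return log2(r) - h2(p) - p * log2(r - 1)

def I_uniform(P):
    """I(X;Y) at the uniform input for transition matrix P."""
    r, s = len(P), len(P[0])
    py = [sum(P[i][j] for i in range(r)) / r for j in range(s)]
    hy = -sum(q * log2(q) for q in py if q > 0)
    hyx = sum(-sum(q * log2(q) for q in row if q > 0) for row in P) / r
    return hy - hyx

r, p = 4, 0.1  # illustrative values
P = [[1 - p if j == i else p / (r - 1) for j in range(r)] for i in range(r)]
print(C_strong(2, p), 1 - h2(p))     # r = 2 reduces to the BSC
print(C_strong(r, p), I_uniform(P))  # formula equals I at the uniform input
```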

Example 2.1.3

P =
[1/6  1/3  1/3  1/6]
[1/6  1/3  1/6  1/3]

3. Symmetric channel

Weakly symmetric channel

§2.1 Channel capacity

3. Symmetric channel

characteristic of weakly symmetric channel

The columns of its transition matrix P can be partitioned into subsets C_i such that, for each i, in the matrix P_i formed by the columns in C_i, each row is a permutation of every other row, and the same is true of columns.

§2.1 Channel capacity

(P68: T2.2 in textbook)

3. Symmetric channel

Weakly symmetric channel

§2.1 Channel capacity

[Figure: BEC — inputs {0, 1}, outputs {0, e, 1}; 0→0 and 1→1 with probability 1-p, 0→e and 1→e with probability p]

Example 2.1.4: Compute the capacity of the BEC.

P =
[1-p  p  0  ]
[0    p  1-p]

C = 1 − p
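Example 2.1.4 can be confirmed numerically (Python sketch added by the editor): since H(Y|X) = H(p) for either input, I(X;Y) = H(Y) − H(p), and brute-forcing over the input distribution recovers C = 1 − p.

```python
from math import log2

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(q * log2(q) for q in dist if q > 0)

p = 0.2  # illustrative erasure probability

def I(q):
    """I(X;Y) for a BEC with input P(X=0)=q."""
    py = [q * (1 - p), p, (1 - q) * (1 - p)]  # P(Y=0), P(Y=e), P(Y=1)
    return H(py) - H([1 - p, p])              # H(Y|X) = H(p) for either input

C = max(I(i / 1000) for i in range(1001))
print(C, 1 - p)  # capacity 1 - p, achieved at q = 1/2
```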

3. Symmetric channel


§2.1 Channel capacity

C = H(Y)|_{P(x)=1/r} − H(p1’, p2’, …, ps’)
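The weakly symmetric capacity formula can be tested on the matrix of Example 2.1.3 (Python sketch added by the editor): evaluate H(Y) at the uniform input, subtract the row entropy, and compare with a brute-force maximization over the binary input.

```python
from math import log2

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(q * log2(q) for q in dist if q > 0)

P = [[1/6, 1/3, 1/3, 1/6],   # Example 2.1.3 (weakly symmetric)
     [1/6, 1/3, 1/6, 1/3]]

def I(q):
    """I(X;Y) for input distribution (q, 1-q)."""
    px = [q, 1 - q]
    py = [px[0] * P[0][j] + px[1] * P[1][j] for j in range(4)]
    return H(py) - px[0] * H(P[0]) - px[1] * H(P[1])

py_uniform = [(P[0][j] + P[1][j]) / 2 for j in range(4)]
C_formula = H(py_uniform) - H(P[0])  # C = H(Y)|uniform - H(row)
C_brute = max(I(i / 1000) for i in range(1001))
print(C_formula, C_brute)
```

Here the rows are permutations of each other, so H(Y|X) = H(row) for every input, and the uniform input is optimal even though Y is not uniform.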

§2.1 Channel capacity

3. Symmetric channel

Problem: compute C for

P(1) =
[1-p  p/2  p/2]
[p/2  p/2  1-p]

P(2) =
[1-p  p/2  p/2  0  ]
[0    p/2  p/2  1-p]

C = ?

4. Discrete memoryless extended channel

P(x) = ∏_{i=1}^{N} P(x_i)

C^N = max_{P(x)} I(X;Y) = Σ_{i=1}^{N} max_{P(x_i)} I(X_i;Y_i) = Σ C_i = N·C
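The identity C^N = N·C can be checked numerically with the Blahut–Arimoto algorithm (a standard capacity-computation method, not introduced in the slides; this Python sketch was added by the editor). The N = 2 extension of a BSC is the Kronecker product of the transition matrix with itself.

```python
import numpy as np

def _kl_rows(P, py):
    """D(P(.|x) || P(y)) in bits, one value per input row."""
    out = np.zeros(P.shape[0])
    for i, row in enumerate(P):
        m = row > 0
        out[i] = np.sum(row[m] * np.log2(row[m] / py[m]))
    return out

def blahut_arimoto(P, iters=500):
    """Capacity (bits) of a DMC with transition matrix P (rows = P(y|x))."""
    P = np.asarray(P, float)
    px = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        d = _kl_rows(P, px @ P)
        w = px * np.exp2(d)       # multiplicative update toward the optimum
        px = w / w.sum()
    return float(px @ _kl_rows(P, px @ P))

p = 0.1
bsc = np.array([[1 - p, p], [p, 1 - p]])
bsc2 = np.kron(bsc, bsc)  # 2-fold extension: P(y1 y2 | x1 x2) = P(y1|x1) P(y2|x2)
C1, C2 = blahut_arimoto(bsc), blahut_arimoto(bsc2)
print(C1, C2)  # C2 ≈ 2 * C1
```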

§2.1 Channel capacity

Review

• Keywords:

Channel capacity

Symmetric channel (strongly symmetric channel, weakly symmetric channel)

Simple DMC

Homework

1. P71: T2.19(a);

2. Calculate the channel capacity, and find the maximizing probability distribution.

P1 =
[1-p  p    0  ]
[0    1-p  p  ]
[p    0    1-p]

P2 =
[1/2  1/3  1/6]
[1/6  1/2  1/3]
[1/3  1/6  1/2]

P3 =
[0.9  0.1  0  ]
[0    0.1  0.9]

Homework

3. Channel capacity. Consider the discrete memoryless channel Y = X + Z (mod 11), where Z takes the values 1, 2, 3, each with probability 1/3, and X ∈ {0, 1, …, 10}.

Assume that Z is independent of X.

(a) Find the capacity.

(b) What is the maximizing p*(x)?

Homework

4. Find the capacity of the noisy typewriter channel.

(the channel input is either received unchanged at the output with probability ½, or is transformed into the next letter with probability ½ )

§2 Discrete memoryless channels and their capacity function

§2.1 The capacity function

§2.2 The channel coding theorem

§2.2 The channel coding theorem

1. The concept of channel coding

[Figure: general digital communication system — Source → Encoder → Channel → Decoder → Sink; the message M is encoded as codeword C, received as R, and decoded as M’]

Example 2.2.1 Repeating code

Codewords: 0 → 000, 1 → 111

Extended channel output words: 000, 001, 010, 100, 011, 110, 101, 111

§2.2 The channel coding theorem

1. The concept of channel coding


Example 2.2.2 Block code (5,2)

a_i = (a_i1, a_i2, a_i3, a_i4, a_i5), with

a_i3 = a_i1 ⊕ a_i2

a_i4 = a_i1

a_i5 = a_i1 ⊕ a_i2


2. The channel coding theorem

Theorem 2.1 (the channel coding theorem for DMCs).

For any R < C and ε > 0, for all sufficiently large n there exists a code [C] = {x1, …, xM} of length n and a decoding rule such that:

1) M ≥ 2^{nR},

2) P_E < ε.

§2.2 The channel coding theorem

(p62 corollary in textbook)

2. The channel coding theorem

Statement 2 (The channel coding theorem):

All rates below capacity C are achievable. Specifically, for every rate R < C, there exists a sequence of (2^{nR}, n) codes with maximum probability of error P_E → 0.

Conversely, any sequence of (2^{nR}, n) codes with P_E → 0 must have R ≤ C.

§2.2 The channel coding theorem

Thinking:

The repetition code (2n+1, 1), using an MLD decoder. Show that its average error probability is

P_E = Σ_{k=n+1}^{2n+1} C(2n+1, k) p^k (1−p)^{2n+1−k}

where p is the error probability of the BSC. Compute P_E when p = 0.01, n = 1, 2, 3, 4.
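The computation part of this exercise can be sketched directly from the formula (Python, added by the editor): under majority (MLD) decoding, a decoding error occurs exactly when more than n of the 2n+1 transmitted bits are flipped.

```python
from math import comb

def repetition_PE(n, p):
    """Average error probability of the (2n+1, 1) repetition code over a BSC
    with MLD (majority) decoding: an error occurs iff more than n of the
    2n+1 bits are flipped."""
    N = 2 * n + 1
    return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(n + 1, N + 1))

p = 0.01
for n in (1, 2, 3, 4):
    print(n, repetition_PE(n, p))
```

P_E falls rapidly with n while the rate 1/(2n+1) shrinks, which is the trade-off the channel coding theorem improves on.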

Homework (recommended)