
Page 1: Sum Rate of Gaussian Multiterminal Source Coding

Sum Rate of Gaussian Multiterminal Source Coding

Pramod Viswanath

University of Illinois, Urbana-Champaign

March 19, 2003

Page 2: Sum Rate of Gaussian Multiterminal Source Coding

Gaussian Multiterminal Source Coding

[Figure: K distributed encoders. Encoder k observes the linear measurement y_k = h_k^t n of the Gaussian source vector n = (n_1, ..., n_N) and sends message m_k to a common decoder, which outputs the reconstruction \hat{n}.]

Quantities of interest: the sum of the rates of the encoders, R, and the distortion metric d(n, \hat{n})

Page 3: Sum Rate of Gaussian Multiterminal Source Coding

Quadratic Gaussian CEO Problem

[Figure: K encoders. Encoder k observes the noisy measurement y_k = n + z_k of the scalar source n and sends message m_k to a common decoder (the CEO), which outputs the reconstruction \hat{n}.]

Quantities of interest: the sum of the rates of the encoders, R, and the distortion metric d(n, \hat{n})

Page 4: Sum Rate of Gaussian Multiterminal Source Coding

Result: Quadratic Gaussian CEO

• Quadratic distortion metric d(n, \hat{n}) = (n - \hat{n})^2

• For a large number of encoders K,

  R(D) = \frac{1}{2}\log^+\!\left(\frac{\sigma_n^2}{D}\right) + \frac{\sigma_z^2}{2\sigma_n^2}\left(\frac{\sigma_n^2}{D} - 1\right)^+

  – The second term is the loss w.r.t. cooperating encoders
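As a sanity check of this formula, here is a minimal numeric sketch (the values σ_n² = 1, σ_z² ∈ {0, 0.5, 1} and D = 0.25 are illustrative assumptions, not from the talk); the loss term vanishes as the observation noise goes to zero, recovering the point-to-point rate-distortion function:

```python
import numpy as np

def ceo_sum_rate(var_n, var_z, D):
    """Large-K sum rate of the quadratic Gaussian CEO problem (in nats):
    R(D) = 1/2 log+(var_n / D) + (var_z / (2 var_n)) (var_n / D - 1)+."""
    ratio = var_n / D
    log_plus = max(0.0, 0.5 * np.log(ratio))                 # 1/2 log+(sigma_n^2 / D)
    loss = (var_z / (2.0 * var_n)) * max(0.0, ratio - 1.0)   # loss vs. cooperating encoders
    return log_plus + loss

# Illustrative (assumed) values: unit-variance source, target distortion D = 0.25.
for var_z in (0.0, 0.5, 1.0):
    print(f"var_z = {var_z}: R(D) = {ceo_sum_rate(1.0, var_z, 0.25):.4f} nats")
# var_z = 0 gives 1/2 log(4) ~ 0.6931, the cooperative rate-distortion function.
```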

Page 5: Sum Rate of Gaussian Multiterminal Source Coding

Outline

• Problem formulation: the tradeoff between the sum rate R and the distortion (metric d(n, \hat{n})).

• Main result: a characterization of a class of distortion metrics for which there is no loss in sum rate compared with encoder cooperation.

  – A multiple-antenna test channel

Page 6: Sum Rate of Gaussian Multiterminal Source Coding

Random Binning of Slepian-Wolf

[Figure: realizations in the (y_1, y_2) plane, randomly assigned to bins marked ×, O, and Δ; each symbol class is one quantizer.]

• The rate is (the logarithm of) the number of quantizers

Page 7: Sum Rate of Gaussian Multiterminal Source Coding

Encoding in Slepian-Wolf

[Figure: the same binned (y_1, y_2) plane; each encoder locates its own realization among the quantizers.]

• Each encoder sends the quantizer closest to its realization

Page 8: Sum Rate of Gaussian Multiterminal Source Coding

Decoding in Slepian-Wolf

• The decoder knows the joint distribution of y_1, y_2

• It is given the two quantizer numbers from the encoders

• It picks the pair of points in the quantizers that best matches the joint distribution

  – For jointly Gaussian y_1, y_2, a nearest-neighbor type test

• R_1 + R_2 = H(y_1, y_2) is sufficient for zero distortion
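The binning picture can be made concrete in a toy binary analogue; the following sketch (the binary source, block length, rates, and random linear hash are all illustrative assumptions, not the Gaussian construction of the talk) has encoder 1 send y_1 uncompressed while encoder 2 sends only a bin index at a rate just above H(y_2 | y_1), and the decoder runs the nearest-neighbor type test inside the bin:

```python
import numpy as np

# Toy binary Slepian-Wolf via random binning (illustrative assumption):
# y2 = y1 XOR e with crossover probability p, so H(y2 | y1) = h(p) bits.
rng = np.random.default_rng(0)
n, p, R2 = 14, 0.1, 0.65          # R2 must exceed h(0.1) ~ 0.47 bits/symbol

y1 = rng.integers(0, 2, n)
y2 = y1 ^ (rng.random(n) < p).astype(int)

k = int(np.ceil(n * R2))          # bin index of k bits => 2^k bins
A = rng.integers(0, 2, (k, n))    # random linear hash defining the bins
bin_of = lambda seq: tuple(A @ seq % 2)

sent = bin_of(y2)                 # all that encoder 2 transmits

# Decoder: among all sequences in the received bin, pick the one closest
# to y1 in Hamming distance (the analogue of the nearest-neighbor test).
best, best_d = None, n + 1
for idx in range(2 ** n):
    cand = (idx >> np.arange(n)) & 1
    if bin_of(cand) == sent:
        d = int(np.sum(cand ^ y1))
        if d < best_d:
            best, best_d = cand, d

print("decoded y2 correctly:", bool(np.array_equal(best, y2)))
```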

Page 9: Sum Rate of Gaussian Multiterminal Source Coding

Deterministic Broadcast Channel

  y_1 = f(x),   y_2 = g(x)

• Pick the distribution on x such that y_1, y_2 have the desired joint distribution (Cover 98)
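A concrete toy instance of this step, using the classical Blackwell channel as an assumed illustration (the talk does not specify a channel): choosing p(x) shapes the joint distribution of (y_1, y_2), and the sum rate H(y_1, y_2) is then achievable by the Slepian-Wolf construction above.

```python
import numpy as np

# Blackwell channel (assumed illustration): x in {0, 1, 2}, with
#   y1 = f(x) = 1 iff x == 2,   y2 = g(x) = 1 iff x >= 1,
# so x = 0, 1, 2 map to (y1, y2) = (0,0), (0,1), (1,1) respectively.
f = lambda x: int(x == 2)
g = lambda x: int(x >= 1)

def joint_entropy(px):
    """H(y1, y2) in bits induced by the input distribution px on {0, 1, 2}."""
    pyy = {}
    for x, prob in enumerate(px):
        key = (f(x), g(x))
        pyy[key] = pyy.get(key, 0.0) + prob
    return -sum(q * np.log2(q) for q in pyy.values() if q > 0)

print(joint_entropy([1/3, 1/3, 1/3]))   # uniform input: log2(3) ~ 1.585 bits/use
```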

Page 10: Sum Rate of Gaussian Multiterminal Source Coding

Slepian-Wolf code for Broadcast Channel

• Encoding: implement the Slepian-Wolf decoder

  – given the two messages, find the appropriate pair y_1, y_2 in the two quantizers

  – transmit the x that generates this pair y_1, y_2

Page 11: Sum Rate of Gaussian Multiterminal Source Coding

Slepian-Wolf code for Broadcast Channel

• Encoding: implement the Slepian-Wolf decoder

  – given the two messages, find the appropriate pair y_1, y_2 in the two quantizers

  – transmit the x that generates this pair y_1, y_2

• Decoding: implement the Slepian-Wolf encoder

  – quantize y_1, y_2 to the nearest points

  – the messages are the quantizer numbers

Page 12: Sum Rate of Gaussian Multiterminal Source Coding

Lossy Slepian-Wolf Source Coding

[Figure: the binned plane as before, now holding the auxiliary variables (u_1, u_2).]

• Approximate y1, y2 by u1, u2

Page 13: Sum Rate of Gaussian Multiterminal Source Coding

Lossy Slepian-Wolf Source Coding

• Encoding: find the u_i that matches the source y_i, separately for each i

  – for jointly Gaussian r.v.'s, a nearest-neighbor calculation

  – each encoder sends the number of the quantizer containing the u it picked

Page 14: Sum Rate of Gaussian Multiterminal Source Coding

Lossy Slepian-Wolf Source Coding

• Encoding: find the u_i that matches the source y_i, separately for each i

  – for jointly Gaussian r.v.'s, a nearest-neighbor calculation

  – each encoder sends the number of the quantizer containing the u it picked

• Decoding: reconstruct the u's picked by the encoders

  – the reconstruction is based on the joint distribution of the u's

  – previously, u_i = y_i were correlated

  – here, the u's are picked independently

Page 15: Sum Rate of Gaussian Multiterminal Source Coding

Lossy Slepian-Wolf

• We require

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

Page 16: Sum Rate of Gaussian Multiterminal Source Coding

Lossy Slepian-Wolf

• We require

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

• Generate the reconstruction \hat{n} deterministically from the decoded u's.

• Need sum rate

  R_{\mathrm{sum}} = I(u_1, \ldots, u_K;\, y_1, \ldots, y_K)

• Distortion equal to

  E[d(n, \hat{n})]
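For orientation, a minimal sanity check of these two quantities in the simplest case K = 1 with direct observation y = n (an assumed illustration, not from the talk): the Gaussian test channel u = n + q with var(q) = σ_n² D / (σ_n² − D) and the deterministic reconstruction \hat{n} = E[n | u] gives I(u; y) = ½ log(σ_n²/D) at distortion exactly D.

```python
import numpy as np

# K = 1, direct observation y = n ~ N(0, var_n): Gaussian test channel
# u = n + q (assumed illustration), reconstruction nhat = E[n | u].
rng = np.random.default_rng(0)
var_n, D, m = 1.0, 0.25, 200_000

var_q = var_n * D / (var_n - D)                 # tuned so distortion = D
nsrc = rng.normal(0.0, np.sqrt(var_n), m)
u = nsrc + rng.normal(0.0, np.sqrt(var_q), m)
nhat = (var_n / (var_n + var_q)) * u            # MMSE estimate E[n | u]

print("empirical distortion:", np.mean((nsrc - nhat) ** 2))      # ~ 0.25 = D
print("required rate I(u; y):", 0.5 * np.log(var_n / D), "nats")  # 1/2 log 4
```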

Page 17: Sum Rate of Gaussian Multiterminal Source Coding

Marton Coding for Broadcast Channel

[Figure: Marton coding. An encoder maps the pair (u_1, u_2) to the channel input x; the channel applies H^t and adds noise w ~ N(0, I); Dec1 recovers u_1 and Dec2 recovers u_2.]

• Reversed encoding and decoding operations

• Sum rate I(u_1, \ldots, u_K;\, y_1, \ldots, y_K)

• No use for the Markov property

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]
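For reference, Marton's two-user achievable sum rate is a standard fact, stated here in the talk's notation for orientation (the talk itself only uses the mutual-information cost above):

```latex
R_1 + R_2 \;\le\; I(u_1; y_1) + I(u_2; y_2) - I(u_1; u_2)
```

The penalty I(u_1; u_2) is the price of choosing the pair (u_1, u_2) jointly at the encoder even though the receivers decode them separately.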

Page 18: Sum Rate of Gaussian Multiterminal Source Coding

Achievable Rates: Costa Precoding

[Figure: Costa precoding. Message m_1 is mapped to u_1 and m_2 to u_2; the two are combined into the input x; the channel applies H^t and adds w; the outputs are decoded to m_1 and m_2.]

• Users’ data modulated onto spatial signatures u1,u2

Page 19: Sum Rate of Gaussian Multiterminal Source Coding

Stage 1: Costa Precoding

[Figure: stage 1. Dec1 recovers m_1 from its channel output.]

• Encoding for user 1 treats the signal of user 2 as interference known at the transmitter

Page 20: Sum Rate of Gaussian Multiterminal Source Coding

Stage 2

[Figure: stage 2. Dec2 recovers m_2 from its channel output.]

• Encode user 2 treating signal for user 1 as noise
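A minimal scalar sanity check of the two-stage argument (single antenna, unit-variance noise; the powers P1, P2 are assumed values, not from the talk): Costa coding removes user 2's signal from user 1's channel, user 2 treats user 1 as noise, and the two rates telescope to the cooperative sum rate.

```python
import numpy as np

# Two-stage Costa precoding, scalar case with assumed powers and unit noise.
# Stage 1: user 1 is dirty-paper coded against user 2's known signal,
#          so it sees a clean channel:        R1 = 1/2 log(1 + P1)
# Stage 2: user 2 treats user 1 as noise:     R2 = 1/2 log(1 + P2 / (1 + P1))
P1, P2 = 1.5, 2.5

R1 = 0.5 * np.log(1.0 + P1)
R2 = 0.5 * np.log(1.0 + P2 / (1.0 + P1))
coop = 0.5 * np.log(1.0 + P1 + P2)     # sum rate with full cooperation

print(R1 + R2, coop)   # the two stages telescope: R1 + R2 == coop
```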

Page 21: Sum Rate of Gaussian Multiterminal Source Coding

Adaptation to Lossy Slepian-Wolf

[Figure: the Marton test channel with colored noise: the encoder maps (u_1, u_2) to x; the channel applies H^t and adds z ~ N(0, K_z); the decoder observes y.]

• Joint distribution of u’s and y’s depends on noise z

• Performance independent of correlation in z

Page 22: Sum Rate of Gaussian Multiterminal Source Coding

Noise Coloring

• Fix a particular Costa coding scheme: this fixes the u's and x.

• Idea: choose z such that

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

  and (K_z)_{ii} = 1

• Then the scheme can be adapted to lossy multiterminal source coding

Page 23: Sum Rate of Gaussian Multiterminal Source Coding

Markov Condition and Broadcast Channel

• The Markov condition

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

  is of independent interest in the broadcast channel

Page 24: Sum Rate of Gaussian Multiterminal Source Coding

Markov Condition and Broadcast Channel

• The Markov condition

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

  is of independent interest in the broadcast channel

• By the chain rule, the left side always factors as

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_1, \ldots, y_K, u_1, \ldots, u_{i-1}]

Page 25: Sum Rate of Gaussian Multiterminal Source Coding

Markov Condition and Broadcast Channel

• The Markov condition

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_i]

  is of independent interest in the broadcast channel

• By the chain rule, the left side always factors as

  p[u_1, \ldots, u_K \mid y_1, \ldots, y_K] = \prod_{i=1}^{K} p[u_i \mid y_1, \ldots, y_K, u_1, \ldots, u_{i-1}]

• Equivalent to: given u_1, \ldots, u_{i-1},

  u_i \longrightarrow y_i \longrightarrow (y_1, \ldots, y_{i-1}, y_{i+1}, \ldots, y_K)

  is a Markov chain
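Spelling out the easy direction of this equivalence (a standard manipulation, included for orientation): comparing the two factorizations term by term, the product form holds when each chain-rule factor collapses,

```latex
p\left[u_i \mid y_1, \ldots, y_K,\, u_1, \ldots, u_{i-1}\right] \;=\; p\left[u_i \mid y_i\right],
\qquad i = 1, \ldots, K,
```

i.e. given u_1, ..., u_{i-1}, the auxiliary u_i depends on the full observation vector only through y_i, which is exactly the stated Markov chain.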

Page 26: Sum Rate of Gaussian Multiterminal Source Coding

Implication

[Figure: the test channel again: (u_1, u_2) → x → H^t, with noise z ~ N(0, K_z), outputs y.]

• Need only y1 to decode u1

• Given u1, need only y2 to decode u2

⇒ The performance of the Costa scheme equals that when the receivers cooperate

Page 27: Sum Rate of Gaussian Multiterminal Source Coding

Markov Condition and Noise Covariance

• The sum capacity is also achieved by such a scheme (CS 01, YC 01, VT 02, VJG 02)

• For every Costa scheme, there is a choice of K_z such that the Markov condition holds (Yu and Cioffi, 01)

Page 28: Sum Rate of Gaussian Multiterminal Source Coding

Sum Capacity

[Figure: a 2 × 2 diagram of sum-capacity relations. Top row: the broadcast channel (Enc → H^t → two Decs, noise w) linked by the Sato bound to cooperating receivers with colored noise z ~ N(0, K_z). Bottom row: the multiple access channel (H, with x_1, x_2 independent) and cooperating transmitters under the constraint E[x^t K_z x] ≤ P. The rows are linked by reciprocity.]

Page 29: Sum Rate of Gaussian Multiterminal Source Coding

Sum Capacity

[Figure: the same 2 × 2 diagram, now with convex duality connecting the Sato bound and the cooperating-transmitter constraint E[x^t K_z x] ≤ P, alongside the two reciprocity links.]
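For orientation, the minimax expression these relations establish can be written out (standard form from the cited works, transcribed here as an assumption into the notation of this talk, with the per-receiver noise normalization (K_z)_{ii} = 1):

```latex
C_{\mathrm{sum}}
\;=\;
\min_{K_z \succeq 0,\; (K_z)_{ii} = 1}
\;\;
\max_{K_x \succeq 0,\; \operatorname{tr}(K_x) \le P}
\;\;
\frac{1}{2} \log \frac{\det\!\left(H^t K_x H + K_z\right)}{\det K_z}
```

The inner maximization is the cooperating-receivers (Sato) bound for a fixed noise color; the outer minimization over K_z is what the Markov condition shows to be tight.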

Page 30: Sum Rate of Gaussian Multiterminal Source Coding

Gaussian Multiterminal Source Coding

[Figure: the multiterminal setup of the opening slide, with the measurement vectors collected as H = [h_1, ..., h_K]: encoder k observes y_k = h_k^t n of the source n = (n_1, ..., n_N) and sends m_k; the decoder outputs \hat{n}.]

Quantities of interest: the sum of the rates of the encoders, R, and the distortion metric d(n, \hat{n})

Page 31: Sum Rate of Gaussian Multiterminal Source Coding

Main Result

• Distortion metric

  d(n, \hat{n}) = \frac{1}{N}(n - \hat{n})^t \left(I + H\,\mathrm{diag}\{p_1, \ldots, p_K\}\,H^t\right)(n - \hat{n})

  – Here p_1, \ldots, p_K are the powers of the users in the reciprocal MAC

• Rate distortion function

  R(D) = \text{(sum rate of the MAC)} - N \log D
       = \text{(sum rate of the broadcast channel)} - N \log D
       = \log\det\!\left(I + H\,\mathrm{diag}\{p_1, \ldots, p_K\}\,H^t\right) - N \log D
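A minimal numeric sketch of this expression (the random H and uniform powers are chosen purely for illustration, and natural logarithms are assumed):

```python
import numpy as np

# Evaluate R(D) = log det(I + H diag{p} H^t) - N log D for an illustrative
# random H and assumed per-user powers (values not from the talk).
rng = np.random.default_rng(0)
N, K = 4, 6
H = rng.normal(size=(N, K))       # columns h_1, ..., h_K are the signatures
p = np.full(K, 0.5)               # powers of the users in the reciprocal MAC

def rate_distortion(H, p, D):
    N = H.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(N) + H @ np.diag(p) @ H.T)
    return logdet - N * np.log(D)  # MAC sum rate minus N log D

for D in (0.5, 0.25, 0.1):
    print(f"D = {D}: R(D) = {rate_distortion(H, p, D):.3f}")
```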

Page 32: Sum Rate of Gaussian Multiterminal Source Coding

Bells and Whistles

• For the quadratic distortion metric

  d(n, \hat{n}) = \frac{1}{N}(n - \hat{n})^t (n - \hat{n}),

  the set of H for which the result holds can be characterized

• Analogy with the CEO problem: for a large number of encoders and random H, a characterization of R(D) that holds almost surely

Page 33: Sum Rate of Gaussian Multiterminal Source Coding

Discussion

• A “connection” made between coding schemes for multiterminal source and channel coding

Page 34: Sum Rate of Gaussian Multiterminal Source Coding

Discussion

• A “connection” made between coding schemes for multiterminal source and channel coding

• The connection is somewhat superficial

  – a relation between source coding and the broadcast channel through a common random coding argument (PR 02, CC 02)

  – a relation between source coding and the multiple access channel through a change of variable (VT 02, JVG 01)

Page 35: Sum Rate of Gaussian Multiterminal Source Coding

Discussion

• A “connection” made between coding schemes for multiterminal source and channel coding

• The connection is somewhat superficial

  – a relation between source coding and the broadcast channel through a common random coding argument

  – a relation between source coding and the multiple access channel through a change of variable

• The connection is suggestive

  – a codebook-level duality