[IEEE 2012 Information Theory and Applications Workshop (ITA), San Diego, CA, USA, 5-10 February 2012]

MMSE Interference in Gaussian Channels1

Shlomo Shamai

Department of Electrical Engineering, Technion - Israel Institute of Technology

2012 Information Theory and Applications Workshop, 5-10 February, San Diego

1The talk is based on recent studies done jointly with Ronit Bustin

Shlomo Shamai ITA, February 2012 1/26


The Framework

The Scalar Additive Gaussian Channel

Our framework is the scalar additive Gaussian channel:

Y = √snr · X + N

where N ∼ N(0, 1), through which we transmit length-n codewords.

Constraint

We limit our investigation to power-constrained codes:

∀x ∈ Cn:  (1/n) ∑_{i=1}^{n} x_i^2 ≤ 1
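The channel model and power constraint above can be sketched numerically. A minimal sketch (the variable names and the choice of an i.i.d. Gaussian codeword are illustrative, not part of the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n, snr = 10_000, 2.0

# A power-constrained "codeword": i.i.d. entries rescaled so that
# (1/n) * sum_i x_i^2 = 1 holds exactly, i.e. x lies in Cn.
x = rng.standard_normal(n)
x = x / np.sqrt(np.mean(x ** 2))

# The scalar additive Gaussian channel, applied componentwise: Y = sqrt(snr)*X + N.
y = np.sqrt(snr) * x + rng.standard_normal(n)

print(round(float(np.mean(x ** 2)), 6))  # 1.0 — the power constraint is met
```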


Optimal Point-to-Point Codes

Theorem [Peleg, Sanderovich and Shamai, ETT 2007]

For every capacity-achieving code-sequence, Cn, over the Gaussian channel, the mutual information, when n → ∞, is as follows:

I(X; √γ·X + N) =
  (1/2) log(1 + γ),    γ ≤ snr
  (1/2) log(1 + snr),  o/w

and the MMSE is:

MMSEc(γ) =
  1/(1 + γ),  γ ≤ snr
  0,          o/w
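The two piecewise formulas of the theorem can be written directly in code. A minimal sketch (function names are mine, chosen to mirror the notation):

```python
import numpy as np

def I_opt(gamma, snr):
    # Mutual information of a capacity-achieving code: Gaussian-like up to snr,
    # then saturated at (1/2) log(1 + snr).
    return 0.5 * np.log(1.0 + min(gamma, snr))

def MMSE_opt(gamma, snr):
    # MMSE of a capacity-achieving code: 1/(1+gamma) up to snr, then 0.
    return 1.0 / (1.0 + gamma) if gamma <= snr else 0.0

snr = 1.5
print(I_opt(3.0, snr) == I_opt(snr, snr))  # True: saturation above snr
print(MMSE_opt(2.0, snr))                  # 0.0: perfect estimation above snr
```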


Optimal Point-to-Point Codes - Cont.

The mutual information (and MMSE) of optimal point-to-point codes follow the behavior of an i.i.d. Gaussian input up to snr.

[Figure: mutual information Iopt(γ) and MMSEopt(γ) versus γ]


What is the effect on an unintended receiver?

[Diagram: input X observed by the intended receiver Y and by an unintended receiver Z; what rate R is achievable?]

Assumption: the unintended receiver, Z, has smaller snr, that is, snrz < snr.

How should we measure the effect (disturbance)?

For optimal point-to-point codes both the mutual information and MMSE are completely known. But what about non-optimal codes (that do not attain capacity)?


How to measure the disturbance?

Bandemer and El Gamal, 2011: measure the disturbance at the unintended receiver using the mutual information at Z. That is, assuming this mutual information is at most Rd, what is the maximum possible rate to the intended receiver, Y?

In this work we measure the disturbance at the unintended receiver using the MMSE of the input, X, at Z. That is, assuming the MMSE is constrained to be at most β/(1 + β·snrz), what is the maximum possible rate to the intended receiver, Y?


Definitions

In(γ) = (1/n) I(X; Y(γ))

I(γ) = lim_{n→∞} (1/n) I(X; Y(γ))

MMSEcn(γ) = (1/n) Tr(EX(γ))

MMSEc(γ) = lim_{n→∞} MMSEcn(γ)

where EX(γ) is the MMSE matrix of estimating X from Y(γ) = √γ·X + N.


The I-MMSE approach

1. The I-MMSE relationship [Guo, Shamai and Verdu, IT 2005]

A fundamental relationship between the mutual information and the MMSE in the Gaussian channel:

In(snr) = (1/2) ∫_0^snr MMSEcn(γ) dγ

Taking the limit of n → ∞:

I(snr) = (1/2) ∫_0^snr MMSEc(γ) dγ
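The I-MMSE relationship can be checked numerically in the one case where both sides are in closed form: an i.i.d. standard Gaussian input, with MMSE(γ) = 1/(1+γ) and I(snr) = (1/2) log(1+snr). A minimal sketch:

```python
import numpy as np

snr = 3.0
g = np.linspace(0.0, snr, 100_001)
mmse = 1.0 / (1.0 + g)  # MMSE of a standard Gaussian input at snr = g

# (1/2) * integral_0^snr MMSE(gamma) d(gamma), via the trapezoidal rule
integral = 0.5 * float(np.sum((mmse[1:] + mmse[:-1]) / 2.0 * np.diff(g)))

print(abs(integral - 0.5 * np.log(1.0 + snr)) < 1e-8)  # True
```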


The I-MMSE approach

2. The “single crossing point” property

The property was originally derived for the scalar case in [Guo, Wu, Shamai and Verdu, IT 2011].

Several MIMO extensions are given in [Bustin, Payaro, Palomar and Shamai, arXiv].

We require the simplest extension.

Define the following function for an arbitrary random vector X:

qA(X, σ², γ) = σ²/(1 + σ²γ) · Tr(A) − Tr(A·EX(γ))

where A is some n × n general weighting matrix.


The I-MMSE approach - Cont.

Theorem [Bustin, Payaro, Palomar and Shamai, arXiv]

Let A ∈ S+n be a PSD matrix. Then the function qA(X, σ², γ) has no nonnegative-to-negative zero crossings and, at most, a single negative-to-nonnegative zero crossing in the range γ ∈ [0, ∞). Moreover, let snr0 ∈ [0, ∞) be that crossing point. Then:

1. qA(X, σ², 0) ≤ 0.

2. qA(X, σ², γ) is strictly increasing in γ ∈ [0, snr0).

3. qA(X, σ², γ) ≥ 0 for all γ ∈ [snr0, ∞).

4. lim_{γ→∞} qA(X, σ², γ) = 0.

In this work we set A = I, the identity matrix:

(1/n) qI(X, σ², γ) = σ²/(1 + σ²γ) − (1/n) Tr(EX(γ)) = mmseG(γ) − MMSEcn(γ)

where mmseG(γ) assumes an i.i.d. Gaussian input ∼ N(0, σ²).
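The single-crossing behavior can be observed numerically for a scalar non-Gaussian input. A sketch with an illustrative choice not taken from the talk: a BPSK input X = ±1 (whose MMSE is 1 − E[tanh(γ + √γ·Z)²]) against a Gaussian reference of variance σ² = 0.5:

```python
import numpy as np

def mmse_bpsk(g):
    # MMSE of equiprobable X = +/-1 from sqrt(g)*X + N, N ~ N(0,1):
    # mmse(g) = 1 - E[tanh(g + sqrt(g)*Z)^2], by trapezoidal integration over Z.
    z = np.linspace(-8.0, 8.0, 4001)
    phi = np.exp(-z ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    f = np.tanh(g + np.sqrt(g) * z) ** 2 * phi
    return 1.0 - float(np.sum((f[1:] + f[:-1]) / 2.0 * np.diff(z)))

sigma2 = 0.5  # Gaussian reference input ~ N(0, 0.5)
gammas = np.linspace(0.01, 10.0, 500)
q = np.array([sigma2 / (1 + sigma2 * g) - mmse_bpsk(g) for g in gammas])

# q starts negative (the low-power Gaussian reference has the smaller MMSE at
# low snr) and ends positive (the BPSK MMSE decays much faster at high snr),
# with a single negative-to-nonnegative crossing in between.
sign_changes = int(np.sum(np.sign(q[:-1]) != np.sign(q[1:])))
print(q[0] < 0, q[-1] > 0, sign_changes)  # True True 1
```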


Superposition codes

I(γ) and MMSEc(γ) - known exactly [Merhav, Guo and Shamai, IT 2010]:

A superposition codebook designed for (snr1, snr2) with the rate-splitting coefficient β < 1.

I(γ) =
  (1/2) log(1 + γ),                                            if 0 ≤ γ < snr1
  (1/2) log((1 + snr1)/(1 + β·snr1)) + (1/2) log(1 + βγ),      if snr1 ≤ γ ≤ snr2
  (1/2) log((1 + snr1)/(1 + β·snr1)) + (1/2) log(1 + β·snr2),  if snr2 < γ

MMSEc(γ) =
  1/(1 + γ),    0 ≤ γ < snr1
  β/(1 + βγ),   snr1 ≤ γ ≤ snr2
  0,            snr2 < γ
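The two piecewise functions above should themselves satisfy the I-MMSE relationship; this is easy to confirm numerically. A sketch using the example parameters from the next slide (snr1 = 2, snr2 = 2.5, β = 0.4):

```python
import numpy as np

snr1, snr2, beta = 2.0, 2.5, 0.4

def I_sup(g):
    # Mutual information of the superposition code, per the three regimes above.
    if g < snr1:
        return 0.5 * np.log(1.0 + g)
    head = 0.5 * np.log((1.0 + snr1) / (1.0 + beta * snr1))
    return head + 0.5 * np.log(1.0 + beta * min(g, snr2))

def MMSE_sup(g):
    if g < snr1:
        return 1.0 / (1.0 + g)
    return beta / (1.0 + beta * g) if g <= snr2 else 0.0

# I-MMSE consistency check: I(gamma) = (1/2) * integral_0^gamma MMSE_c(tau) d(tau)
grid = np.linspace(0.0, 3.0, 60_001)
m = np.array([MMSE_sup(t) for t in grid])
integral = 0.5 * float(np.sum((m[1:] + m[:-1]) / 2.0 * np.diff(grid)))

print(abs(integral - I_sup(3.0)) < 1e-3)  # True
```

Note that I(γ) is continuous at snr1 even though MMSEc(γ) jumps down there, exactly as the integral relation requires.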


Superposition codes - Cont.

Example:

[Figure: I(γ) and MMSEc(γ) of a superposition code with snr1 = 2, snr2 = 2.5, β = 0.4, compared with Iopt(γ) and MMSEopt(γ)]


Main Result

Theorem 1

Assuming snr1 < snr2, the solution of the following optimization problem,

max I(snr2)
s.t. MMSEc(snr1) ≤ β/(1 + β·snr1)

for some β ∈ [0, 1], is the following:

I(snr2) = (1/2) log(1 + β·snr2) + (1/2) log((1 + snr1)/(1 + β·snr1))

and is attainable when using the optimal Gaussian superposition codebook designed for (snr1, snr2) with a rate-splitting coefficient β.


Proof Sketch

The optimal Gaussian superposition codebook complies with the above MMSE constraint and attains the maximum rate.

We need a tight upper bound on the rate.

We prove an equivalent claim: assume a code of rate Rc = (1/2) log(1 + α·snr2), designed for reliable transmission at snr2; lower bound MMSEc(γ), then specify for γ = snr1.

Case α·snr2 ≤ γ:

The lower bound is trivially zero, using the optimal Gaussian codebook designed for α·snr2. This is equivalent to setting β = 0.

Case γ ≤ α·snr2:

I(snr2) − I(γ) ≥ I(snr2) − (1/2) log(1 + γ) = Rc − (1/2) log(1 + γ)


Proof Sketch - Cont.

Using the I-MMSE relationship:

(1/2) ∫_γ^{snr2} MMSEc(τ) dτ ≥ (1/2) log(1 + α·snr2) − (1/2) log(1 + γ)

Defining d through the following equality:

(1/2) log(1 + α·snr2) − (1/2) log(1 + γ) = (1/2) log(1 + d·snr2) − (1/2) log(1 + d·γ)

we have:

(1/2) ∫_γ^{snr2} MMSEc(τ) dτ ≥ (1/2) log(1 + α·snr2) − (1/2) log(1 + γ)
  = (1/2) log(1 + d·snr2) − (1/2) log(1 + d·γ)
  = (1/2) ∫_γ^{snr2} mmseG(τ) dτ,   XG ∼ N(0, d), i.i.d.


Proof Sketch - Cont.

Using the "single crossing point" property and the above inequality we can conclude: the single crossing point of mmseG(τ) and MMSEc(τ), if it exists, will occur somewhere in the region (γ, ∞). Thus, we have the following lower bound:

MMSEc(γ) ≥ d(γ)/(1 + d(γ)·γ) = (α·snr2 − γ)/(snr2 − γ) · 1/(1 + γ)

Specifically for γ = snr1 we obtain

MMSEc(snr1) ≥ (α·snr2 − snr1)/(snr2 − snr1) · 1/(1 + snr1).

Deriving α as a function of the constraining β, and substituting it in Rc = (1/2) log(1 + α·snr2), results in the superposition rate.
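This last algebraic step can be verified numerically: equate the lower bound at snr1 with the constraint β/(1 + β·snr1), solve for α, and check that Rc reproduces the Theorem 1 rate. A sketch with illustrative parameter values:

```python
import numpy as np

snr1, snr2, beta = 1.0, 4.0, 0.3

# Set (alpha*snr2 - snr1)/((snr2 - snr1)(1 + snr1)) = beta/(1 + beta*snr1)
# and solve for alpha:
alpha = (snr1 + (snr2 - snr1) * (1.0 + snr1) * beta / (1.0 + beta * snr1)) / snr2

rate_alpha = 0.5 * np.log(1.0 + alpha * snr2)  # Rc = (1/2) log(1 + alpha*snr2)
rate_thm1 = 0.5 * np.log(1.0 + beta * snr2) + 0.5 * np.log((1.0 + snr1) / (1.0 + beta * snr1))

print(abs(rate_alpha - rate_thm1) < 1e-12)  # True: Rc matches the Theorem 1 rate
```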


The effect at other snrs

Theorem 2

From the set of reliable codes of rate

Rc = (1/2) log(1 + β·snr2) + (1/2) log((1 + snr1)/(1 + β·snr1))

complying with the MMSE constraint at snr1:

MMSEc(snr1) ≤ β/(1 + β·snr1)

the superposition codebook provides the minimum MMSE for all snrs.


Extension to two MMSE constraints

Theorem 3 [Bustin and Shamai, IZS 2012]

Assuming snr0 < snr1 < snr2, the solution of

max I(snr2)
s.t. MMSEc(snr1) ≤ β1/(1 + β1·snr1),  MMSEc(snr0) ≤ β0/(1 + β0·snr0)

for some positive β1, β0 such that β1 + β0 ≤ 1 and β1 < β0, is

I(snr2) = (1/2) log( (1 + β1·snr2) · (1 + β0·snr1)/(1 + β1·snr1) · (1 + snr0)/(1 + β0·snr0) )

and is attainable when using the optimal three-layer Gaussian superposition codebook designed for (snr0, snr1, snr2) with rate-splitting coefficients (β0, β1).

When β0 < β1 the first constraint can be removed and we return to the case of a single constraint given in Theorem 1.
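Two sanity checks on the Theorem 3 expression are easy to run: with β0 = 1 the snr0 constraint becomes vacuous and the rate collapses to Theorem 1, and the extra constraint can only reduce the rate. A sketch with illustrative parameter values:

```python
import numpy as np

snr0, snr1, snr2 = 0.5, 2.0, 4.0
beta0, beta1 = 0.6, 0.3  # beta1 < beta0 and beta0 + beta1 <= 1

rate_thm3 = 0.5 * np.log(
    (1.0 + beta1 * snr2)
    * (1.0 + beta0 * snr1) / (1.0 + beta1 * snr1)
    * (1.0 + snr0) / (1.0 + beta0 * snr0)
)

# Single-constraint benchmark: Theorem 1 with beta = beta1.
rate_thm1 = 0.5 * np.log(1.0 + beta1 * snr2) + 0.5 * np.log((1.0 + snr1) / (1.0 + beta1 * snr1))

print(rate_thm3 <= rate_thm1)  # True: the extra MMSE constraint can only cost rate
```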


Proof Sketch

The optimal Gaussian three-layer superposition codebook complies with the above MMSE constraints and attains the maximum rate.

We need a tight upper bound on the rate.

Using Theorem 1

Considering only the constraint on MMSEc(snr0), we obtain the following upper bound on the rate at snr1:

I(snr1) ≤ (1/2) log(1 + β0·snr1) + (1/2) log((1 + snr0)/(1 + β0·snr0))


Proof Sketch - Cont.

The I-MMSE approach

I(snr2) − I(snr1) = (1/2) ∫_{snr1}^{snr2} MMSEc(τ) dτ ≤ (1/2) ∫_{snr1}^{snr2} mmseG(τ) dτ

where XG ∼ N(0, β1) and i.i.d. This is valid since, according to the constraint on MMSEc(snr1), we have

MMSEc(snr1) ≤ β1/(1 + β1·snr1) = mmseG(snr1)

and according to the single crossing point property

MMSEc(τ) ≤ mmseG(τ), ∀τ ≥ snr1.

Putting the two upper bounds together, we obtain the desired result.


Mutual information disturbance: single constraint

Theorem 4 [Bandemer and El Gamal, ISIT 2011]

Assuming snr1 < snr2, the solution of the following optimization problem,

max In(snr2)
s.t. In(snr1) ≤ (1/2) log(1 + α′·snr1)

for some α′ ∈ [0, 1], is the following:

In(snr2) = (1/2) log(1 + α′·snr2).

Equality is attained, for any n, by choosing X Gaussian with i.i.d. components of variance α′. For n → ∞ equality is also attained by a Gaussian codebook designed for snr2 with limited power of α′.
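One way to see the contrast between the two disturbance measures numerically (my own illustrative pairing, matching the coefficient β of Theorem 1 with the α′ of Theorem 4): the MMSE-constrained rate carries an extra nonnegative superposition term over the MI-constrained rate.

```python
import numpy as np

snr1, snr2 = 1.0, 3.0
c = np.linspace(0.05, 0.95, 19)  # disturbance coefficient (beta or alpha')

# Theorem 1 (MMSE-constrained) rate vs. Theorem 4 (MI-constrained) rate:
rate_mmse = 0.5 * np.log(1 + c * snr2) + 0.5 * np.log((1 + snr1) / (1 + c * snr1))
rate_mi = 0.5 * np.log(1 + c * snr2)

print(bool(np.all(rate_mmse >= rate_mi)))  # True: the superposition term is nonnegative
```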


Mutual information disturbance: single constraint

Alternative I-MMSE proof

Since 0 ≤ In(snr1) ≤ (1/2) log(1 + snr1), there exists an α′ ∈ [0, 1] such that

In(snr1) = (1/2) log(1 + α′·snr1).

⟹ MMSEcn(γ) and mmseG(γ) of XG ∼ N(0, α′) cross in [0, snr1].

Using the I-MMSE:

In(snr2) = (1/2) log(1 + α′·snr1) + (1/2) ∫_{snr1}^{snr2} MMSEcn(γ) dγ ≤ (1/2) log(1 + α′·snr2)

due to the "single crossing point" property, which ensures

MMSEcn(γ) ≤ mmseG(γ), ∀γ ∈ [snr1, ∞)


Mutual information disturbance: multiple constraints

Theorem 5

Assuming snr1 < snr2 < ··· < snrK, the solution of

max In(snrK)
s.t. ∀i ∈ {1, ..., K−1}: In(snri) ≤ (1/2) log(1 + αi·snri)

for some αi ∈ [0, 1], is the following:

In(snrK) = (1/2) log(1 + αℓ·snrK)

where αℓ, ℓ ∈ {1, ..., K−1}, is defined such that

∀i ∈ {1, ..., K−1}: (1/2) log(1 + αℓ·snri) ≤ (1/2) log(1 + αi·snri)

The maximum rate is attained, for any n, by choosing X Gaussian with i.i.d. components of variance αℓ. For n → ∞ equality is also attained by a Gaussian codebook designed for snrK with limited power of αℓ.
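Since (1/2) log(1 + a·s) is increasing in a, the defining condition on αℓ reduces to αℓ ≤ αi for every i, so the smallest coefficient is the binding one. A sketch with illustrative values:

```python
import numpy as np

snrs = [0.5, 1.0, 2.0, 4.0]  # snr_1 < snr_2 < ... < snr_K
alphas = [0.7, 0.4, 0.6]     # MI constraints at snr_1 .. snr_{K-1}

# The condition on alpha_l is equivalent to alpha_l <= alpha_i for all i,
# so alpha_l is simply the smallest constraint coefficient.
alpha_l = min(alphas)

ok = all(0.5 * np.log(1 + alpha_l * s) <= 0.5 * np.log(1 + a * s)
         for s, a in zip(snrs[:-1], alphas))
rate = 0.5 * np.log(1 + alpha_l * snrs[-1])

print(alpha_l, ok)  # 0.4 True
```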


Summary and Outlook

These results provide the engineering insight into the good performance of the Han and Kobayashi scheme on the interference channel. We show that the Han and Kobayashi scheme is optimal MMSE-wise.

The results can be easily extended to K MMSE constraints [Bustin and Shamai, submitted ISIT 2012].

The engineering advantages of the MMSE disturbance measure over the mutual information measure in the Gaussian channel are demonstrated.

Single code I-MMSE tradeoff, bounded by the optimal superposition coding tradeoff [Bennatan, Shamai, Calderbank, arXiv:1008.1766, 2010].

Interesting challenges: optimization of In(snr2), under the MMSEcn(snr1) constraint, when block-length n is finite. For n = 1, conjecture: the optimal X is discrete [Shamai, ISIT 2011].


Thank You!


"MMSE Interference in Gaussian Channels"
Ronit Bustin and Shlomo Shamai

We consider the scalar Gaussian channel, and address the problem of maximizing the average mutual information of a power-constrained n-component (n → ∞) input random vector at a given signal-to-noise ratio (snr), satisfying a minimum mean square error (MMSE) constraint at another, lower snr value. We use the MMSE as an effective interference (disturbance) measure, motivated by interference networks, where codes are expected not only to optimize performance for the intended user but also to inflict minimum interference on other users. We show, via the information-estimation relation, that superposition coding is optimal in this respect, providing further intuition for the effectiveness of the Han-Kobayashi coding strategy on the interference channel, and for the performance of 'bad' codes.

Moreover, the MMSE function of those codes, attaining the best rate at some snr subject to a prescribed MMSE demand at some other snr, is completely defined for all snr, and is the one obtained by the corresponding superposition codebooks.

Extensions to two MMSE constraints are discussed, and compared to the results for a mutual information disturbance measure. Some challenges for this class of interference problems are also discussed.
