
Lectures on

Network Information Theory

Abbas El Gamal

Stanford University

Allerton 2009

A. El Gamal (Stanford University) Lectures on NIT Allerton 2009 1 / 42


The Early Years

I started a course on multiple user (network) information theory at Stanford in 1982 and taught it 3 times

The course had some of today's big names in our field:


Syllabus Circa 1983


I also gave a lecture on feedback


Some results that were known then and are considered important today were absent:

Interference channel: Strong interference; Han–Kobayashi

Relay channel: cutset bound; decode–forward; compress–forward

Multiple descriptions: El Gamal–Cover; Ozarow; Ahlswede

Secrecy: Shannon; Wyner; Csiszar–Korner


There was no theoretical or practical interest in these results then


The Dog Years of NIT

By the mid 80s interest in NIT was all but gone


Theory was stuck and many basic problems remained open

It seemed that the theory would have no applications


By early 90s, the number of ISIT papers on NIT → 0:

[Figure: number of ISIT papers on NIT by year, 1979–1993]


I stopped teaching the course and moved on to other things


The Resurgence

By late 90s, the Internet and wireless communication began to revive interest in NIT; and by early 2000s, the field was in full swing

[Figure: number of ISIT papers on NIT by year, 1979–2004]


I started teaching the course again in 2002


The course had some of today's rising stars:


Syllabus Circa 2002


Technology from Early 80s to 2002

Chip technology: Scaled by a factor of 2^11 (Moore's law)


Computing: From VAX780 to PCs and laptops


Communication: From 1200 baud modems and wired phones to DSL, cellular, and 802.xx


Networks: From ARPANET to the Internet


Multi-media: From film cameras and Sony Walkman to digital cameras and iPod
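As a sanity check on the chip-scaling factor (assuming, purely for illustration, one Moore's-law doubling roughly every 21 months over the two decades from the early 80s to 2002):

```python
# Moore's-law doublings between 1982 and 2002, assuming
# (hypothetically) one doubling every ~21 months
years = 2002 - 1982
months_per_doubling = 21
doublings = round(years * 12 / months_per_doubling)
print(doublings, 2 ** doublings)  # 11 doublings -> a factor of 2^11 = 2048
```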


What’s Wrong with This Picture?


Theory does not advance as fast as technology


Nothing happened between early 80s and 2002


I didn’t know what was going on


Answer: All of the above


What Happened Since Mid 80s?

Some progress on old open problems (mainly Gaussian)


Work on new models: Fading channels; MIMO; secrecy, ...


New directions in network capacity:

Network coding

Scaling laws

Deterministic/high SNR approximations (within xx bits)


Attempts to consummate marriage (or at least dating) between IT and networking


Lectures on NIT: 2009

Developed jointly with Young-Han Kim of UCSD

Incorporate many of the recent results


Attempt to organize the field in a “top-down” way


Balance introduction of new techniques and new models


Unify, simplify, and formalize achievability proofs


Emphasize extension to networks


Use clean and unified notation and terminology


Outline

1 The First Lecture

2 Achievability for DM Sources and Channels

3 Gaussian Sources and Channels

4 Converse

5 Extension to Networks

6 Conclusion


The First Lecture

Network Information Flow

Consider a general networked information processing system:

[Figure: source nodes connected through a general network]

Sources: data, speech, music, images, video, sensor data

Nodes: handsets, base stations, servers, sensor nodes

Network: wired, wireless, or hybrid


Each node observes some sources, wishes to obtain descriptions of other sources, or to compute a function/make a decision based on them


To achieve the goal, the nodes communicate and perform local computing


Information flow questions:

What are the necessary and sufficient conditions on information flow?

What are the optimal schemes/techniques needed to achieve them?


The difficulty in answering these questions depends on:

◮ Source and network models

◮ Information processing goals

◮ Computational capabilities of the nodes


Example: Multi-Commodity Flow

If the sources are commodities with demands (rates in bits/sec), the nodes are connected by noiseless rate-constrained links, each intermediate node forwards the bits it receives, and the goal is to send each commodity to a destination node, then the problem reduces to the multi-commodity flow problem, with known conditions on optimal flow

[Figure: a capacitated network with nodes 1, 2, 3, 4, j, k; link capacities C12, C13, C14; commodities M1, M2, M3 routed to their destination nodes]


For a single commodity, these conditions reduce to the celebrated max-flow min-cut theorem
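The single-commodity case also lends itself to direct computation; the following is a minimal sketch (not part of the lectures) of the Edmonds–Karp max-flow algorithm, whose terminating value equals the minimum cut capacity:

```python
from collections import defaultdict, deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting paths.
    cap maps (u, v) -> link capacity; returns max flow = min-cut capacity."""
    flow = defaultdict(int)
    adj = defaultdict(set)
    for (u, v) in cap:
        adj[u].add(v)
        adj[v].add(u)  # reverse edges for the residual graph
    total = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap.get((u, v), 0) - flow[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total  # no augmenting path left: flow saturates a min cut
        # find the bottleneck residual capacity along the path, then push
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap.get((u, v), 0) - flow[(u, v)] for (u, v) in path)
        for (u, v) in path:
            flow[(u, v)] += push
            flow[(v, u)] -= push
        total += push
```

For example, on the graph with capacities {(1,2): 3, (1,3): 2, (2,4): 2, (3,4): 3}, the flow from node 1 to node 4 is 4, matching the cut {1,2} vs {3,4}.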


Network Information Theory

This simple networked information processing system model does not capture many important aspects of real-world systems:

◮ Real-world information sources have redundancies, time and space correlations, time variations


◮ Real-world networks may suffer from noise, interference, node/link failures, delay, time variation


◮ Real-world networks may allow for broadcasting


◮ Real-world communication nodes may allow for more complex node operations than forwarding


◮ The goal in many information processing systems is to partially recover the sources or to compute/make a decision


Network information theory aims to answer the information flow questions while capturing essential elements of real-world networks in the probabilistic framework of Shannon's information theory


State of the Theory

Focus has been on compression and communication for discrete memoryless (DM) and Gaussian sources and channels


Most results are for separate source–channel settings


Computable characterizations of capacity/optimal rate regions are known for only a few cases; for the rest, only inner and outer bounds are known


Some results on joint source–channel coding, communication for computing, secrecy, and at the intersection with networking


Coding techniques developed along the way (e.g., superposition, successive cancellation, Slepian–Wolf, Wyner–Ziv, successive refinement, dirty paper coding, network coding) are starting to impact real-world networks


However, many basic problems remain open and a complete theory is yet to be developed


Outline of Lectures

Lectures aim to provide broad coverage of the models, fundamental results, proof techniques, and open problems in NIT


Include both teaching material and advanced results


Divided into four parts:

Part I: Background

Part II: Single-hop Networks

Part III: Multi-hop Networks

Part IV: Extensions


Global appendices for general techniques and background, e.g.,bounding cardinalities of auxiliary random variables andFourier–Motzkin elimination

A. El Gamal (Stanford University) Lectures on NIT Allerton 2009 18 / 42

Part I: Background

Purpose: Introduce notation and basic techniques used throughout; point out some differences between point-to-point and multiple user communication

Entropy, differential entropy, and mutual information

Strong typicality: Orlitsky–Roche definition; properties

Key achievability lemmas:
◮ Typical average lemma
◮ Joint typicality lemma
◮ Packing lemma
◮ Covering lemma
◮ Conditional typicality lemma

Shannon's point-to-point communication theorems: random coding; joint typicality encoding/decoding

Part II: Single-hop Networks

Single round one-way communication

Independent messages over noisy channels:
◮ Multiple access channels: time sharing; successive cancellation
◮ Degraded broadcast channels: superposition coding
◮ Interference channels: strong interference; Han–Kobayashi
◮ Channels with state: Gelfand–Pinsker; writing on dirty paper
◮ Fading channels: alternative performance measures (outage capacity)
◮ General broadcast channels: Marton coding; mutual covering
◮ Vector Gaussian channels: dirty paper coding; MAC–BC duality

Correlated sources over noiseless (wireline) channels:
◮ Distributed lossless source coding: Slepian–Wolf; random binning
◮ Source coding with side information: Wyner–Ziv
◮ Distributed lossy source coding: Berger–Tung; quadratic Gaussian
◮ Multiple descriptions: El Gamal–Cover; successive refinement

Correlated sources over DM channels:
Separation does not hold in general; common information; sufficient conditions for MAC, BC

Part III: Multi-hop Networks

Relaying and multiple communication rounds

Independent messages over noiseless networks:
◮ Max-flow min-cut theorem; network coding

Independent messages over noisy networks:
◮ Relay channel: cutset bound; decode–forward; compress–forward
◮ Interactive communication: feedback capacity; iterative refinement
◮ DM networks: cutset bound; decode–forward; compress–forward
◮ Gaussian networks: scaling laws; high SNR approximations

Correlated sources over noiseless (wireline) channels:
◮ Multiple descriptions networks; interactive source coding

Part IV: Extensions

Extensions of the theory to other settings

Communication for computing:
◮ Distributed coding for computing: Orlitsky–Roche; µ-sum problem; distributed consensus

Information theoretic secrecy:
◮ Wiretap channels; key generation from common randomness

Asynchronous communication:
◮ Random arrivals; asynchronous MAC

Balancing Introduction of Models and Techniques: Broadcast Channel

Degraded broadcast channels:
◮ Superposition coding inner bound
◮ Degraded broadcast channels
◮ AWGN broadcast channels
◮ Less noisy and more capable broadcast channels

Channels with state:
◮ Compound channel
◮ Arbitrarily varying channel
◮ Channels with random state
◮ Causal state information available at encoder
◮ Noncausal state information available at the encoder
◮ Writing on dirty paper
◮ Partial state information

Fading channels

General broadcast channels:
◮ DM-BC with degraded message sets
◮ 3-Receiver multilevel DM-BC with degraded message sets
◮ Marton inner bound
◮ Relationship to Gelfand–Pinsker
◮ Nair–El Gamal outer bound
◮ Inner bound for more than 2 receivers

Gaussian vector channels:
◮ Gaussian vector channel
◮ Gaussian vector fading channel
◮ Gaussian vector multiple access channel
◮ Spectral Gaussian broadcast channel
◮ Vector writing on dirty paper
◮ Gaussian vector broadcast channel

Achievability for DM Sources and Channels

Typicality

Let $(u^n, x^n, y^n)$ be a triple of sequences with elements drawn from finite alphabets $(\mathcal{U}, \mathcal{X}, \mathcal{Y})$. Define their joint type as
$$\pi(u, x, y \mid u^n, x^n, y^n) = \frac{|\{i : (u_i, x_i, y_i) = (u, x, y)\}|}{n} \quad \text{for } (u, x, y) \in \mathcal{U} \times \mathcal{X} \times \mathcal{Y}$$

Let $(U, X, Y) \sim p(u, x, y)$. The set $\mathcal{T}_\epsilon^{(n)}(U, X, Y)$ of $\epsilon$-typical $n$-sequences is defined as
$$\bigl\{(u^n, x^n, y^n) : |\pi(u, x, y \mid u^n, x^n, y^n) - p(u, x, y)| \le \epsilon \cdot p(u, x, y) \text{ for all } (u, x, y) \in \mathcal{U} \times \mathcal{X} \times \mathcal{Y}\bigr\}$$

Typical average lemma: Let $x^n \in \mathcal{T}_\epsilon^{(n)}(X)$. Then for any $g(x) \ge 0$,
$$(1 - \epsilon)\,\mathrm{E}(g(X)) \le \frac{1}{n}\sum_{i=1}^{n} g(x_i) \le (1 + \epsilon)\,\mathrm{E}(g(X))$$
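The joint type and the $\epsilon$-typicality condition are straightforward to check numerically. A minimal sketch, shown for pairs rather than triples to keep it short; the pmf, the sequences, and the value of $\epsilon$ are illustrative assumptions, not taken from the lectures:

```python
from collections import Counter

def joint_type(xn, yn):
    """Empirical pmf pi(x, y | x^n, y^n): fraction of positions i with (x_i, y_i) = (x, y)."""
    n = len(xn)
    counts = Counter(zip(xn, yn))
    return {pair: c / n for pair, c in counts.items()}

def is_eps_typical(xn, yn, p, eps):
    """Check |pi(x,y) - p(x,y)| <= eps * p(x,y) for every (x, y)."""
    pi = joint_type(xn, yn)
    # Wherever p(x, y) = 0 the condition forces pi(x, y) = 0 as well
    if any(pair not in p for pair in pi):
        return False
    return all(abs(pi.get(pair, 0.0) - prob) <= eps * prob for pair, prob in p.items())

# Illustrative joint pmf on {0,1} x {0,1}
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
xn = [0, 0, 1, 1, 0, 1, 0, 1, 0, 1]
yn = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print(is_eps_typical(xn, yn, p, eps=0.5))  # True: here the joint type equals p exactly
```

Here the empirical counts (4, 1, 1, 4 out of 10) match $p$ exactly, so the pair is $\epsilon$-typical for every $\epsilon \ge 0$.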

Joint Typicality Lemma

Let $(U, X, Y) \sim p(u, x, y)$.

1. Let $(u^n, x^n) \in \mathcal{T}_\epsilon^{(n)}(U, X)$ and $Y^n \sim \prod_{i=1}^{n} p_{Y|U}(y_i \mid u_i)$. Then
$$\mathrm{P}\{(u^n, x^n, Y^n) \in \mathcal{T}_\epsilon^{(n)}(U, X, Y)\} \doteq 2^{-n I(X; Y \mid U)}$$

2. If $(U^n, X^n) \sim p(u^n, x^n)$ and $Y^n \sim \prod_{i=1}^{n} p_{Y|U}(y_i \mid u_i)$, then
$$\mathrm{P}\{(U^n, X^n, Y^n) \in \mathcal{T}_\epsilon^{(n)}(U, X, Y)\} \le 2^{-n(I(X; Y \mid U) - \delta(\epsilon))}$$
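Part 1 of the lemma can be checked by Monte Carlo: fix a typical $x^n$, draw $Y^n$ i.i.d. from the marginal $p_Y$ (taking $U$ trivial), and count how often the pair lands in the jointly typical set. A rough sketch under an assumed binary joint pmf with $X \sim \mathrm{Bern}(1/2)$ and $Y = X \oplus \mathrm{Bern}(q)$, so $I(X;Y) = 1 - H(q)$; all parameter values are illustrative:

```python
import math
import random

random.seed(0)

def binary_entropy(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q = 0.25
I_XY = 1.0 - binary_entropy(q)  # I(X;Y) = H(Y) - H(Y|X) = 1 - H(q)
p = {(0, 0): (1 - q) / 2, (0, 1): q / 2, (1, 0): q / 2, (1, 1): (1 - q) / 2}

n, eps, trials = 24, 0.2, 50_000
xn = [i % 2 for i in range(n)]  # a typical x^n: its type is exactly (1/2, 1/2)

hits = 0
for _ in range(trials):
    yn = [random.randint(0, 1) for _ in range(n)]  # Y^n i.i.d. from the marginal p_Y = Bern(1/2)
    counts = {k: 0 for k in p}
    for pair in zip(xn, yn):
        counts[pair] += 1
    if all(abs(counts[k] / n - p[k]) <= eps * p[k] for k in p):
        hits += 1

# The empirical probability decays like 2^{-n I(X;Y)} to first order in the exponent
print(hits / trials, 2 ** (-n * I_XY))
```

For modest $n$ the polynomial factor hidden by $\doteq$ is visible, so the two numbers agree only in exponential order.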

Packing Lemma

Let $(U, X, Y) \sim p(u, x, y)$ and $U^n \sim p(u^n)$. Let $X^n(m)$, $m \in \mathcal{A}$, where $|\mathcal{A}| \le 2^{nR}$, be random sequences, each distributed according to $\prod_{i=1}^{n} p_{X|U}(x_i \mid u_i)$, with arbitrary dependence on the rest

Let $Y^n \in \mathcal{Y}^n$ be another random sequence, conditionally independent of each $X^n(m)$, $m \in \mathcal{A}$, given $U^n$, and distributed according to an arbitrary pmf $p(y^n \mid u^n)$

Then, there exists $\delta(\epsilon) \to 0$ as $\epsilon \to 0$ such that
$$\mathrm{P}\{(U^n, X^n(m), Y^n) \in \mathcal{T}_\epsilon^{(n)} \text{ for some } m \in \mathcal{A}\} \to 0$$
as $n \to \infty$, if $R < I(X; Y \mid U) - \delta(\epsilon)$
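The mechanism behind the lemma is a union bound: each codeword is jointly typical with $Y^n$ with probability at most $2^{-n(I(X;Y|U)-\delta(\epsilon))}$ by the joint typicality lemma, and there are at most $2^{nR}$ of them, so the collision probability is at most $2^{n(R - I(X;Y|U) + \delta(\epsilon))}$. A small sketch of this arithmetic; the values of $R$, $I$, and $\delta$ are illustrative assumptions:

```python
# Union bound behind the packing lemma: with |A| <= 2^{nR} codewords and a
# per-codeword joint-typicality probability <= 2^{-n(I - delta)}, the
# probability that SOME codeword collides is <= 2^{n(R - I + delta)},
# which vanishes as n grows precisely when R < I - delta.
R, I, delta = 0.4, 0.6, 0.05  # illustrative rates (bits), chosen so R < I - delta

for n in (10, 50, 100, 500):
    union_bound = 2 ** (n * (R - I + delta))
    print(n, union_bound)  # decays exponentially in n
```

Flipping the inequality to $R > I - \delta$ makes the same bound grow, which is why the rate condition in the lemma is tight for this argument.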

The sequences $X^n(m)$, $m \in \mathcal{A}$, represent codewords. The $Y^n$ sequence represents the received sequence resulting from sending a codeword $\notin \mathcal{A}$

[Figure: the codewords $X^n(m) \in \mathcal{X}^n$ and the received sequence $Y^n$, shown relative to the typical set $\mathcal{T}_\epsilon^{(n)}(Y)$]

The lemma shows that, under any pmf on $Y^n$, the probability that some codeword in $\mathcal{A}$ is jointly typical with $Y^n$ tends to $0$ as $n \to \infty$ if the rate of the code $R < I(X; Y \mid U)$

Achievability for DM Sources and Channels

Covering Lemma

Let (U, X, X̂) ∼ p(u, x, x̂), and let (U^n, X^n) ∼ p(u^n, x^n) be a pair of arbitrarily distributed random sequences such that

P{(U^n, X^n) ∈ T_ε^(n)(U, X)} → 1 as n → ∞

Let X̂^n(m), m ∈ A, where |A| ≥ 2^{nR}, be random sequences, conditionally independent of each other and of X^n given U^n, each distributed according to ∏_{i=1}^n p_{X̂|U}(x̂_i|u_i)

Then, there exists δ(ε) → 0 as ε → 0 such that

P{(U^n, X^n, X̂^n(m)) ∉ T_ε^(n) for all m ∈ A} → 0

as n → ∞, if R > I(X; X̂|U) + δ(ε)
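As a small numerical illustration (ours, not from the lectures), the covering threshold can be evaluated in the classical lossy-compression special case: a Bern(1/2) source X with reproduction X̂ related to X through a BSC(D) backward test channel and no auxiliary U, where I(X; X̂) reduces to the rate–distortion function 1 − H(D):

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def covering_rate(D):
    """Covering threshold I(X; Xhat) for a Bern(1/2) source with a
    BSC(D) backward test channel (no auxiliary U): R(D) = 1 - h2(D)."""
    return 1.0 - h2(D)

# At distortion D = 0.11 the required covering rate is about half a bit
print(round(covering_rate(0.11), 2))  # 0.5
```

Any rate above this threshold guarantees, by the lemma, that a codebook of 2^{nR} reproduction sequences covers a typical source sequence with high probability.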


Achievability for DM Sources and Channels

The sequences X̂^n(m), m ∈ A, represent reproduction sequences, and X^n represents the source sequence

[Figure: the reproduction sequences X̂^n(1), …, X̂^n(m) and the typical set T_ε^(n)(X) containing X^n]

The lemma shows that if R > I(X; X̂|U), then there is at least one reproduction sequence that is jointly typical with X^n


Achievability for DM Sources and Channels

Conditional Typicality Lemma

Let (X, Y) ∼ p(x, y), x^n ∈ T_{ε'}^(n)(X), and Y^n ∼ ∏_{i=1}^n p_{Y|X}(y_i|x_i). Then, for every ε > ε',

P{(x^n, Y^n) ∈ T_ε^(n)(X, Y)} → 1 as n → ∞

The Markov lemma is a special case: let U → X → Y form a Markov chain. If (u^n, x^n) ∈ T_{ε'}^(n)(U, X) and Y^n ∼ ∏_{i=1}^n p_{Y|X}(y_i|x_i), then for every ε > ε',

P{(u^n, x^n, Y^n) ∈ T_ε^(n)(U, X, Y)} → 1 as n → ∞


Achievability for DM Sources and Channels

Gelfand–Pinsker

Consider a DMC with DM state (X × S, p(y|x, s)p(s), Y)

The sender, who knows the state sequence S^n noncausally, wishes to send a message M ∈ [1 : 2^{nR}] to the receiver

[Figure: M → Encoder → X^n → p(y|x, s) → Y^n → Decoder → M̂, with S^n ∼ p(s) available noncausally at the encoder]

Gelfand–Pinsker Theorem

The capacity of a DMC with DM state available noncausally at the encoder is

C_{SI-E} = max_{p(u|s), x(u,s)} (I(U; Y) − I(U; S))


Achievability for DM Sources and Channels

Outline of Achievability [Heegard, El Gamal]

Fix p(u|s) and x(u, s) that achieve capacity. For each message m ∈ [1 : 2^{nR}], generate a subcode C(m) of 2^{n(R̃−R)} sequences u^n(l), for a total of 2^{nR̃} sequences

[Figure: the u^n sequences partitioned into the subcodes C(1), C(2), …, C(2^{nR}); each subcode contains a sequence jointly typical with s^n]

To send m given s^n, find u^n(l) ∈ C(m) that is jointly typical with s^n and transmit x_i = x(u_i(l), s_i) for i ∈ [1 : n]

The receiver finds a u^n jointly typical with y^n and declares the subcode index m̂ of that u^n to be the message sent
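The subcode (binning) structure above can be sketched in a few lines. This is our toy illustration with hypothetical parameters; at such short block lengths, nearest-in-Hamming-distance stands in for joint typicality with the state sequence:

```python
import random

random.seed(1)

n = 8
R, Rt = 0.25, 0.5                # message rate R and total codebook rate Rt (R-tilde)
num_msgs = 2 ** round(n * R)     # 2^{nR} = 4 messages
num_seqs = 2 ** round(n * Rt)    # 2^{n*Rt} = 16 u^n sequences in total
per_bin = num_seqs // num_msgs   # each subcode C(m) holds 2^{n(Rt-R)} = 4 sequences

# Random binary codebook, partitioned into consecutive subcodes C(0), ..., C(num_msgs-1)
codebook = [[random.randint(0, 1) for _ in range(n)] for _ in range(num_seqs)]

def subcode(m):
    return codebook[m * per_bin:(m + 1) * per_bin]

def encode(m, s):
    """To send m given state s^n: pick the u^n in C(m) best matched to s^n
    (Hamming distance as a stand-in for joint typicality)."""
    return min(subcode(m), key=lambda u: sum(a != b for a, b in zip(u, s)))

s = [random.randint(0, 1) for _ in range(n)]
u = encode(2, s)
print(per_bin, u in subcode(2))  # 4 True
```

The point of the two rates is visible in the structure: R̃ must be large enough that every subcode covers the state, yet small enough that the receiver can disambiguate among all 2^{nR̃} sequences.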


Achievability for DM Sources and Channels

Analysis of the Probability of Error

Assume M = 1, and let L be the index of the chosen U^n codeword for M = 1 and S^n

We bound the probability of each error event:

◮ E1 = {(S^n, U^n(l)) ∉ T_{ε'}^(n) for all U^n(l) ∈ C(1)}:
By the covering lemma, P(E1) → 0 as n → ∞ if R̃ − R > I(U; S)

◮ E2 = {(U^n(L), Y^n) ∉ T_ε^(n)}:
By the conditional typicality lemma, P(E1^c ∩ E2) → 0 as n → ∞

◮ E3 = {(U^n(l), Y^n) ∈ T_ε^(n) for some U^n(l) ∉ C(1)}:
Since each U^n(l) ∉ C(1) is independent of Y^n and generated according to ∏_{i=1}^n p_U(u_i), by the packing lemma, P(E3) → 0 as n → ∞ if R̃ < I(U; Y)

Thus the probability of error → 0 as n → ∞ if R < I(U; Y) − I(U; S)
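The last step is a Fourier–Motzkin-style elimination of R̃: an R̃ satisfying both R̃ − R > I(U; S) and R̃ < I(U; Y) exists exactly when R < I(U; Y) − I(U; S). A brute-force check with illustrative values of the two mutual informations (ours, not tied to any specific channel):

```python
# Mutual informations in hundredths of a bit (an integer grid avoids float issues)
I_US, I_UY = 30, 80   # illustrative values: I(U;S) = 0.30, I(U;Y) = 0.80

# R is achievable if some Rt satisfies Rt - R > I(U;S) and Rt < I(U;Y)
achievable = [R for R in range(101)
              if any(Rt - R > I_US and Rt < I_UY for Rt in range(101))]

# The largest achievable R on this grid sits just below I(U;Y) - I(U;S) = 50
print(max(achievable))  # 48
```

The same elimination pattern recurs throughout the lectures whenever an intermediate rate (here R̃) is introduced by a covering step and constrained by a packing step.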


Achievability for DM Sources and Channels

Mutual Covering Lemma [El Gamal, van der Meulen]

Let (U1, U2) ∼ p(u1, u2). For j = 1, 2, let U_j^n(m_j), m_j ∈ [1 : 2^{nR_j}], be pairwise independent random sequences, each distributed according to ∏_{i=1}^n p_{U_j}(u_{ji}). Assume that {U_1^n(m_1) : m_1 ∈ [1 : 2^{nR_1}]} and {U_2^n(m_2) : m_2 ∈ [1 : 2^{nR_2}]} are independent

[Figure: the 2^{nR_1} × 2^{nR_2} grid of pairs (U_1^n(m_1), U_2^n(m_2)), with the jointly typical pairs (U_1^n(m_1), U_2^n(m_2)) ∈ T_ε^(n) marked]

Then, there exists δ(ε) → 0 as ε → 0 such that

P{(U_1^n(m_1), U_2^n(m_2)) ∉ T_ε^(n) for all (m_1, m_2)} → 0

as n → ∞ if R_1 + R_2 > I(U1; U2)

Used in the proof of the Marton inner bound for the BC

Can be extended to k variables. The extension is used in the proof of the El Gamal–Cover inner bound for multiple descriptions and in extending the Marton inner bound to k receivers
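For a concrete threshold (our example, not from the lectures), take (U1, U2) to be a doubly symmetric binary source DSBS(0.1), i.e. U1 ~ Bern(1/2) and U2 = U1 ⊕ Bern(0.1). Then I(U1; U2) = 1 − H(0.1), and mutual covering succeeds for any rate pair whose sum exceeds it:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# DSBS(0.1): U1 ~ Bern(1/2), U2 = U1 xor Bern(0.1)
I = 1 - h2(0.1)
print(round(I, 3))  # about 0.531 bits

# Any rate pair with R1 + R2 > I(U1;U2) suffices, e.g. R1 = R2 = 0.27
assert 0.27 + 0.27 > I
```

Note the contrast with the mutual packing lemma that follows: the same sum rate I(U1; U2) is the threshold in both, from above for covering and from below for packing.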


Achievability for DM Sources and Channels

Mutual Packing Lemma

Let (U1, U2) ∼ p(u1, u2). For j = 1, 2, let U_j^n(m_j), m_j ∈ [1 : 2^{nR_j}], be random sequences, each distributed according to ∏_{i=1}^n p_{U_j}(u_{ji}) with arbitrary dependence on the rest of the U_j^n(m_j) sequences. Assume that {U_1^n(m_1) : m_1 ∈ [1 : 2^{nR_1}]} and {U_2^n(m_2) : m_2 ∈ [1 : 2^{nR_2}]} are independent

Then, there exists δ(ε) → 0 as ε → 0 such that

P{(U_1^n(m_1), U_2^n(m_2)) ∈ T_ε^(n) for some (m_1, m_2)} → 0

as n → ∞ if R_1 + R_2 < I(U1; U2) − δ(ε)

Used in the proof of the Berger–Tung inner bound for distributed lossy source coding

Gaussian Sources and Channels

Because Gaussian models are quite popular in wireless communication, we have complete coverage of all basic results

Achievability:

1. Show that a Gaussian distribution optimizes the mutual information expressions
2. Prove achievability of the optimized expressions via the DM counterpart (with cost) by discretization and taking appropriate limits

The second step is detailed only for the AWGN channel and quadratic Gaussian source coding


Treatment of the Gaussian case is interspersed within each lecture, e.g., the interference channel lecture:

◮ Inner and outer bounds on the capacity region of the DM-IC
◮ Capacity region of the DM-IC under strong interference
◮ AWGN-IC
◮ Capacity region of the AWGN-IC under strong interference
◮ Han–Kobayashi inner bound for the DM-IC
◮ Capacity region of a class of deterministic DM-ICs
◮ Capacity region of the AWGN-IC within half a bit
◮ Sum-capacity of the AWGN-IC under weak interference


Converse

The lectures discuss only weak converses

The tools are introduced gradually:

◮ DMC: Fano's inequality; convexity (data processing inequality); Markovity (memoryless)
◮ AWGN: Gaussian maximizes differential entropy under a power constraint
◮ MAC: time-sharing random variable
◮ Degraded BC: Gallager's identification of the auxiliary random variable; bounding cardinality
◮ Binary symmetric BC: Mrs. Gerber's lemma
◮ AWGN-BC: EPI
◮ More capable/less noisy BC: Csiszár's sum identity
◮ Strong interference: extension of more capable from scalars to vectors
◮ Deterministic IC: genie
◮ Weak interference: Gaussian worst noise
◮ Vector Gaussian BC: MAC/BC duality; convex optimization
◮ Quadratic Gaussian distributed coding: MMSE


Extension to Networks

The lectures include extensions (or the lack thereof) of results for ≤ 3 users to networks

In some rare cases the results extend naturally to many users:
◮ MAC
◮ Degraded BC
◮ MIMO BC
◮ Slepian–Wolf

In most cases the results don't extend, and naive extensions of results for ≤ 3 users can be improved using new coding techniques

The lectures provide several examples of such cases


Extension to Networks

Some Interesting Extensions

Inner bound for DM-BC with degraded message sets for 3 receivers

Marton for ≥ 3 receivers

General BC inner bound construction for ≥ 3 receivers

Network coding for multicast noiseless networks and special cases

Decode–forward

Compress–forward

Slepian–Wolf over noiseless broadcast network (CFO problem)

Wiretap channel with > 2 receivers; key generation for many sources

Several cutset bounds for various types of networks

Scaling laws and high SNR approximations for Gaussian networks


Page 155: Lectures on Network Information TheoryAbbas El Gamal StanfordUniversity Allerton 2009 A.ElGamal (Stanford University) LecturesonNIT Allerton2009 1/42 TheEarlyYears I started a course

Conclusion

Conclusion

Lectures on NIT:◮ Top-down organization◮ Balances introduction of new tools and models◮ Elementary tools and proof techniques for most material◮ Unified approach to achievability◮ Comprehensive coverage of key results◮ Extensions to networks

A. El Gamal (Stanford University) Lectures on NIT Allerton 2009 41 / 42

Page 156: Lectures on Network Information TheoryAbbas El Gamal StanfordUniversity Allerton 2009 A.ElGamal (Stanford University) LecturesonNIT Allerton2009 1/42 TheEarlyYears I started a course

Conclusion

Conclusion

Lectures on NIT:◮ Top-down organization◮ Balances introduction of new tools and models◮ Elementary tools and proof techniques for most material◮ Unified approach to achievability◮ Comprehensive coverage of key results◮ Extensions to networks

Some of the basic material ready to be included in graduate comm course sequences (with introductory IT course as prereq)


We plan to make the teaching subset of the lectures available early next year

A. El Gamal (Stanford University) Lectures on NIT Allerton 2009 41 / 42



Acknowledgments

Many people have contributed to the development of the lectures over the years:

◮ Many of my graduate students

◮ My course TAs

◮ Students that took the class


Tom Cover has been an inspiring and encouraging figure throughout


Partial financial support from NSF and DARPA ITMANET

A. El Gamal (Stanford University) Lectures on NIT Allerton 2009 42 / 42