Cooperative Communication Fundamentals & Coding Techniques
13th ICACT Tutorial, 2011.2.14
Kiung Jung, [email protected]
Electronics and Telecommunications Research Institute
Coverage
1. Preparatory Overview of Information Theory Basics
   • Information theory basics & channel capacity
   • Network information theory basics
   • Cooperative communication examples
     - Simulcast transmission
     - Relay transmission and iterative diversity
Break
2. Coding Techniques for Cooperative Communication
   • Principal design tools
   • Cooperative coding schemes in wireless cooperative channels
Section 1. Preparatory Overview of Information Theory Basics
Section 1. Preparatory Overview – Introduction
What is cooperative communication?
Classically, it refers to the relay channel.
We consider it in a broader aspect, as multi-terminal communication: the terminals jointly use the medium as a shared resource instead of one that is divided orthogonally among pairs of terminals.
Motivation
Wireless networks play an increasingly important role in modern communications.
Typical wireless channel usage: avoiding interference among users by using orthogonally divided channels.
Capacity-approaching approach: information-theory-based resource sharing among multiple terminals, rather than orthogonally divided channels.
An example: T. M. Cover's Broadcast Channel, 1972
Why Now?
Information-theoretic study of cooperation began in the early 1970s.
It did NOT attract significant interest until 2003 and beyond.
Why Now?
Importance of mobile communication
- Fading channels
Importance of networked communication
- Multiple simultaneous communications
Feasibility of implementation
- Signal-processing capability
- Modern error-correction codes
Section 1. Preparatory Overview – Information theory basics & channel capacity
Review: Information Theory Basics
Entropy
- Uncertainty
- Average amount of information per source output
  (* "You became a father!!")

  $H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x) = E_p\left[\log \frac{1}{p(X)}\right]$

→ A function of the distribution of X
* From Thomas Cover et al., Elements of Information Theory
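A minimal numerical sketch of this definition (assuming NumPy is available; the distributions are illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

# A fair coin carries 1 bit per outcome; a biased one carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469
```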
Review: Information Theory Basics
Entropy
- A measure of the uncertainty of a random variable.
- A function of the distribution of the random variable, which depends only on the probabilities.
- Average amount of information per source output.
  → Channel capacity
- Average length of the shortest description of the random variable.
  → The expected description length must be greater than or equal to the entropy.
  → Data compression
Review: Information Theory Basics
Joint Entropy
  By the definition of entropy,
  $H(X, Y) = -\sum_x \sum_y p(x, y) \log p(x, y) = E_p\left[\log \frac{1}{p(X, Y)}\right]$
Conditional Entropy
  $H(X | Y) = -\sum_x \sum_y p(x, y) \log p(x | y) = E_p\left[\log \frac{1}{p(X | Y)}\right]$
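Continuing the sketch above, joint and conditional entropy can be computed from a small (illustrative) joint pmf using the chain rule H(X|Y) = H(X,Y) − H(Y):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Example joint pmf p(x, y): rows index x, columns index y.
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])

H_xy = entropy(pxy)                  # joint entropy H(X, Y)
H_y = entropy(pxy.sum(axis=0))       # marginal entropy H(Y)
H_x_given_y = H_xy - H_y             # chain rule: H(X|Y) = H(X,Y) - H(Y)
print(H_xy, H_y, H_x_given_y)
```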
Review: Information Theory Basics
Mutual Information
A measure of the amount of information that one random variable contains about another random variable.
  $I(X; Y) = \sum_x \sum_y p(x, y) \log \frac{p(x, y)}{p(x) p(y)} = E_{p(x,y)}\left[\log \frac{p(X, Y)}{p(X) p(Y)}\right]$
Then,
  $I(X; Y) = H(X) - H(X | Y)$
→ The reduction in the uncertainty of X due to the knowledge of Y.
Channel Capacity
  $C = \max_{p(x)} I(X; Y)$
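Mutual information can likewise be checked numerically. The sketch below evaluates I(X;Y) for a binary symmetric channel with a uniform input, which is known to achieve the BSC capacity C = 1 − H(ε):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy)

# BSC with crossover eps and uniform (capacity-achieving) input.
eps = 0.1
pxy = 0.5 * np.array([[1 - eps, eps],
                      [eps, 1 - eps]])
print(mutual_information(pxy))   # ~0.531 = 1 - H(0.1)
```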
Review: Information Theory Basics
The Asymptotic Equipartition Property (AEP)
- The analog of the law of large numbers.
- The law of large numbers states that, for i.i.d. random variables,
  $\frac{1}{n}\sum_{i=1}^{n} X_i \to E[X]$ for large $n$.
- The AEP states that
  $\frac{1}{n}\log\frac{1}{p(X_1, X_2, \ldots, X_n)} \to H(X)$,
  where $X_1, X_2, \ldots, X_n$ are i.i.d. and $p(X_1, X_2, \ldots, X_n)$ is the probability of observing the sequence $X_1, X_2, \ldots, X_n$. So,
  $p(X_1, X_2, \ldots, X_n) \approx 2^{-nH}$.
- This gives rise to the typical set: the sequences whose sample entropy is close to the true entropy.
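A quick empirical check of the AEP for a Bernoulli source (parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.8, 0.2])                 # Bernoulli(0.2) source
H = -np.sum(p * np.log2(p))              # true entropy, ~0.722 bits

for n in (10, 100, 10_000):
    x = rng.choice(2, size=n, p=p)       # i.i.d. draw X_1, ..., X_n
    log_prob = np.sum(np.log2(p[x]))     # log2 p(X_1, ..., X_n)
    print(n, -log_prob / n)              # sample entropy -> H as n grows
```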
Review: Information Theory Basics
Typical Set $A_\epsilon^{(n)}$
- Typical sequences determine the average behavior of a large sample with high probability.
- The number of elements in the typical set is nearly $2^{nH}$:
  $|A_\epsilon^{(n)}| \le 2^{n(H(X) + \epsilon)}$,
  where $\epsilon$ is an arbitrarily small number according to an appropriate choice of $n$.
- Non-typical set: up to $|\mathcal{X}|^n$ elements; typical set $A_\epsilon^{(n)}$: $\approx 2^{nH}$ elements.
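Counting the typical set by brute force for a short block length illustrates the size estimate (illustrative parameters; this enumerates all 2^n sequences, so n is kept small):

```python
import itertools
import numpy as np

p = np.array([0.8, 0.2])
H = -np.sum(p * np.log2(p))              # ~0.722 bits
n, eps = 12, 0.1

typical = 0
for x in itertools.product((0, 1), repeat=n):
    sample_H = -np.sum(np.log2(p[list(x)])) / n
    if abs(sample_H - H) <= eps:
        typical += 1

# |A| is on the order of 2^{nH} ~ 2^8.7 ~ 400, out of 2^12 = 4096 sequences.
print(typical, 2 ** n, 2 ** (n * H))
```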
Review: Information Theory Basics
Theorem 3.2.1 (T. M. Cover, p. 54)
- Let $X^n$ be i.i.d. $\sim p(x)$. Let $\epsilon > 0$. Then there exists a code which maps sequences $x^n$ of length $n$ into binary strings such that the mapping is one-to-one (and therefore invertible) and
  $E\left[\frac{1}{n} l(X^n)\right] \le H(X) + \epsilon$ for $n$ sufficiently large,
  where $x^n$ denotes a sequence $x_1, x_2, \ldots, x_n$ and $l(x^n)$ is the length of the codeword corresponding to $x^n$.
- Thus we can represent sequences $X^n$ using $nH(X)$ bits on the average.
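Huffman coding is one concrete code that comes within one bit of this bound; a standard heapq-based sketch with an illustrative pmf:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    # Heap entries: (probability, tie-breaker, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        for s in s1 + s2:            # each merge adds one bit to every member
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
print(L, avg, H)   # H <= avg < H + 1; this dyadic pmf gives avg == H == 1.75
```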
Review: Channel Capacity
Information Channel Capacity
  $C = \max_{p(x)} I(X; Y)$
Operational Channel Capacity
- Highest rate in bits per channel use at which information can be sent with arbitrarily low probability of error.
Shannon's second theorem establishes:
  Information Channel Capacity = Operational Channel Capacity
Shannon’s second theorem
- Operational meaning to the definition of capacity as the number
of bits we can transmit reliably over the channel.
Basic Idea
- For each (typical) input n-sequences, there are approximately
possible Y sequences, all of them equally likely.
- The total number of possible (typical) Y sequences is .
- This set has to be divided into sets of size corresponding
to the different input sequences.
Review: Channel Capacity
XYnH |2
W WnX nY
YnH2 XYnH |2
X
Review: Channel Capacity
Basic Idea (continued)
- The total number of disjoint sets is $2^{n(H(Y) - H(Y|X))} = 2^{nI(X;Y)}$.
- Hence we can send at most $2^{nI(X;Y)}$ distinguishable sequences of length $n$.
Theorem 8.7.1 (T. M. Cover, p. 198)
  The channel coding theorem: All rates below capacity $C$ are achievable. Specifically, for every rate $R < C$, there exists a sequence of $(2^{nR}, n)$ codes with maximum probability of error $\lambda^{(n)} \to 0$.
  Conversely, any sequence of $(2^{nR}, n)$ codes with $\lambda^{(n)} \to 0$ must have
  $R \le C = \max_{p(x)} I(X; Y)$.
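Plugging in a binary symmetric channel makes the counting concrete (numbers are illustrative):

```python
import math

def H2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# BSC with crossover 0.1, uniform input, block length n.
eps, n = 0.1, 1000
H_Y = 1.0                    # output is uniform for a uniform input
H_Y_given_X = H2(eps)        # noise entropy per channel use
I = H_Y - H_Y_given_X        # ~0.531 bits per use

# ~2^{nH(Y)} typical outputs, split into fans of ~2^{nH(Y|X)} each:
print(f"distinguishable inputs ~ 2^{n * I:.0f} = 2^(nI)")
```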
Review: Channel Capacity
Gaussian Channel
- The most common limitation on the input is a power constraint.
- Input and output random variables are continuous.
  $Y_i = X_i + Z_i, \quad Z_i \sim \mathcal{N}(0, N)$, with power constraint $\frac{1}{n}\sum_{i=1}^{n} x_i^2 \le P$.
- Differential entropy of a continuous random variable:
  $h(X) = -\int_S f(x) \log f(x)\, dx$, where $f(x)$ is the pdf of $X$.
- For a normal distribution (T. M. Cover, p. 225):
  $X \sim \mathcal{N}(0, \sigma^2) \Rightarrow h(X) = \frac{1}{2}\log 2\pi e \sigma^2$ bits.
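The closed form can be verified by Monte Carlo, since h(X) = E[−log₂ f(X)] (a sketch, assuming NumPy):

```python
import numpy as np

sigma2 = 4.0
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma2)   # ~2.55 bits

# Monte Carlo estimate of E[-log2 f(X)] for X ~ N(0, sigma2).
rng = np.random.default_rng(1)
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
log_f = -0.5 * np.log2(2 * np.pi * sigma2) - x**2 / (2 * sigma2 * np.log(2))
print(h_closed, -np.mean(log_f))   # the two values agree closely
```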
Review: Channel Capacity
Information Capacity in the Gaussian Channel
  $I(X; Y) = h(Y) - h(Y|X) = h(Y) - h(X + Z | X) = h(Y) - h(Z | X) = h(Y) - h(Z)$, since $X$ and $Z$ are independent,
  where $h(Z) = \frac{1}{2}\log 2\pi e N$ and $E[Y^2] = E[(X + Z)^2] = P + N$ (with $E[X^2] \le P$, $E[Z^2] = N$).
With the power constraint $E[X^2] \le P$,
  $C = \max_{p(x):\, E[X^2] \le P} I(X; Y)$,
  $I(X; Y) = h(Y) - h(Z) \le \frac{1}{2}\log 2\pi e (P + N) - \frac{1}{2}\log 2\pi e N = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$.
Review: Channel Capacity
Information Capacity in the Gaussian Channel
  $C = \max_{p(x):\, E[X^2] \le P} I(X; Y) = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$ bits per transmission.
A rate $R$ is said to be achievable for a Gaussian channel with a power constraint $P$ if there exists a sequence of $(2^{nR}, n)$ codes with codewords satisfying the power constraint such that the maximal probability of error tends to zero.
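In code, the AWGN capacity formula is a one-liner (SNR values are illustrative):

```python
import math

def awgn_capacity(P, N):
    """C = (1/2) log2(1 + P/N) bits per channel use."""
    return 0.5 * math.log2(1 + P / N)

print(awgn_capacity(P=1.0, N=1.0))     # 0.5 bit/use at 0 dB SNR
print(awgn_capacity(P=100.0, N=1.0))   # ~3.33 bits/use at 20 dB SNR
```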
Review: Channel Capacity
Band-Limited Gaussian Channel
- A common model for communication over a radio network is a band-limited channel with white noise.
Nyquist-Shannon sampling theorem
- Sampling a band-limited signal at a sampling interval of $\frac{1}{2W}$ (i.e., $2W$ samples per second) is sufficient to reconstruct the signal from the samples.
Review: Channel Capacity
So, for a band-limited Gaussian channel, if the noise PSD is $\frac{N_0}{2}$, the bandwidth is $W$, and the time interval is $(0, T)$:
each sample carries signal power $\frac{P}{2W}$ against noise variance $\frac{N_0}{2}$, so
  $C = \frac{1}{2}\log\left(1 + \frac{P/2W}{N_0/2}\right) = \frac{1}{2}\log\left(1 + \frac{P}{N_0 W}\right)$ bits per sample.
Since there are $2W$ samples each second,
  $C = W \log\left(1 + \frac{P}{N_0 W}\right)$ bits per second.
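A sketch of the band-limited formula; the telephone-channel-like numbers are illustrative:

```python
import math

def bandlimited_capacity(P, N0, W):
    """C = W log2(1 + P / (N0 * W)) bits per second."""
    return W * math.log2(1 + P / (N0 * W))

# W = 3 kHz, SNR = P/(N0*W) = 1000 (30 dB) -> ~29.9 kbit/s.
print(bandlimited_capacity(P=1.0, N0=1.0 / (1000 * 3000), W=3000.0))
```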
Review: Channel Capacity
Shannon limit
  The limiting value of $E_b/N_0$ below which there can be no error-free communication at any information rate.
Bandwidth Efficiency
  A measure of how much data can be communicated in a specified bandwidth within a given time: $\eta = \frac{R}{W}$.
[Figure: The Bandwidth Efficiency Diagram — bandwidth efficiency $R_b/W$ (bit/sec/Hz) versus $E_b/N_0$ (dB), showing the Shannon limit and the capacity boundary $R = C$.]
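The Shannon limit follows from the capacity boundary: with $P = E_b R$ and $\eta = R/W$, reliable communication requires $\eta \le \log_2(1 + \eta\, E_b/N_0)$, i.e. $E_b/N_0 \ge (2^\eta - 1)/\eta$, which tends to $\ln 2 \approx -1.59$ dB as $\eta \to 0$. A small sketch:

```python
import math

# Eb/N0 lower bound as a function of spectral efficiency eta = R/W.
for eta in (2.0, 1.0, 0.5, 0.1, 0.001):
    ebno_min = (2**eta - 1) / eta
    print(f"eta={eta:6.3f}  Eb/N0 >= {10 * math.log10(ebno_min):6.2f} dB")

# As eta -> 0 the bound tends to ln 2, the Shannon limit (-1.59 dB).
print(10 * math.log10(math.log(2)))
```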
Section 1. Preparatory Overview – Network information theory basics
Multiple Access Channel (MAC)
Two or more senders and a common receiver.
Definition: A rate pair $(R_1, R_2)$ is said to be achievable for the multiple access channel if there exists a sequence of $((2^{nR_1}, 2^{nR_2}), n)$ codes with $P_e^{(n)} \to 0$.
Multiple Access Channel (MAC)
Capacity region
  The closure of the convex hull of the set of rate pairs $(R_1, R_2)$ satisfying
  $R_1 \le I(X_1; Y | X_2)$,
  $R_2 \le I(X_2; Y | X_1)$,
  $R_1 + R_2 \le I(X_1, X_2; Y)$.
[Figure: The MAC capacity region — a pentagon in the $(R_1, R_2)$ plane with corner points A, B, C, D, bounded by $I(X_1; Y)$ and $I(X_1; Y | X_2)$ on the $R_1$ axis, and $I(X_2; Y)$ and $I(X_2; Y | X_1)$ on the $R_2$ axis.]
Multiple Access Channel (MAC)
Point A
- Maximum rate achievable by sender 1 when sender 2 is not sending any information:
  $\max_{p(x_1)} \max_{p(x_2)} R_1 = I(X_1; Y | X_2)$.
Point B
- Maximum rate sender 2 can send as long as sender 1 sends at this maximum rate.
- $X_1$ is considered as noise for the channel $X_2 \to Y$.
Multiple Access Channel (MAC)
Gaussian Multiple Access Channel
- For some input distribution $f_1(x_1) f_2(x_2)$ with power constraints $E[X_1^2] \le P_1$ and $E[X_2^2] \le P_2$.
- Define the channel capacity function with signal-to-noise ratio $x$:
  $C(x) = \frac{1}{2}\log(1 + x)$.
Then,
  $R_1 \le C\left(\frac{P_1}{N}\right)$,
  $R_2 \le C\left(\frac{P_2}{N}\right)$,
  $R_1 + R_2 \le C\left(\frac{P_1 + P_2}{N}\right)$.
[Figure: The Gaussian MAC capacity region, with corner coordinates $C(P_1/N)$, $C(P_2/N)$, $C(P_1/(P_2+N))$, and $C(P_2/(P_1+N))$.]
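A numeric sketch of the region's corner points (powers are illustrative). The corner where sender 1 transmits at its single-user rate corresponds to successive cancellation: $X_2$ is decoded first while treating $X_1$ as noise, then subtracted:

```python
import math

def C(x):
    """Capacity function C(x) = 0.5 * log2(1 + x)."""
    return 0.5 * math.log2(1 + x)

P1, P2, N = 10.0, 5.0, 1.0
R1_max = C(P1 / N)            # single-user bound for sender 1
R2_max = C(P2 / N)            # single-user bound for sender 2
Rsum = C((P1 + P2) / N)       # sum-rate bound

# Corner point: sender 1 at full rate; sender 2 is decoded first,
# treating X1 as noise: R2 = C(P2 / (P1 + N)).
R2_corner = C(P2 / (P1 + N))
print(R1_max, R2_max, Rsum, R1_max + R2_corner)   # last value equals Rsum
```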
Encoding of Correlated Sources
Suppose there are two correlated sources $(X, Y)$.
What if the X-source and the Y-source must be encoded separately?
  $R_1 \ge H(X | Y)$,
  $R_2 \ge H(Y | X)$,
  $R_1 + R_2 \ge H(X, Y)$.
Slepian-Wolf Encoding
Random binning procedure
- Very similar to hash functions.
- With high probability, different source sequences have different indices, and we can recover the source sequence from the index.
[Figure: Rate region for Slepian-Wolf encoding.]
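A toy sketch of binning via hashing (SHA-256 is an arbitrary choice here; any good hash works). Only the encoder side is shown; the decoder, which searches candidates jointly typical with its side information for a matching bin index, is indicated in the comment:

```python
import hashlib
import numpy as np

def bin_index(seq, rate_bits):
    """Hash a source sequence into one of 2**rate_bits bins."""
    data = np.asarray(seq, dtype=np.uint8).tobytes()
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest[:8], "big") % (2 ** rate_bits)

rng = np.random.default_rng(2)
x = rng.integers(0, 2, size=100, dtype=np.uint8)

# The encoder sends only the bin index (~ n*R bits); the decoder searches
# its candidate list (sequences jointly typical with side info Y) for one
# whose bin index matches.
print(bin_index(x, rate_bits=60))
```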
Broadcast Channel
One sender and two or more receivers.
TV station – Superposition of information
We may wish to arrange the information in such a way that the
better receivers receive extra information, which produces a better
picture or sound.
Broadcast Diversity
• Conceptual example
Broadcast Diversity
Relay Channel
One sender and one receiver with a number of intermediate nodes.
The relay channel combines a broadcast channel and a multiple access channel.
  $C \le \sup_{p(x, x_1)} \min\left\{\, I(X, X_1; Y),\; I(X; Y, Y_1 | X_1) \,\right\}$
  (the cut-set upper bound; it is tight for the degraded relay channel).
Section 1. Preparatory Overview – Cooperative communication examples
Example I: Relay Transmission
Decode-and-forward, Laneman, 2002
* A diversity channel is created only when the relay decodes successfully.
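A Monte Carlo sketch of this behavior under simplifying assumptions (Rayleigh fading, selection decode-and-forward, maximal-ratio combining at the destination, and ignoring the half-duplex rate penalty); the parameters are illustrative, not from the slide:

```python
import numpy as np

rng = np.random.default_rng(3)
snr, R, trials = 100.0, 1.0, 200_000        # 20 dB SNR, 1 bit/s/Hz target

# Rayleigh fading power gains on the three links.
g_sd = rng.exponential(size=trials)         # source -> destination
g_sr = rng.exponential(size=trials)         # source -> relay
g_rd = rng.exponential(size=trials)         # relay -> destination

# The relay forwards only when the source-relay link supports the rate.
relay_ok = np.log2(1 + snr * g_sr) > R

# If the relay decoded, the destination MRC-combines S-D and R-D;
# otherwise it relies on the direct link alone.
snr_dest = snr * g_sd + np.where(relay_ok, snr * g_rd, 0.0)
outage = np.mean(np.log2(1 + snr_dest) < R)
print(outage)   # decays ~1/SNR^2 (diversity order 2) vs ~1/SNR direct-only
```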
Example I: Relay Transmission
Amplify-and-forward, Laneman, 2002
* The relay also amplifies its own receiver noise.
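The standard two-hop AF equivalent SNR captures this noise amplification; a small sketch (the direct link and fading are omitted for brevity):

```python
def af_snr(snr_sr, snr_rd):
    """End-to-end SNR of a two-hop amplify-and-forward link.

    The relay amplifies signal plus its own receiver noise, giving
    snr_eq = snr_sr * snr_rd / (snr_sr + snr_rd + 1).
    """
    return snr_sr * snr_rd / (snr_sr + snr_rd + 1)

# Always below the weaker hop: the relayed noise costs up to ~3 dB.
print(af_snr(100.0, 100.0))   # ~49.75, vs min(100, 100) = 100
```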
Example I: Relay Transmission
Coded cooperation, T. E. Hunter, 2002
* Based on Rate-Compatible Punctured Convolutional (RCPC) codes.
Example I: Relay Transmission
Cooperation using turbo codes, Zhao and Valenti, 2003
* Two separate Recursive Systematic Convolutional (RSC) encoders constitute the turbo encoder, followed by iterative decoding between the relay and the destination.
“Decode and permute”
Example I: Iterative Diversity
Collaborative decoding, Arun and J. M. Shea, 2006
- System model for collaborative decoding
- Principle of collaborative decoding with two nodes
Example I: Iterative Diversity
[Figure: Performance of two collaborative decoding schemes in which receivers request information for a set of least-reliable bits (3 iterations, 5% LRB exchange, packet size 900).]
Example II: Simulcast Transmission
Simulcast by using Non-Uniform QPSK UEP signaling,
K. Jung & J. Shea, 2005
b1: Basic message
b2: Additional message
• Typical required bit error probability
  - for voice: 10^-2
  - for data: 10^-4
Example II: Simulcast Transmission
• Example of transmission range calculation
  - Signaling: 4-PSK
  - Degradation: 0.3 dB (θ = 15°)
  - Disparity: 11.4 dB (P_B = P_A)
  - Original transmission range with BPSK: 100 m
  - Propagation constant n: 4
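A back-of-the-envelope check of these numbers, assuming received power falls off as $d^{-n}$ so that a margin of $D$ dB scales range by $10^{D/(10n)}$; the computed ranges are illustrative readings of the slide's parameters:

```python
import math

# Path-loss model: received power ~ d^-n, so a D dB margin scales the
# range by 10**(D / (10 * n)).  Slide parameters: n = 4, 100 m baseline,
# 11.4 dB disparity for the basic message, 0.3 dB degradation overall.
n, base_range = 4, 100.0
extended = base_range * 10 ** (11.4 / (10 * n))   # basic-message range
reduced = base_range * 10 ** (-0.3 / (10 * n))    # 0.3 dB degradation
print(extended, reduced)   # ~192.8 m and ~98.3 m under these assumptions
```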
Example II: Simulcast Transmission
Simulcast by using Non-Uniform QPSK UEP signaling
[Figure: Simulation results of end-to-end (ETE) throughput at n = 4, Poisson distribution of H(θ).]
ETE throughput gain: $G = S_{ete} / S_{ete}^{0}$