Code and Decoder Design of LDPC Codes for Gbps Systems
Jeremy Thorpe
Presented to: Microsoft Research 2002.11.25
Talk Overview
- Introduction (13 slides)
- Wiring Complexity (9 slides)
- Logic Complexity (7 slides)
Reliable Communication over Unreliable Channels
The channel is the means by which information is communicated from sender to receiver.
- The sender chooses X.
- The channel generates Y from the conditional probability distribution P(Y|X).
- The receiver observes Y.
[Diagram: X → channel P(Y|X) → Y]
Shannon’s Channel Coding Theorem
Using the channel n times, we can communicate k bits of information with probability of error as small as we like, provided that
$$R = \frac{k}{n} < C$$
and n is large enough. C is a number (the channel capacity) that characterizes any channel.
The same is impossible if R > C.
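To make the bound concrete, here is a small sketch for one specific channel. The slides do not commit to a particular channel model, so the binary symmetric channel (BSC) with crossover probability p is an assumed example:

```python
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity C (bits per channel use) of a binary symmetric channel
    that flips each transmitted bit independently with probability p."""
    if p in (0.0, 1.0):
        return 1.0
    h2 = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy of p
    return 1.0 - h2

# Rates R = k/n below C are achievable; rates above C are not.
print(bsc_capacity(0.1))
```

For p = 0.1 this gives C ≈ 0.53 bits per channel use, so any reliable scheme must use a rate R = k/n below about 0.53.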
The Coding Strategy
The encoder chooses the m-th codeword in codebook C and transmits it across the channel.
The decoder observes the channel output y and generates m′ based on knowledge of the codebook C and the channel statistics.

[Diagram: $m \in \{0,1\}^k$ → Encoder → $x \in C \subseteq X^n$ → Channel → $y \in Y^n$ → Decoder → $m' \in \{0,1\}^k$]
Linear Codes
A linear code C can be defined in terms of either a generator matrix or a parity-check matrix.
- Generator matrix G (k×n): $C = \{mG : m \in \{0,1\}^k\}$
- Parity-check matrix H ((n−k)×n): $C = \{c : Hc^T = 0\}$
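As a sketch of the two definitions, the following builds a toy systematic code (the particular matrices G = [I | P] and H = [Pᵀ | I] are illustrative choices, not from the talk) and checks that a codeword generated by G satisfies every parity check in H:

```python
import numpy as np

# Toy (n=7, k=4) systematic code: G = [I | P], H = [P^T | I],
# chosen so that G @ H.T = 0 (mod 2). Illustrative only.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix, k x n
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix, (n-k) x n

m = np.array([1, 0, 1, 1])       # k message bits
c = m @ G % 2                    # codeword: c = mG
print(c)
assert np.all(H @ c % 2 == 0)    # every codeword satisfies H c^T = 0
```

Systematic form is just one convenient choice; any full-rank G and H with $GH^T = 0$ define the same kind of correspondence.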
Regular LDPC Codes
LDPC codes are linear codes defined in terms of H:
- The number of ones in each column of H is a fixed number λ.
- The number of ones in each row of H is a fixed number ρ.
- Typical parameters for regular LDPC codes are (λ, ρ) = (3, 6).
Graph Representation of LDPC Codes
H is represented by a bipartite graph:
- Nodes v (degree λ) on the left represent variables.
- Nodes c (degree ρ) on the right represent equations: $\sum_{v|c} x_v = 0$

[Figure: variable nodes on the left joined by edges to check nodes on the right]
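Such a regular bipartite graph can be generated with the socket model (a standard pairing-by-permutation construction, assumed here rather than taken from the talk): give each variable node λ edge sockets and each check node ρ sockets, then pair the sockets at random. A minimal sketch, in which repeated edges are simply collapsed:

```python
import numpy as np

def regular_ldpc(n, lam=3, rho=6, seed=0):
    """Random (lam, rho)-regular parity-check matrix via the socket model:
    n variable nodes of degree lam, n*lam/rho check nodes of degree rho."""
    assert (n * lam) % rho == 0
    m = n * lam // rho
    rng = np.random.default_rng(seed)
    var_sockets = np.repeat(np.arange(n), lam)   # lam sockets per variable
    chk_sockets = np.repeat(np.arange(m), rho)   # rho sockets per check
    rng.shuffle(chk_sockets)                     # random socket pairing
    H = np.zeros((m, n), dtype=int)
    H[chk_sockets, var_sockets] = 1              # repeated edges collapse to 1
    return H

H = regular_ldpc(120)   # a (3,6)-regular code of length 120, up to collisions
```

A careful construction would resample repeated edges (and, as discussed later, avoid short loops); this sketch only shows the degree structure.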
Message-Passing Decoding of LDPC Codes
Message Passing (or Belief Propagation) decoding is a low-complexity algorithm which approximately answers the question "what is the most likely x given y?"
MP recursively defines messages $m^{(i)}_{v,c}$ and $m^{(i)}_{c,v}$ between each variable node v and each adjacent check node c, for iterations i = 0, 1, ...
Two Types of Messages...
Likelihood Ratio:
$$\lambda_{x,y} = \frac{p(y|x=1)}{p(y|x=0)}$$
For $y_1,\dots,y_n$ independent conditionally on x:
$$\lambda_{x,y_1^n} = \prod_i \lambda_{x,y_i}$$

Probability Difference:
$$\delta_{x,y} = p(x=0|y) - p(x=1|y)$$
For $x_1,\dots,x_n$ independent:
$$\delta_{x_1\oplus\cdots\oplus x_n,\,y} = \prod_i \delta_{x_i,y_i}$$
...Related by the Bilinear Transform

Definition:
$$B(x) = \frac{1-x}{1+x}$$

Properties (for equiprobable x):
$$B(\lambda_{x,y}) = \frac{p(y|x=0) - p(y|x=1)}{p(y|x=0) + p(y|x=1)} = \frac{2p(x=0|y)p(y) - 2p(x=1|y)p(y)}{2p(y)} = p(x=0|y) - p(x=1|y) = \delta_{x,y}$$
$$B(B(x)) = x$$
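A quick numerical check of these two properties (the probability values below are arbitrary illustrations):

```python
def B(x):
    """Bilinear transform relating likelihood ratios and probability differences."""
    return (1.0 - x) / (1.0 + x)

# B is an involution: applying it twice recovers the input.
for x in [0.25, 0.5, 2.0, 9.0]:
    assert abs(B(B(x)) - x) < 1e-12

# B maps the likelihood ratio to the probability difference,
# assuming equiprobable x so that p(x|y) is proportional to p(y|x).
p0, p1 = 0.9, 0.1                # p(y|x=0), p(y|x=1) for one fixed y
lam = p1 / p0                    # likelihood ratio lambda
delta = (p0 - p1) / (p0 + p1)    # p(x=0|y) - p(x=1|y) under a uniform prior
assert abs(B(lam) - delta) < 1e-12
```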
Variable to Check Messages
On any iteration i, the message from v to c is:
$$m^{(i)}_{v,c} = \lambda_{x_v,y_v} \prod_{c'|v,\ c'\neq c} B\!\left(m^{(i-1)}_{c',v}\right)$$

[Figure: variable node v combining the channel message with messages from its other checks]
Check to Variable Messages
On any iteration, the message from c to v is:
$$m^{(i)}_{c,v} = \prod_{v'|c,\ v'\neq v} B\!\left(m^{(i)}_{v',c}\right)$$

[Figure: check node c combining messages from its other variables]
Decision Rule
After sufficiently many iterations, threshold the overall likelihood ratio:
$$\hat{x}_v = \begin{cases} 0, & \text{if } B\!\left(\lambda_{x_v,y_v} \prod_{c|v} B\!\left(m^{(i-1)}_{c,v}\right)\right) > 0 \\ 1, & \text{otherwise} \end{cases}$$
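Putting the three update rules together, here is a minimal sketch of the decoder. The binary symmetric channel model, the tiny Hamming (7,4) parity-check matrix, and the fixed iteration count are all assumptions for illustration; messages are kept in the λ/δ domains of the slides, with B converting between them:

```python
import numpy as np

def B(x):
    """Bilinear transform; converts between the likelihood-ratio and
    probability-difference message domains (and is its own inverse)."""
    return (1.0 - x) / (1.0 + x)

def decode_bsc(H, y, p, num_iters=20):
    """Message-passing decoding of a code with parity-check matrix H,
    received word y, over a BSC with crossover probability p."""
    m, n = H.shape
    # Channel likelihood ratios lambda_v = p(y_v|x_v=1) / p(y_v|x_v=0).
    lam = np.where(y == 1, (1 - p) / p, p / (1 - p))
    # Check-to-variable messages (delta domain); 0 carries no information, B(0)=1.
    mc = np.zeros((m, n))
    for _ in range(num_iters):
        # Variable-to-check: channel ratio times B of the other check messages.
        mv = np.zeros((m, n))
        for c, v in zip(*np.nonzero(H)):
            prod = lam[v]
            for c2 in np.nonzero(H[:, v])[0]:
                if c2 != c:
                    prod *= B(mc[c2, v])
            mv[c, v] = prod
        # Check-to-variable: product of B of the other variable messages.
        for c, v in zip(*np.nonzero(H)):
            prod = 1.0
            for v2 in np.nonzero(H[c])[0]:
                if v2 != v:
                    prod *= B(mv[c, v2])
            mc[c, v] = prod
    # Decision: combine the channel ratio with all incoming check messages.
    total = lam.copy()
    for c, v in zip(*np.nonzero(H)):
        total[v] *= B(mc[c, v])
    return (total > 1).astype(int)   # ratio > 1 favours x_v = 1

# Hamming (7,4) parity-check matrix as a small test graph (it has loops,
# so this exercises MP as an approximation rather than exact MAP).
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
y = np.array([1, 0, 0, 0, 0, 0, 0])  # all-zero codeword with bit 0 flipped
x_hat = decode_bsc(H, y, p=0.1)
```

On this small graph the decoder recovers the all-zero codeword after a handful of iterations; a Gbps implementation would instead unroll these per-edge updates into parallel logic units, as the next sections discuss.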
Theorem about MP Algorithm
If the algorithm stops after r iterations, then it returns the maximum a posteriori probability estimate of $x_v$ given y within radius r of v.
However, this holds only if the variables within radius r of v are constrained solely by the equations within radius r of v (that is, the depth-r neighborhood of v must be loop-free).

[Figure: depth-r tree neighborhood of variable node v]
Wiring Complexity
Physical Implementation (VLSI)
We have seen that the MP decoding algorithm for LDPC codes is defined in terms of a graph:
- Nodes perform local computation.
- Edges carry messages from v to c, and from c to v.
Instantiate this graph on a chip: edges → wires, nodes → logic units.
Complexity vs. Performance
Longer codes provide:
- More efficient use of the channel (e.g., less power used over the AWGN channel)
- Faster throughput for fixed technology and decoding parameters (number of iterations)
Longer codes demand:
- More logic resources
- Far more wiring resources
The Wiring Problem
The number of edges in the graph grows like the number of nodes n.
The length of the edges in a random graph also grows, like $\sqrt{n}$ (the side length of a constant-density layout).

[Figure: a random graph]
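A quick experiment supports this scaling claim. Placing n nodes at constant density on a chip of side √n, the average length of a randomly chosen edge grows like √n (the node and edge counts below are illustrative):

```python
import numpy as np

def mean_edge_length(n, seed=0):
    """Average Euclidean length of random edges among n nodes placed
    uniformly on a sqrt(n) x sqrt(n) area (constant node density)."""
    rng = np.random.default_rng(seed)
    side = np.sqrt(n)
    pts = rng.uniform(0.0, side, size=(n, 2))
    ends = rng.integers(0, n, size=(3 * n, 2))   # 3n random edges
    return np.linalg.norm(pts[ends[:, 0]] - pts[ends[:, 1]], axis=1).mean()

# Quadrupling n roughly doubles the mean edge length: ~ sqrt(n) growth.
r = mean_edge_length(1600) / mean_edge_length(400)
print(r)
```

So total wire length grows like $n\sqrt{n}$, much faster than the logic, which grows like n.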
Graph Construction?
Idea: find a construction that has low wire length and maintains good performance...
Drawback: it is difficult to construct any graph that matches the performance of a random graph.
A Better Solution:
Use an algorithm which generates a graph at random, but with a preference for:
- Short edge length
- Quantities related to code performance
Conventional Graph Wisdom
Short loops give rise to dependent messages (which are assumed to be independent) after a small number of iterations, and should be avoided.
Simulated Annealing!
Simulated annealing approximately minimizes an energy function over a solution space.
It requires a good way to traverse the solution space.
Generating LDPC graphs with Simulated Annealing
Define an energy function with two components:
- Wire length: $E_w = \sum_w |w|$
- Loopiness: $E_l = \sum_l |l|$
The total energy is $E = c_w E_w + c_l E_l$.
Traverse the solution space by picking two edges at random and swapping a pair of their endpoints.
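A minimal sketch of such an annealer (the Manhattan wire-length metric, the linear cooling schedule, and the endpoint-swap move are assumptions, and the loopiness term $E_l$ is omitted for brevity). Exchanging the check-side endpoints of two edges changes the wiring while preserving every node degree:

```python
import math, random

def anneal(edges, var_pos, chk_pos, n_steps=20000, t0=2.0, seed=0):
    """Simulated annealing on a bipartite graph's edge list, minimizing the
    total wire length E_w = sum over edges w of |w|. Each move swaps the
    check-side endpoints of two random edges (degree-preserving)."""
    rng = random.Random(seed)
    edges = [list(e) for e in edges]

    def wire(e):   # Manhattan length of one edge
        (x1, y1), (x2, y2) = var_pos[e[0]], chk_pos[e[1]]
        return abs(x1 - x2) + abs(y1 - y2)

    for step in range(n_steps):
        t = t0 * (1.0 - step / n_steps) + 1e-9   # linear cooling schedule
        i, j = rng.randrange(len(edges)), rng.randrange(len(edges))
        before = wire(edges[i]) + wire(edges[j])
        edges[i][1], edges[j][1] = edges[j][1], edges[i][1]   # propose swap
        after = wire(edges[i]) + wire(edges[j])
        # Metropolis rule: accept improvements, occasionally accept worse moves.
        if after > before and rng.random() >= math.exp((before - after) / t):
            edges[i][1], edges[j][1] = edges[j][1], edges[i][1]  # undo
    return edges

# Ten variables and ten checks on two rows, initially wired "crosswise".
var_pos = {i: (i, 0.0) for i in range(10)}
chk_pos = {i: (i, 1.0) for i in range(10)}
edges = anneal([(i, (i + 5) % 10) for i in range(10)], var_pos, chk_pos)
```

Adding the loopiness penalty is a matter of extending the accepted energy difference with $c_l \Delta E_l$; the move set stays the same.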
Results of Simulated Annealing
The graph on the right has nearly identical performance to the one shown previously.

[Figure: a graph generated by simulated annealing]
Logic Complexity
Complexity of Classical Algorithm
The original algorithm defines messages in terms of arithmetic operations over real numbers:
$$m^{(i)}_{v,c} = \lambda_{x_v,y_v} \prod_{c'|v,\ c'\neq c} B\!\left(m^{(i-1)}_{c',v}\right)$$
However, this implies floating-point addition, multiplication, and even division!
A modified Algorithm
We define a modified algorithm in which every message is replaced by its logarithm from the original scheme.
The channel message λ is similarly replaced by its logarithm, $\lambda' = \log \lambda$.
$$m'^{(i)}_{v,c} = \log m^{(i)}_{v,c} = \log(\lambda_{x_v,y_v}) + \sum_{c'|v,\ c'\neq c} \log\!\left(B\!\left(\exp\!\left(m'^{(i-1)}_{c',v}\right)\right)\right) = \lambda'_{x_v,y_v} + \sum_{c'|v,\ c'\neq c} \phi\!\left(m'^{(i-1)}_{c',v}\right)$$
where $\phi(x) = \log(B(\exp(x)))$.
Quantization
We have replaced a product by a sum, but now we have a transcendental function φ.
However, if we quantize the messages, we can pre-compute φ for all values!
$$m''^{(i)}_{v,c} = \lambda'_{x_v,y_v} + \sum_{c'|v,\ c'\neq c} \phi\!\left(m''^{(i-1)}_{c',v}\right)$$
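The precomputation can be sketched as a lookup table. The 4-bit width and the quantizer step size below are illustrative choices, not the talk's; note that $\phi(x) = \log(B(\exp(x)))$ is real-valued only for x < 0, so a full decoder would track message signs separately and apply φ to the (negative) magnitudes:

```python
import math

def phi(x):
    """phi(x) = log(B(exp(x))) with B(t) = (1-t)/(1+t); real-valued for x < 0."""
    return math.log((1.0 - math.exp(x)) / (1.0 + math.exp(x)))

q = 4                                   # bits per message (illustrative)
step = 0.25                             # quantizer step size (illustrative)
levels = [-(i + 1) * step for i in range(2 ** q)]  # the 16 representable inputs
table = {x: phi(x) for x in levels}     # computed once; stored in ROM on-chip

# At decode time, every phi evaluation becomes a constant-time table lookup:
assert table[-1.0] == phi(-1.0)
```

With q-bit messages the table has only $2^q$ entries, so the transcendental function costs nothing per iteration.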
Quantized MP Performance
The figure on the following page shows the bit error rate for a regular (3,6) code of length n = 10,000, using between 2 and 4 bits of quantization.
(Some error floors are predicted by density evolution; some are not.)
Conclusion
- There is a tradeoff between logic complexity and performance.
- Nearly optimal performance (+0.1 dB ≈ 1.023× power) is achievable with 4-bit messages.
- More work is needed to avoid error floors due to quantization.