Shuffled Belief Propagation Decoding



  1. Shuffled Belief Propagation Decoding Juntan Zhang and Marc Fossorier Department of Electrical Engineering University of Hawaii at Manoa Honolulu, HI 96816

  2. Outline
  - Review of LDPC Codes
  - Standard Belief Propagation Algorithm
  - Shuffled Belief Propagation Algorithm
  - Optimality and Convergence
  - Parallel Shuffled Belief Propagation
  - A Small Example
  - Simulation Results
  - Conclusion

  3. Low-Density Parity Check (LDPC) Codes
  - First proposed by R. G. Gallager in the 1960s, and resurrected recently [Gallager-IRE62, MacKay-IT99].
  - Can achieve near-Shannon-limit performance with belief propagation (BP), also known as the sum-product algorithm [Richardson-Urbanke-IT01].
  - Advantages over turbo codes: better distance properties; parallel decoding structure suitable for high-speed decoders.
  - Disadvantages: encoding complexity is high; decoder complexity is high for a fully parallel implementation.

  4. Representations of LDPC Codes
  An LDPC code can be represented either by its sparse parity check matrix H, or by a bipartite (Tanner) graph in which check node m is connected to bit (variable) node n whenever H_{mn} = 1.
  [Figure: a bipartite graph with check nodes on one side and N bit (variable) nodes on the other, shown next to the corresponding sparse parity check matrix H.]

  5. Regular and Irregular LDPC Codes
  - An LDPC code is regular if H has constant row weight and constant column weight, or equivalently, if the check nodes have constant degree d_c and the variable nodes have constant degree d_v.
  - An LDPC code is irregular if the row and column weights are not constant. Irregular LDPC codes are defined by degree distributions.
  - Long irregular LDPC codes have better performance than regular LDPC codes, and can beat turbo codes [Richardson-Urbanke-IT01].

  6. Geometric LDPC Codes
  - Originally studied for majority-logic decoding decades ago, and constructed from finite geometries (Euclidean and projective geometries) [Weldon-Bell66, Rudolph-IT67].
  - The BP algorithm can be applied to the decoding of this family of codes [Lucas-Fossorier-Kou-Lin-COM00, Kou-Lin-Fossorier-IT01].
  - Encoding can be easily implemented with shift registers since they are cyclic codes.
  - They have very good minimum distance properties.
  - Decoding complexity is high.

  7. An example: the (7, 3) DSC code

          [ 1 0 1 1 0 0 0 ]
          [ 0 1 0 1 1 0 0 ]
          [ 0 0 1 0 1 1 0 ]
      H = [ 0 0 0 1 0 1 1 ]
          [ 1 0 0 0 1 0 1 ]
          [ 1 1 0 0 0 1 0 ]
          [ 0 1 1 0 0 0 1 ]

  - The parity check matrix is square: equal numbers of bit nodes and check nodes.
  - H is not full rank.
  - The node degrees are larger.
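The two structural claims on this slide (square H, not full rank) can be checked directly. Below is a minimal sketch: each row of H is a cyclic shift of the first row, and a small GF(2) Gaussian-elimination helper (`gf2_rank`, my own illustrative code, not from the slides) confirms the rank deficiency that leaves K = 3 information bits.

```python
import numpy as np

# Parity check matrix of the (7, 3) DSC code: every row is a
# cyclic shift of the first row [1 0 1 1 0 0 0].
first_row = np.array([1, 0, 1, 1, 0, 0, 0], dtype=int)
H = np.array([np.roll(first_row, k) for k in range(7)])

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    rows, cols = M.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]       # move pivot row up
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]                   # eliminate column c
        rank += 1
    return rank

print(H.shape)      # square: (7, 7)
print(gf2_rank(H))  # rank 4 < 7, so K = 7 - 4 = 3 information bits
```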

  8. Some one-step majority-logic decodable codes:

      PG-LDPC (DSC) codes            EG-LDPC codes
      (N, K)        rate    d_min    (N, K)        rate    d_min
      (7, 3)        0.429   4        (15, 7)       0.467   5
      (21, 11)      0.524   6        (63, 37)      0.587   9
      (73, 45)      0.616   10       (255, 175)    0.686   17
      (273, 191)    0.700   18       (1023, 781)   0.763   33
      (1057, 813)   0.769   34       (4095, 3367)  0.822   65
      (4161, 3431)  0.825   66

  9. Processing in check nodes
  Principle: incoming messages + parity constraint ⇒ outgoing messages.
  A check node m (of degree 4 in the figure) receives the bit-to-check messages z_{mn_1}, ..., z_{mn_4} from the bit nodes in N(m), and returns to bit node n the check-to-bit message

      L_{mn} = 2 \tanh^{-1} \Big( \prod_{n' \in N(m) \setminus n} \tanh( z_{mn'} / 2 ) \Big)
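The check-node rule can be sketched in a few lines. This is an illustration under my own naming (`check_to_bit`) and with a clipping guard I added so atanh stays finite when messages saturate:

```python
import math

def check_to_bit(z_in):
    """Check-to-bit message L_mn = 2 * atanh( prod tanh(z_mn'/2) ),
    where z_in holds the incoming LLRs z_mn' for n' in N(m)\\{n}."""
    prod = 1.0
    for z in z_in:
        prod *= math.tanh(z / 2.0)
    # clip so atanh stays finite for saturated messages
    prod = max(min(prod, 1.0 - 1e-12), -1.0 + 1e-12)
    return 2.0 * math.atanh(prod)
```

The rule behaves as a soft XOR: the output sign is the product of the input signs, and the output magnitude is bounded by the least reliable input.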

  10. Processing in bit nodes
  A bit node n (of degree 4 in the figure) combines its channel value F_n with the check-to-bit messages L_{mn} received from the check nodes in M(n):

      z_{mn} = F_n + \sum_{m' \in M(n) \setminus m} L_{m'n}

      z_n = F_n + \sum_{m \in M(n)} L_{mn}      (z_n is used for the hard decision)
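The bit-node side is just sums of LLRs; a minimal sketch (function names are my own):

```python
def bit_to_check(F_n, eps_others):
    """z_mn = F_n + sum of check-to-bit messages L_m'n over m' in M(n)\\{m}."""
    return F_n + sum(eps_others)

def posterior_llr(F_n, eps_all):
    """z_n = F_n + sum over all m in M(n); hard decision: bit = 1 iff z_n < 0."""
    return F_n + sum(eps_all)
```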

  11. Standard IDBP (iterative decoding based on BP)
  - Initialization
  - Step 1: Update the belief matrix
    - Horizontal step: update the whole check-to-bit matrix
    - Vertical step: update the whole bit-to-check matrix
  - Step 2: Hard decision and stopping test
  - Step 3: Output the decoded codeword

  12. Standard BP Algorithm; Step 1:
  (i) Horizontal step:

      \varepsilon_{mn}^{(i)} = \log \frac{ 1 + \prod_{n' \in N(m) \setminus n} \tanh( z_{mn'}^{(i-1)} / 2 ) }{ 1 - \prod_{n' \in N(m) \setminus n} \tanh( z_{mn'}^{(i-1)} / 2 ) }

  (ii) Vertical step:

      z_{mn}^{(i)} = F_n + \sum_{m' \in M(n) \setminus m} \varepsilon_{m'n}^{(i)}

      z_n^{(i)} = F_n + \sum_{m \in M(n)} \varepsilon_{mn}^{(i)}
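The horizontal and vertical steps can be sketched end to end as a toy flooding decoder. This is a sketch under my own assumptions: the name `standard_bp`, the clipping constant, and the syndrome-based stopping test (matching "hard decision and stopping test" on slide 11) are illustrative, not taken verbatim from the slides.

```python
import numpy as np

def standard_bp(H, F, max_iter=50):
    """Standard (flooding) BP: in iteration i, every eps_mn is computed
    from the previous iteration's z messages, then every z_mn is refreshed."""
    M, N = H.shape
    z = H * F                          # z_mn^(0) = F_n on each edge of the graph
    eps = np.zeros((M, N))
    x_hat = (F < 0).astype(int)
    for _ in range(max_iter):
        # horizontal step: eps_mn^(i) from z_mn'^(i-1), n' in N(m)\{n}
        for m in range(M):
            nbrs = np.nonzero(H[m])[0]
            t = np.tanh(z[m, nbrs] / 2.0)
            for j, n in enumerate(nbrs):
                prod = np.prod(np.delete(t, j))        # leave-one-out product
                prod = np.clip(prod, -0.999999999, 0.999999999)
                eps[m, n] = np.log((1.0 + prod) / (1.0 - prod))
        # vertical step: z_mn^(i) = F_n + sum_{m' in M(n)\{m}} eps_m'n^(i)
        col = eps.sum(axis=0)
        z = H * (F + col - eps)
        # hard decision on z_n^(i) = F_n + sum_m eps_mn^(i), then syndrome test
        x_hat = ((F + col) < 0).astype(int)
        if not np.any((H @ x_hat) % 2):
            break
    return x_hat
```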

  13. Update of the belief matrix in the i-th iteration with standard belief propagation decoding
  [Figure: a 7x7 belief matrix example. In standard BP, all check-to-bit messages \varepsilon_{mn}^{(i)} are computed from the previous iteration's bit-to-check messages z_{mn'}^{(i-1)}, for all rows at once.]

  14. Shuffled Belief Propagation
  - Initialization
  - Step 1: Update the belief matrix, for n = 0, ..., N-1:
    - Horizontal step: update the n-th column of the check-to-bit matrix
    - Vertical step: update the n-th column of the bit-to-check matrix
  - Step 2: Hard decision and stopping test
  - Step 3: Output the decoded codeword

  15. Shuffled Belief Propagation; Step 1:
  (i) Horizontal step:

      \varepsilon_{mn}^{(i)} = \log \frac{ 1 + P_{mn}^{(i)} }{ 1 - P_{mn}^{(i)} },
      where  P_{mn}^{(i)} = \prod_{n' \in N(m) \setminus n, \; n' < n} \tanh( z_{mn'}^{(i)} / 2 ) \cdot \prod_{n' \in N(m) \setminus n, \; n' > n} \tanh( z_{mn'}^{(i-1)} / 2 )

  (ii) Vertical step:

      z_{mn}^{(i)} = F_n + \sum_{m' \in M(n) \setminus m} \varepsilon_{m'n}^{(i)}

      z_n^{(i)} = F_n + \sum_{m \in M(n)} \varepsilon_{mn}^{(i)}
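A toy version of the shuffled schedule follows (a sketch with my own naming and stopping test, mirroring the flooding sketch). Because z is refreshed column by column, the horizontal step for column n automatically sees iteration-i values for bits n' < n and iteration-(i-1) values for bits n' > n, which is exactly the mixed product in the horizontal step of this slide.

```python
import numpy as np

def shuffled_bp(H, F, max_iter=50):
    """Shuffled (bit-serial) BP: columns n = 0..N-1 are processed in order;
    each column's check-to-bit messages are recomputed, then its
    bit-to-check messages, before moving to the next column."""
    M, N = H.shape
    z = H * F                              # z_mn^(0) = F_n on each edge
    eps = np.zeros((M, N))
    x_hat = (F < 0).astype(int)
    for _ in range(N * 0 + max_iter):
        for n in range(N):
            checks = np.nonzero(H[:, n])[0]
            # horizontal step for column n: z[m, k] is already the
            # iteration-i value for k < n, and iteration-(i-1) for k > n
            for m in checks:
                others = [k for k in np.nonzero(H[m])[0] if k != n]
                prod = np.prod([np.tanh(z[m, k] / 2.0) for k in others])
                prod = np.clip(prod, -0.999999999, 0.999999999)
                eps[m, n] = np.log((1.0 + prod) / (1.0 - prod))
            # vertical step for column n, using the freshly updated eps
            s = eps[checks, n].sum()
            for m in checks:
                z[m, n] = F[n] + s - eps[m, n]
        # hard decision on z_n^(i) and syndrome-based stopping test
        x_hat = ((F + (eps * H).sum(axis=0)) < 0).astype(int)
        if not np.any((H @ x_hat) % 2):
            break
    return x_hat
```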

  16. Update of the belief matrix in the i-th iteration with shuffled belief propagation decoding
  [Figure: the same 7x7 belief matrix example. In shuffled BP, when column n is processed, the messages from bits n' < n already carry current-iteration values z_{mn'}^{(i)}, while bits n' > n still contribute z_{mn'}^{(i-1)}.]

  17. Implementation of shuffled BP
  - Backward-forward implementation
  - Computational complexity

  18. Optimality and Convergence Property of Shuffled BP
  Assume the Tanner graph of the code is connected and cycle-free. Then:
  - Shuffled BP is optimal in the MAP sense.
  - Shuffled BP converges faster than (or at least as fast as) standard BP.

  19. Parallel Shuffled BP
  - Divide the N bits into G groups, each containing N/G bits (regular partition).
  - Within each group, the updates are processed in parallel; the groups themselves are processed sequentially.
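The regular partition above is simple to write down; a minimal sketch (the helper name is my own, and it assumes G divides N):

```python
def regular_partition(N, G):
    """Regular partition: split bit indices 0..N-1 into G consecutive
    groups of N/G bits each (assumes G divides N)."""
    size = N // G
    return [list(range(g * size, (g + 1) * size)) for g in range(G)]
```

The two extremes of this schedule are worth noting: G = 1 puts all bits in one group updated in parallel (the flooding schedule of standard BP), while G = N recovers the fully serial, bit-by-bit shuffled BP; intermediate G trades convergence speed for decoder parallelism.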
