

  1. Progressive Algebraic Soft-Decision Decoding of Reed-Solomon Codes  Li Chen (陈立), PhD, MIEEE  Associate Professor, School of Information Science and Technology  Sun Yat-sen University, China  Joint work with Prof. Xiao Ma and Mrs. Siyun Tang  Hangzhou, China, 5th November 2011

  2. Outline
  - Introduction
  - Progressive ASD (PASD) algorithm
  - Validity analysis
  - Complexity analysis
  - Error-correction performance
  - Conclusions

  3. I. Introduction of list decoding
  Decoding philosophy evolution of an (n, k) RS code: from unique decoding to list decoding.
  - Unique decoding radius: τ_unique = ⌊(n − k)/2⌋
  - List decoding (GS) radius: τ_list = n − 1 − ⌊√(n(k − 1))⌋
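The two radii above can be compared numerically. This is an illustrative sketch, not from the slides; `unique_radius` and `gs_radius` are hypothetical helper names, and the GS formula is the standard one from the literature:

```python
from math import isqrt

def unique_radius(n, k):
    # Unique (bounded-distance) decoding radius of an (n, k) RS code
    return (n - k) // 2

def gs_radius(n, k):
    # Guruswami-Sudan list-decoding radius:
    # the largest integer t with t < n - sqrt(n(k - 1))
    return n - 1 - isqrt(n * (k - 1))

# For a low-rate (15, 5) RS code, list decoding gains two extra errors;
# for the high-rate (255, 239) code, the two radii coincide.
print(unique_radius(15, 5), gs_radius(15, 5))    # 5 7
print(unique_radius(255, 239), gs_radius(255, 239))  # 8 8
```

The gap between the two radii is what motivates list decoding, and it is largest for low-rate codes.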

  4. I. Overview of Enc. & Decd.
  Encoding
  - Given a message polynomial u(x) = u_0 + u_1 x + ··· + u_{k-1} x^{k-1} (u_i ∈ GF(q))
  - Generate the codeword of an (n, k) RS code as c = (c_0, c_1, …, c_{n-1}) = (u(α_0), u(α_1), …, u(α_{n-1})) (α_i ∈ GF(q)\{0})
  Decoding
  - Reliability transform: Π → M;  Interpolation: M → Q(x, y);  Factorization: Q(x, p(x)) → p(x)
  - Interpolation is the computationally expensive step!
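The evaluation-map encoding above can be sketched in a few lines. This is a minimal illustration over a prime field GF(p) (the slides work over GF(q) with q = 2^m; a prime modulus keeps the sketch short), and `rs_encode` is an illustrative name:

```python
P = 929  # toy prime modulus standing in for the field size q

def rs_encode(u, alphas):
    # c_i = u(alpha_i): evaluate the message polynomial at each code locator
    def u_at(x):
        acc = 0
        for coeff in reversed(u):  # Horner's rule
            acc = (acc * x + coeff) % P
        return acc
    return [u_at(a) for a in alphas]

u = [3, 0, 1]                    # u(x) = 3 + x^2, so k = 3
alphas = [1, 2, 3, 4, 5, 6, 7]   # n = 7 distinct nonzero locators
print(rs_encode(u, alphas))      # [4, 7, 12, 19, 28, 39, 52]
```

Any k of the n evaluations determine u(x), which is why the code can tolerate erasures and errors in the first place.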

  5. I. Perf. of AHD and ASD
  - Algebraic hard-decision decoding (AHD) [Guruswami99] (the GS algorithm)
  - Algebraic soft-decision decoding (ASD) [Koetter03] (the KV algorithm)
  - ASD has an advantage in error-correction performance

  6. I. Introduction  Price to pay: decoding complexity

  7. I. Inspirations
  - Algebraic soft decoding is of high complexity, mainly due to the iterative interpolation process.
  - A modernized thinking: the decoding should be flexible (i.e., channel dependent), as in the belief propagation algorithm (initialisation, horizontal step, vertical step, hard decision ĉ, validation ĉH^T = 0?).
  - The current AHD/ASD system pays the same high complexity whether the quality of the received word is good or bad; an iterative process with incremental computation and continuous validation can keep the complexity low when the received word is good.

  8. II. A Graphical Introduction
  PASD enlarges the OLS progressively (l = 1, 2, 3, 4, 5), while ASD decodes once with l = 5. The factorization output lists grow as l grows (|L| is the factorization output list size):
  - l = 1: L = {c_6}
  - l = 2: L = {c_6, c_9}
  - l = 3: L = {c_6, c_9, c_10}
  - l = 4: L = {c_5, c_6, c_9, c_10}
  - l = 5: L = {c_5, c_6, c_7, c_9, c_10}
  Enlarging the decoding radius progressively ~ enlarging the factorization OLS progressively.

  9. II. Decoding architecture
  Progressive reliability transform + progressive interpolation, validated by a CRC code!
  - v – iteration index
  - l_v – designed OLS at each iteration
  - l_T – designed maximal OLS (~ the maximal complexity that the system can tolerate)
  - l′ – step size for updating the OLS: l_{v+1} = l_v + l′
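The OLS update rule l_{v+1} = l_v + l′ yields a simple schedule of decoding attempts. A minimal sketch, with `ols_schedule` as an illustrative name:

```python
def ols_schedule(l1, l_T, step):
    # Enumerate the designed OLS values l_1, l_2, ..., up to the maximal l_T,
    # following l_{v+1} = l_v + l'
    ls = []
    l = l1
    while l <= l_T:
        ls.append(l)
        l += step
    return ls

print(ols_schedule(1, 5, 1))  # [1, 2, 3, 4, 5]
print(ols_schedule(1, 5, 2))  # [1, 3, 5]
```

A larger step size l′ reaches the maximal OLS in fewer iterations but wastes more incremental work when the decoding would already have succeeded at an intermediate OLS.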

  10. II. Progressive approach
  Enlarge the decoding radius → enlarge the OLS → progressive decoding.
  - Series of OLS values: l_1, l_2, …, l_{v-1}, l_v, …, l_T
  - Series of multiplicity matrices: M_1, M_2, …, M_{v-1}, M_v, …, M_T
  - Series of interpolation polynomials: Q^(1), Q^(2), …, Q^(v-1), Q^(v), …, Q^(T)
  Question: can the solution Q^(v) be found based on the knowledge of Q^(v-1)?

  11. II. Incremental interpolation constraints
  Multiplicity m_ij ~ interpolated point (x_j, α_i).
  - Given a polynomial Q(x, y), m_ij implies m_ij(m_ij + 1)/2 constraints (r + s < m_ij).
  Definition 1: Let Λ(m_ij) denote the set of interpolation constraints (r, s)_ij indicated by m_ij; then Λ(M) denotes the collection of all the sets Λ(m_ij) defined by the entries m_ij of M.
  Example:
  M = [2 0 0; 0 1 1; 1 2 0; 0 0 1]
  Λ(M) = {(0,0)_00, (1,0)_00, (0,1)_00, (0,0)_11, (0,0)_12, (0,0)_20, (0,0)_21, (1,0)_21, (0,1)_21, (0,0)_32}
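Definition 1 is mechanical enough to check in code. A minimal sketch (illustrative name `constraint_set`; constraints are encoded as tuples (r, s, i, j)) reproduces the example's ten constraints:

```python
def constraint_set(M):
    # Λ(M): for each entry m_ij > 0, every pair (r, s) with r + s < m_ij,
    # tagged with the entry's position, giving tuples (r, s, i, j)
    return {(r, s, i, j)
            for i, row in enumerate(M)
            for j, m in enumerate(row)
            for r in range(m)
            for s in range(m - r)}

M = [[2, 0, 0],
     [0, 1, 1],
     [1, 2, 0],
     [0, 0, 1]]
print(len(constraint_set(M)))  # 10, matching the slide's example
```

Each entry of value m contributes m(m + 1)/2 constraints, so the total cost of interpolation is driven by the squared multiplicities.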

  12. II. Incremental Interp. Constr.
  Definition 2: Let m_ij^{v-1} and m_ij^v denote the entries of matrices M_{v-1} and M_v. The incremental interpolation constraints introduced between the matrices are defined as the collection of all the residual sets between Λ(m_ij^v) and Λ(m_ij^{v-1}):
  Λ(ΔM_v) = Λ(M_v) \ Λ(M_{v-1})
  Example:
  M_1 = [1 0 0; 0 0 1; 1 1 0; 0 0 1],  M_2 = [2 0 0; 0 1 1; 1 2 0; 0 0 1]
  Λ(M_1) = {(0,0)_00, (0,0)_12, (0,0)_20, (0,0)_21, (0,0)_32} — 5 constraints
  Λ(M_2) = {(0,0)_00, (1,0)_00, (0,1)_00, (0,0)_11, (0,0)_12, (0,0)_20, (0,0)_21, (1,0)_21, (0,1)_21, (0,0)_32} — 10 constraints
  Λ(ΔM_2) = {(1,0)_00, (0,1)_00, (0,0)_11, (1,0)_21, (0,1)_21} — 5 constraints
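Because the constraint sets are plain sets, the residual Λ(ΔM_2) of Definition 2 is just a set difference. A self-contained sketch (illustrative names; constraints encoded as tuples (r, s, i, j)):

```python
def constraint_set(M):
    # Λ(M) as tuples (r, s, i, j) with r + s < m_ij
    return {(r, s, i, j)
            for i, row in enumerate(M)
            for j, m in enumerate(row)
            for r in range(m)
            for s in range(m - r)}

M1 = [[1, 0, 0], [0, 0, 1], [1, 1, 0], [0, 0, 1]]
M2 = [[2, 0, 0], [0, 1, 1], [1, 2, 0], [0, 0, 1]]

# Λ(ΔM_2) = Λ(M_2) \ Λ(M_1): the residual constraints of iteration 2
delta = constraint_set(M2) - constraint_set(M1)
print(len(constraint_set(M1)), len(constraint_set(M2)), len(delta))  # 5 10 5
```

This reproduces the slide's example: five constraints at iteration 1, five incremental ones at iteration 2, ten in total.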

  13. II. Progressive Interpolation
  A big chunk of interpolation task Λ(M_T) can be sliced into smaller pieces:
  Λ(M_T) = Λ(ΔM_1) ∪ Λ(ΔM_2) ∪ Λ(ΔM_3) ∪ … ∪ Λ(ΔM_v) ∪ … ∪ Λ(ΔM_T)
  Λ(ΔM_v) defines the interpolation task of iteration v.

  14. II. Incremental Computations
  Review of the interpolation process – iterative polynomial construction:
  - Given M_v, the interpolation constraints are Λ(M_v)
  - The polynomial group is G_v = {g_0, g_1, …, g_{l_v}} (deg_y g_{l_v} = l_v)
  - The iterative process defines the outcome of the updated group:
    for each (r, s)_ij ∈ Λ(M_v), update each g_t ∈ G_v
  - Finally, Q^(v) = min{g_t | g_t ∈ G_v}
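One round of the group update above can be sketched concretely. This is a minimal, illustrative implementation of Koetter-style interpolation over a toy prime field GF(7) (the slides use GF(q)); polynomials are dicts mapping monomials (a, b) to coefficients, and all function names are assumptions of this sketch:

```python
from math import comb

P = 7  # toy prime field modulus

def hasse_eval(g, r, s, x0, y0):
    # (r, s)-th Hasse derivative of the bivariate polynomial g
    # evaluated at (x0, y0), modulo P
    total = 0
    for (a, b), c in g.items():
        if a >= r and b >= s:
            total += c * comb(a, r) * comb(b, s) * pow(x0, a - r, P) * pow(y0, b - s, P)
    return total % P

def wdeg(g, k):
    # (1, k-1)-weighted degree, used to pick the minimal polynomial
    return max(a + (k - 1) * b for (a, b) in g)

def koetter_update(G, r, s, x0, y0, k):
    # One round of group update w.r.t. constraint (r, s) at point (x0, y0):
    # the minimal non-vanishing polynomial g* corrects all the others,
    # then g* itself is multiplied by (x - x0).
    deltas = {t: hasse_eval(g, r, s, x0, y0) for t, g in G.items()}
    active = [t for t, d in deltas.items() if d]
    if not active:
        return G
    star = min(active, key=lambda t: wdeg(G[t], k))
    g_star, d_star = G[star], deltas[star]
    for t in active:
        if t != star:
            merged = {m: (d_star * G[t].get(m, 0) - deltas[t] * g_star.get(m, 0)) % P
                      for m in set(G[t]) | set(g_star)}
            G[t] = {m: c for m, c in merged.items() if c}
    shifted = {}
    for (a, b), c in g_star.items():
        shifted[(a + 1, b)] = (shifted.get((a + 1, b), 0) + c) % P
        shifted[(a, b)] = (shifted.get((a, b), 0) - x0 * c) % P
    G[star] = {m: c for m, c in shifted.items() if c}
    return G

# After the update, every polynomial in the group satisfies the constraint
G = {0: {(0, 0): 1}, 1: {(0, 1): 1}}      # g_0 = 1, g_1 = y
G = koetter_update(G, 0, 0, 2, 3, k=2)    # enforce Q(2, 3) = 0
print([hasse_eval(g, 0, 0, 2, 3) for g in G.values()])  # [0, 0]
```

Running one constraint through the update turns g_0 into x + 5 and g_1 into y + 4, both of which vanish at (2, 3); iterating over all of Λ(M_v) in this way yields the group G_v from which Q^(v) is chosen.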

  15. II. Incremental Computations
  From iteration v−1 to v: the progressive interpolation can be seen as a progressive polynomial group expansion which consists of two successive stages.
  - Let G_{v-1} be the outcome of iteration v−1. During its generation, the series of minimal polynomials g* chosen at each update w.r.t. (r, s)_ij ∈ Λ(M_{v-1}) are identified and stored.
  Expansion I: expand the number of polynomials of the group.
  - The new polynomials of ΔG_v perform interpolation w.r.t. the constraints of Λ(M_{v-1});
  - The stored minimal polynomials g* are re-used for the update of ΔG_v.
  - The updated outcome of ΔG_v, joined with G_{v-1}, forms the expanded group.

  16. II. Incremental Computations
  Expansion II: expand the size of the polynomials of the group.
  - The polynomials of the expanded group now perform interpolation w.r.t. the incremental constraints Λ(ΔM_v), yielding G_v.
  - Finally, Q^(v) = min{g_t | g_t ∈ G_v}.
  (Figure: polynomial group expansion for l_v = 5 with polynomials g_1, …, g_6 — Expansion I grows the number of polynomials, Expansion II grows their size; PASD reaches the same group that ASD builds directly.)

  17. II. Progressive Interpolation
  The process of progressive interpolation:
  - Matrices: M_1, M_2, M_3, …, M_{v-1}, M_v, …, M_{T-1}, M_T
  - Constraints: Λ(M_1), Λ(M_2), Λ(M_3), …, Λ(M_{v-1}), Λ(M_v), …, Λ(M_{T-1}), Λ(M_T), with increments Λ(ΔM_2), Λ(ΔM_3), …, Λ(ΔM_v), …, Λ(ΔM_T)
  - Groups: G_1 → G_2 → G_3 → … → G_{v-1} → G_v → … → G_{T-1} → G_T, expanded via ΔG_2, ΔG_3, …, ΔG_v, …, ΔG_T
  - Outputs: Q^(1), Q^(2), Q^(3), …, Q^(v-1), Q^(v), …, Q^(T-1), Q^(T)
  If Q^(v)(x, u(x)) = 0, the decoding will be terminated. Multiple factorizations are carried out in order to determine whether u(x) has been found!

  18. III. Validity Analysis
  For any two constraints (r_1, s_1)_{i_1 j_1} and (r_2, s_2)_{i_2 j_2} of Λ(M_T), the order in which they are satisfied can be exchanged: satisfying (r_1, s_1)_{i_1 j_1} and then (r_2, s_2)_{i_2 j_2} yields the same outcome as the reverse order.
  The algorithm imposes a progressive interpolation order on Λ(M_T) = Λ(ΔM_1) ∪ Λ(ΔM_2) ∪ Λ(ΔM_3) ∪ … ∪ Λ(ΔM_v) ∪ … ∪ Λ(ΔM_T): at any point, some constraints (r, s)_ij are already satisfied and the remainder are yet to be satisfied.

  19. III. Validity Analysis
  Decoding with an OLS of l_v, the solution Q(x, y) is seen as the minimal candidate chosen from the cumulative kernel.
  For both of the algorithms (ASD and PASD):
  - the same set of constraints is defined for the cumulative kernel;
  - consequently, they will offer the same solution Q(x, y).

  20. III. Validity Analysis
  At the end of Expansion I: can the updated outcomes of G_{v-1} and ΔG_v be found separately?
  Recall the polynomial updating rules:
  - The minimal polynomial g* defines the solution of one round of polynomial update w.r.t. (r, s)_ij.
  - If the group expansion procedure does not change the identity of g*, then the two outcomes can indeed be found separately.

  21. III. Validity Analysis
  Expansion I: update ΔG_v w.r.t. each constraint (r, s)_ij of Λ(M_{v-1}).
  - If an update was required at iteration v−1, the minimal polynomial g* is in memory and will be re-used, since its weighted degree is smaller than that of the polynomials of ΔG_v.
  - If no update was required, g* is not in memory, and the minimal polynomial will be picked up from ΔG_v.
  In both cases, the identity of the existing polynomials of G_{v-1} is left unchanged. Consequently, the solution Q^(v-1) remains intact.

  22. IV. Complexity Analysis
  Average decoding complexity – the average number of finite field arithmetic operations for decoding one codeword frame.
  - P(l_v) – the probability that the decoder performs a successful decoding with an OLS of l_v;
  - C(l_v) – the decoding complexity with an OLS of l_v.
  The average decoding complexity weights each successful stopping point by its probability, with failed decodings running through to the maximal OLS l_T:
  C_avg = Σ_v P(l_v) C(l_v) + (1 − Σ_v P(l_v)) C(l_T)
  The average complexity is now channel dependent!
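The expectation above is easy to evaluate once P(l_v) and C(l_v) are known. A minimal sketch with illustrative names and made-up numbers (the real probabilities are channel dependent and the real costs come from the interpolation analysis):

```python
def average_complexity(p_succ, cost):
    # p_succ[v]: probability that decoding first succeeds with OLS l_v
    # cost[v]:   complexity of decoding through iteration v
    # With probability 1 - sum(p_succ) every iteration fails and the
    # full cost of the maximal OLS l_T is paid.
    p_fail = 1 - sum(p_succ)
    return sum(p * c for p, c in zip(p_succ, cost)) + p_fail * cost[-1]

# A good channel resolves most frames at a small OLS, so the average
# cost sits far below the worst case (illustrative numbers only):
print(round(average_complexity([0.90, 0.05], [10, 100]), 6))  # 19.0
```

With 90% of frames decoded at the cheap first iteration, the average cost (19) is a fraction of the worst-case cost (100), which is the whole point of making the decoder channel dependent.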

  23. IV. Complexity Analysis
  (Decomposition of C(l_v); the detailed expressions are given in the talk.)

  24. IV. Complexity Analysis
  - The decoding complexity grows rapidly with the OLS, with a dominant exponent of 5 (i.e., roughly as the fifth power of l_v);
  - The decoding complexity is quadratic in the dimension k of the code, so the complexity will be smaller for a low-rate code.
