


Information Theory Lecture 8

• BCH codes
  • BCH codes: R8.4–5 (R5.6)
  • Decoding BCH (and RS) codes: R6
• Reed–Solomon codes
  • RS codes: R5.1–3

Mikael Skoglund, Information Theory

The BCH Bound

• Theorem: Let C be cyclic of length n with generator polynomial g(x) over GF(q). Let m be the smallest integer such that n | q^m − 1 and let α ∈ GF(q^m) be a primitive nth root of unity. Then, if for some integers b ≥ 0 and δ ≥ 2 all the elements α^b, α^(b+1), ..., α^(b+δ−2) in GF(q^m) are zeros of the code, it holds that d_min ≥ δ.

      δ − 1 consecutive zeros ⇒ d_min ≥ δ
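As a concrete sanity check of the bound, the following sketch (my own construction; all names are illustrative) takes the [15, 11] binary Hamming code generated by the primitive polynomial g(x) = 1 + x + x^4, verifies in GF(2^4) that the consecutive powers α and α^2 are zeros of the code, and finds the true minimum distance by exhaustive search — the bound d_min ≥ 3 holds with equality here:

```python
# Hedged sketch: the BCH bound on the [15,11] binary Hamming code with
# g(x) = 1 + x + x^4, a primitive polynomial for GF(2^4).
from itertools import product

g = [1, 1, 0, 0, 1]   # coefficients of g(x) = 1 + x + x^4, low degree first
n, k = 15, 11

def poly_mul_gf2(a, b):
    """Multiply two polynomials over GF(2), coefficients low degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

# Exhaustive minimum weight over all 2^11 - 1 nonzero messages u(x)g(x)
d_min = min(sum(poly_mul_gf2(list(u), g))
            for u in product([0, 1], repeat=k) if any(u))

# GF(2^4) built from the same primitive polynomial: alpha^4 = alpha + 1.
# Powers of alpha as 4-bit integers (bit i = coefficient of alpha^i).
alpha_pow = [1]
for _ in range(14):
    v = alpha_pow[-1] << 1
    if v & 0b10000:
        v = (v ^ 0b10000) ^ 0b0011    # reduce: alpha^4 = alpha + 1
    alpha_pow.append(v)

def eval_gf2poly(p, e):
    """Evaluate a GF(2) polynomial at alpha^e in GF(2^4)."""
    acc = 0
    for i, c in enumerate(p):
        if c:
            acc ^= alpha_pow[(e * i) % 15]
    return acc

# alpha and alpha^2 are consecutive zeros of the code => d_min >= 3
print(eval_gf2poly(g, 1), eval_gf2poly(g, 2))   # 0 0
print(d_min)                                    # 3: the bound is tight here
```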

BCH Codes

• Definition: Consider a cyclic code C of length n over GF(q), let m be the smallest integer such that n | q^m − 1, and let α ∈ GF(q^m) be a primitive nth root of unity. Then C is a BCH code of designed distance δ if for some b ≥ 0 it has generator polynomial

      g(x) = lcm{ p^(b)(x), p^(b+1)(x), ..., p^(b+δ−2)(x) }

  where p^(i)(x) denotes the minimal polynomial of α^i
• A BCH code is said to be
  • narrow sense if b = 1
  • primitive if n = q^m − 1 (⇒ α primitive in GF(q^m))
• Theorem: A BCH code over GF(q) of length n and designed distance δ has d_min ≥ δ and dimension k ≥ n − m(δ − 1).

• In the special case q = 2, b = 1 and δ = 2τ + 1, it holds that r = n − k ≤ mτ (since the p^(i)(x)'s have degree ≤ m, and p^(2i)(x) = p^(i)(x))
• True minimum distance d_min:
  • For q = 2, b = 1, n = 2^m − 1 and δ = 2τ + 1 the code has d_min = 2τ + 1 if

        Σ_{i=0}^{τ+1} (n choose i) > 2^{mτ}

  • If b = 1 and n = δp for some p, then d_min = δ
  • If b = 1, n = q^m − 1 and δ = q^p − 1 for some p, then d_min = δ
  • If n = q^m − 1, then d_min ≤ qδ − 1
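The lcm construction can be made concrete. A minimal sketch (my own code; GF(2^4) is built from the primitive polynomial x^4 + x + 1): it computes the minimal polynomials p^(1)(x) and p^(3)(x) from cyclotomic cosets and multiplies them into g(x) for the narrow-sense n = 15, δ = 5 code:

```python
# Hedged sketch: g(x) for the narrow-sense primitive binary BCH code with
# n = 15, delta = 5, as the lcm (here: product) of the distinct minimal
# polynomials of alpha^1 and alpha^3 in GF(2^4) = GF(2)[x]/(x^4 + x + 1).

exp = [1]                                # antilog table: exp[i] = alpha^i
for _ in range(14):
    v = exp[-1] << 1
    if v & 16:
        v ^= 0b10011                     # reduce by x^4 + x + 1
    exp.append(v)
log = {v: i for i, v in enumerate(exp)}  # log table

def mul(a, b):
    return 0 if 0 in (a, b) else exp[(log[a] + log[b]) % 15]

def min_poly(i):
    """Minimal polynomial of alpha^i: product of (x - alpha^s) over the
    cyclotomic coset {i * 2^j mod 15} (coefficients end up in GF(2))."""
    coset, s = set(), i % 15
    while s not in coset:
        coset.add(s)
        s = (2 * s) % 15
    p = [1]                              # poly over GF(16), low degree first
    for s in coset:                      # multiply p(x) by (x + alpha^s)
        new = [0] * (len(p) + 1)
        for j, c in enumerate(p):
            new[j + 1] ^= c              # x * p(x)
            new[j] ^= mul(c, exp[s])     # alpha^s * p(x)
        p = new
    return p

def poly_mul(a, b):                      # polynomial product over GF(16)
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= mul(ai, bj)
    return out

p1, p3 = min_poly(1), min_poly(3)
g = poly_mul(p1, p3)
print(p1)   # [1, 1, 0, 0, 1]              -> 1 + x + x^4
print(p3)   # [1, 1, 1, 1, 1]              -> 1 + x + x^2 + x^3 + x^4
print(g)    # [1, 0, 0, 0, 1, 0, 1, 1, 1]  -> degree 8
```

Since deg g = 8, the dimension is k = 15 − 8 = 7, better than the guarantee k ≥ n − m(δ − 1) = 15 − 16.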

Parity Check Matrix

• Assume narrow sense and primitive over GF(2) and δ = 2τ + 1
• Since g(α^i) = 0 for i = 1, ..., δ − 1, a valid parity check matrix is

              [ 1   α         α^2           ···   α^(n−1)             ]
              [ 1   α^3       (α^3)^2       ···   (α^3)^(n−1)         ]
      H_BCH = [ 1   α^5       (α^5)^2       ···   (α^5)^(n−1)         ]
              [ ⋮   ⋮         ⋮             ···   ⋮                   ]
              [ 1   α^(δ−2)   (α^(δ−2))^2   ···   (α^(δ−2))^(n−1)     ]

• That is, the second column consists of the lowest-degree α^i's that correspond to different minimal polynomials
• To get the binary version: replace the α^i's with the column vectors from GF^m(2) that represent the coefficients of the polynomial α^i ∈ GF(2^m)
• This gives mτ binary rows; if mτ > r, reduce to get linearly independent rows

Examples

• Binary Hamming code: narrow sense and primitive binary BCH code with n = 2^m − 1, for some m ≥ 1, and g(x) = a primitive polynomial in GF(2^m). Designed distance δ = 3 = true d_min
• Hamming code over GF(q): a narrow sense and primitive BCH code, with m the smallest integer such that n | q^m − 1, m and q − 1 relatively prime, and g(x) = a primitive polynomial in GF(q^m). Designed distance δ = 3 = true d_min
• Narrow sense and primitive binary BCH code with δ = 5: Let n = 2^m − 1 and α primitive in GF(2^m). With g(x) = p^(1)(x) p^(3)(x) we get δ = 5. E.g., n = 15 ⇒

      g(x) = (1 + x + x^4)(1 + x + x^2 + x^3 + x^4)

  For this code, n = 15 = 3 · 5 ⇒ d_min = δ = 5.
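The binary expansion of H_BCH can be sketched for the δ = 5, n = 15 example (my own construction): two GF(2^4) rows, for α and α^3, each expanded into m = 4 binary rows; since mτ = 8 = r for this code, no row reduction is needed:

```python
# Hedged sketch: the 8 x 15 binary parity-check matrix of the [15,7]
# delta = 5 BCH code, built from GF(2^4) = GF(2)[x]/(x^4 + x + 1).

exp = [1]                               # exp[i] = alpha^i as a 4-bit integer
for _ in range(14):
    v = exp[-1] << 1
    if v & 16:
        v ^= 0b10011                    # reduce by x^4 + x + 1
    exp.append(v)

H = []
for i in (1, 3):                        # exponents with distinct minimal polys
    gf_row = [exp[(i * j) % 15] for j in range(15)]
    for bit in range(4):                # expand each field element to m = 4 bits
        H.append([(e >> bit) & 1 for e in gf_row])

# g(x) = (1 + x + x^4)(1 + x + x^2 + x^3 + x^4) = 1 + x^4 + x^6 + x^7 + x^8,
# and g(x) itself is a codeword of the code it generates:
c = [1, 0, 0, 0, 1, 0, 1, 1, 1] + [0] * 6

syndrome = [sum(h * x for h, x in zip(row, c)) % 2 for row in H]
print(len(H), syndrome)   # 8 rows, all-zero syndrome
```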

BCH Codes Cannot Achieve Capacity

• Theorem: There does not exist a sequence of [n, k, d] primitive BCH codes over GF(q) with both d/n and k/n bounded away from zero as n → ∞.

Decoding Binary BCH Codes

• Let C be a narrow-sense and primitive [n, k, d] BCH code over GF(2) of designed distance δ = 2τ + 1
• Let α ∈ GF(2^m) be a primitive nth root of unity, with m the smallest integer such that n | 2^m − 1
• Assume a codeword c = (c_0, ..., c_{n−1}) ∈ C is transmitted over a binary (memoryless) channel, resulting in y = (y_0, ..., y_{n−1}) = c + e with e = (e_0, ..., e_{n−1}) ∈ GF^n(2) of weight w
• Polynomials:

      c(x) = Σ_{m=0}^{n−1} c_m x^m,   y(x) = Σ_{m=0}^{n−1} y_m x^m,   e(x) = Σ_{m=0}^{n−1} e_m x^m

• The error locator polynomial Λ(z): Assume that the non-zero components of e are e_{i_1}, ..., e_{i_w}, and let

      Λ(z) = Π_{r=1}^{w} (1 − X_r z) = 1 + Σ_{r=1}^{w} Λ_r z^r

  where X_r = α^{i_r} are the error locators
• Roots of Λ(z) in GF(2^m) known ⇒ e known
• Decoding:
  1. Compute A_i = y(α^i), i = 1, ..., δ − 1
  2. Find Λ(z) from A_1, ..., A_{δ−1}
  3. Compute the roots of Λ(z) → e(x)
• Will correct all errors of weight w ≤ τ
• Polynomial (not exponential) complexity!

• Compute A_i = y(α^i), i = 1, ..., δ − 1:
  • Divide y(x) by the minimal polynomial p^(i)(x) of α^i, y(x) = q(x) p^(i)(x) + r(x), and set x = α^i in the remainder: A_i = y(α^i) = r(α^i)
  • Equivalent to computing the syndrome: with H on the form H_BCH we get

        s = H y^T = H e^T = ( y(α), y(α^3), ..., y(α^(δ−2)) )^T = ( e(α), e(α^3), ..., e(α^(δ−2)) )^T = ( A_1, A_3, ..., A_{δ−2} )^T

    and then we can get A_2 = A_1^2, A_4 = A_2^2, ..., A_{δ−1} = A_{(δ−1)/2}^2
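The three decoding steps can be sketched end-to-end for the [15, 7], τ = 2 code (a toy implementation of my own, not the lecture's). For τ = 2 the Newton identities admit a closed form — Λ_1 = A_1 and, with two errors, Λ_2 = (A_3 + A_1^3)/A_1 — so step 2 needs no general solver:

```python
# A toy decoder for the [15,7] delta = 5 binary BCH code (tau = 2);
# my own sketch, assuming GF(2^4) built from x^4 + x + 1 as before.

exp = [1]
for _ in range(14):
    v = exp[-1] << 1
    if v & 16:
        v ^= 0b10011                     # reduce by x^4 + x + 1
    exp.append(v)
log = {v: i for i, v in enumerate(exp)}

def mul(a, b):
    return 0 if 0 in (a, b) else exp[(log[a] + log[b]) % 15]

def inv(a):
    return exp[(15 - log[a]) % 15]

def decode(y):
    """Correct up to tau = 2 errors in a length-15 binary word y."""
    # Step 1: A_i = y(alpha^i) for odd i; even ones are squares of these
    A = {1: 0, 3: 0}
    for i in (1, 3):
        for j, bit in enumerate(y):
            if bit:
                A[i] ^= exp[(i * j) % 15]
    if A[1] == 0 and A[3] == 0:
        return y[:]                      # zero syndrome: assume no errors
    # Step 2: closed-form Newton identities (assumes w <= tau = 2)
    A1cubed = mul(A[1], mul(A[1], A[1]))
    if A[3] == A1cubed:
        lam = [1, A[1], 0]               # one error: Lambda(z) = 1 + A_1 z
    else:
        lam = [1, A[1], mul(A[3] ^ A1cubed, inv(A[1]))]
    # Step 3: Chien search -- error in coordinate i iff Lambda(alpha^-i) = 0
    c = y[:]
    for i in range(15):
        z = exp[(15 - i) % 15]           # alpha^{-i}
        if lam[0] ^ mul(lam[1], z) ^ mul(lam[2], mul(z, z)) == 0:
            c[i] ^= 1
    return c

cw = [1, 0, 0, 0, 1, 0, 1, 1, 1] + [0] * 6   # g(x) itself, as a codeword
y = cw[:]; y[2] ^= 1; y[10] ^= 1             # two bit errors
print(decode(y) == cw)   # True
```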

• Compute Λ(z) from A_i, i = 1, ..., δ − 1:
  • Newton's identities (tailored to this problem):

        [ 1          0          0     0     0   ···  0        ] [ Λ_1     ]   [ A_1      ]
        [ A_2        A_1        1     0     0   ···  0        ] [ Λ_2     ]   [ A_3      ]
        [ A_4        A_3        A_2   A_1   1   ···  0        ] [ Λ_3     ] = [ A_5      ]
        [ ⋮                                          ⋮        ] [ ⋮       ]   [ ⋮        ]
        [ A_{2w−4}   A_{2w−5}   ···             ···  A_{w−3}  ] [ Λ_{w−1} ]   [ A_{2w−3} ]
        [ A_{2w−2}   A_{2w−3}   ···             ···  A_{w−1}  ] [ Λ_w     ]   [ A_{2w−1} ]

    as long as w ≤ τ = (δ − 1)/2
  • {A_i} → Λ(z) is not unique ⇒ choose the Λ(z) of lowest degree
  • Not feasible for large τ ⇒ use instead the Berlekamp–Massey algorithm to find Λ(z) ...

• Find the roots of Λ(z):
  • An error in coordinate i ⇔ Λ(α^{−i}) = 0;
  • simply test Λ(α^{−i}) = 0 for i = 1, ..., n (Chien search)
• Nonbinary BCH codes: the same principles apply; R6 describes the general approach ...
• More than τ errors: the method described only works for ≤ τ = (δ − 1)/2 errors, i.e., full nearest-neighbor (NN) decoding is not implemented;
  • complete NN decoding algorithms (of polynomial complexity) are known in many cases, but often need to be tailored to specific codes ...
  • the list decoding approach: see R9
  • full-search NN decoding is always possible, but has exponential complexity ...
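For larger τ, the Berlekamp–Massey step can be sketched as follows (a minimal, unoptimized version of my own over GF(2^4), not the lecture's reference implementation; it returns the lowest-degree Λ(z) consistent with the syndromes):

```python
# Berlekamp-Massey over GF(2^4) -- a hedged sketch; the field is built
# from x^4 + x + 1 as in the earlier examples.

exp = [1]
for _ in range(14):
    v = exp[-1] << 1
    if v & 16:
        v ^= 0b10011
    exp.append(v)
log = {v: i for i, v in enumerate(exp)}

def mul(a, b):
    return 0 if 0 in (a, b) else exp[(log[a] + log[b]) % 15]

def inv(a):
    return exp[(15 - log[a]) % 15]

def berlekamp_massey(S):
    """Shortest Lambda(z) (coefficients, low order first) consistent with
    the syndromes S = [A_1, ..., A_{2 tau}], i.e. the lowest-degree
    solution of the Newton identities."""
    C, B, L, m, b = [1], [1], 0, 1, 1
    for n in range(len(S)):
        d = S[n]                          # discrepancy at step n
        for i in range(1, L + 1):
            d ^= mul(C[i], S[n - i])
        if d == 0:
            m += 1
            continue
        T = C[:]
        coef = mul(d, inv(b))
        if len(C) < len(B) + m:
            C = C + [0] * (len(B) + m - len(C))
        for i, bi in enumerate(B):        # C(z) <- C(z) - (d/b) z^m B(z)
            C[i + m] ^= mul(coef, bi)
        if 2 * L <= n:                    # register too short: lengthen it
            L, B, b, m = n + 1 - L, T, d, 1
        else:
            m += 1
    return C[:L + 1]

# Syndromes A_1..A_4 of a length-15 word with errors in coordinates 2 and 10
print(berlekamp_massey([3, 5, 13, 2]))   # [1, 3, 15]
```

For this input it returns [1, 3, 15], i.e. Λ(z) = 1 + α^4 z + α^12 z^2, whose coefficients are Λ_1 = α^2 + α^10 and Λ_2 = α^2 · α^10, matching the two error locators.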

Reed–Solomon Codes

• Definition: A Reed–Solomon (RS) code over GF(q) is a BCH code of length N = q − 1, that is,

      g(x) = (x − α^b)(x − α^(b+1)) ··· (x − α^(b+δ−2))

  for some b ≥ 0 and δ ≥ 2, and with α primitive in GF(q)
• Zeros and symbols are in the same field, GF(q)
• Dimension K = N − δ + 1
• The Singleton bound d_min ≤ N − K + 1 ⇒
  • d_min = δ
  • maximum distance separable (MDS) code

Encoding RS Codes

• RS codes are cyclic: encode as (non-binary) cyclic codes ...
• Alternative: Assume an [N, K] RS code, and let u(x) = u_0 + u_1 x + ··· + u_{K−1} x^{K−1} correspond to the message symbols u_0, ..., u_{K−1} ∈ GF(q); then

      c(x) = u(1) + u(α) x + u(α^2) x^2 + ··· + u(α^{N−1}) x^{N−1}

  is a codeword.
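The evaluation-map encoder is easy to check numerically in a prime field. A sketch with my own parameter choices (GF(11), α = 2, which is primitive mod 11, so N = 10, and K = 6): the encoded word has zeros at α^1, ..., α^{N−K}, so it lies in the b = 1 RS code with δ = N − K + 1 = 5:

```python
# Hedged sketch: evaluation encoding of an [N=10, K=6] RS code over the
# prime field GF(11), with alpha = 2 primitive mod 11 (my parameter choice).
q, alpha, N, K = 11, 2, 10, 6

def poly_eval(p, x):
    """Evaluate a polynomial (coefficients low degree first) in GF(q)."""
    return sum(c * pow(x, i, q) for i, c in enumerate(p)) % q

u = [3, 1, 4, 1, 5, 9]                              # K message symbols
c = [poly_eval(u, pow(alpha, j, q)) for j in range(N)]   # c_j = u(alpha^j)

# BCH-style zeros of the encoded word: c(alpha^i) = 0 for i = 1, ..., N - K
zeros = [poly_eval(c, pow(alpha, i, q)) for i in range(1, N - K + 1)]
print(zeros)   # [0, 0, 0, 0]
```

The check works because Σ_j α^{j(k+i)} vanishes unless k + i ≡ 0 (mod N), which cannot happen for k ≤ K − 1 and 1 ≤ i ≤ N − K.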

Decoding RS Codes

• RS codes are BCH codes: decode as non-binary BCH codes ...
• Alternatively, list decoding: see R9 ...
