Optimum MDS convolutional codes over GF(2^m) and their relation to the trace function
Ángela Barbero and Øyvind Ytrehus (UVa, Simula@UiB, UiB)
Problem setting
• Unicast transmission over the Internet
  – (Memoryless) packet erasure channel, capacity 1 − ε
• Solutions in the Internet:
  – TCP uses ARQ
    • Problem: long round-trip time (RTT) ≈ 100's of ms
      – The recovery delay of any ARQ system is large
      – Rate loss due to inexact RTT estimation
      – Delay of recovery
• If no delay constraints: ARQ is sufficient in many cases
• Applications with delay constraints: multimedia, IoT control applications, stock market applications, games
  – Better: erasure-correcting codes
Coding criteria
• Code rate close to channel capacity???
• (Low) probability of recovery failure
  – Either decoding failure: erasure pattern covers a codeword
  – Or recovery delay exceeding tolerance of application
• Recovery complexity: systematic codes?
Coding candidates
• MDS, Reed-Solomon codes: long delay (unsuited for delay-sensitive applications)
• «Rateless», fountain codes: long delay (unsuited for delay-sensitive applications)
• Convolutional codes: «good» column distance profile
  – Binary?
  – q-ary
  – Flexible rate
  – Random codes
    • What does «random» mean??
    • Good column distance profile should still apply
«Block codes are for boys, convolutional codes are for men» – J. Massey
Convolutional codes for dummies

Block code: $c = uG = (u_1, \ldots, u_k)\begin{pmatrix} g_{11} & \cdots & g_{1n} \\ \vdots & \ddots & \vdots \\ g_{k1} & \cdots & g_{kn} \end{pmatrix}$

Minimum distance $= \min\{\, w(c) : c \neq 0 \,\}$

Convolutional code: $c = uG$ with the semi-infinite generator matrix
$G = \begin{pmatrix} G_0 & G_1 & \cdots & G_L & & \\ & G_0 & G_1 & \cdots & G_L & \\ & & \ddots & & & \ddots \end{pmatrix}$,
so $c^{(0)} = u^{(0)} G_0$, $c^{(1)} = u^{(0)} G_1 + u^{(1)} G_0$, …

CDP $= (d_0, d_1, \ldots)$ with $d_j = \min\{\, w(c^{(0)}, c^{(1)}, \ldots, c^{(j)}) : c^{(0)} \neq 0 \,\}$
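A quick numerical illustration of the CDP definition above (a minimal sketch; the generator blocks G_0, G_1, G_2 are a small hypothetical binary example, not one of the codes from this talk):

```python
# Brute-force column distance profile of a small binary convolutional code.
# The generator blocks below are a hypothetical example (rate 1/2, memory 2,
# G(x) = [1 + x^2, 1 + x + x^2]), not one of the codes constructed in the talk.
from itertools import product

import numpy as np

G = [np.array([[1, 1]]),   # G_0
     np.array([[0, 1]]),   # G_1
     np.array([[1, 1]])]   # G_2
k, n = G[0].shape


def column_distance(j):
    """d_j = min weight of (c^(0), ..., c^(j)) over all inputs with u^(0) != 0,
    where c^(t) = sum_i u^(t-i) G_i over GF(2)."""
    best = None
    for bits in product([0, 1], repeat=k * (j + 1)):
        u = np.array(bits).reshape(j + 1, k)
        if not u[0].any():                      # require u^(0) != 0
            continue
        weight = 0
        for t in range(j + 1):
            c_t = sum(u[t - i] @ G[i] for i in range(len(G)) if t - i >= 0) % 2
            weight += int(c_t.sum())
        best = weight if best is None else min(best, weight)
    return best


print([column_distance(j) for j in range(5)])   # CDP of this example, starts (2, 3, ...)
```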
Convolutional codes and erasure recovery for dummies
If the CDP is (2, 3, 4, …, D), then an erasure pattern
– of weight j, and
– starting at block/time 1
will be recovered at time j iff j < D
Convolutional code approach
• H. Gluesing-Luerssen, J. Rosenthal, and R. Smarandache, Strongly-MDS Convolutional Codes, IEEE Trans. on IT 52, 2006
  – E. Gabidulin, 1989
• q-ary convolutional codes with optimum column distance profile
  – MDS convolutional codes
    • J. Rosenthal and R. Smarandache, 1998
    • cdp = (n − k + 1, 2(n − k) + 1, …, D)
  – Existence of MDS codes equivalent to existence of superregular matrices
  – Existing constructions require large fields
Our convolutional code approach
• Systematic
• Over GF(2^m)
• High rate (n − 1)/n
• MDS: CDP = (2, 3, 4, …, D, D, D, …)
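Why (2, 3, 4, …, D) is the right target: the standard column-distance bound from the Strongly-MDS literature cited on the previous slide, restated here, shows it is the best possible CDP at rate (n − 1)/n.

```latex
% Standard bound on the j-th column distance of a rate k/n convolutional code,
% restated as background; for k = n-1 it shows CDP = (2,3,4,...) is optimal.
\[
  d_j^{c} \;\le\; (n-k)(j+1) + 1 .
\]
\[
  \text{For } k = n-1:\qquad d_j^{c} \;\le\; j + 2,
  \quad\text{so the best possible CDP is } (2, 3, 4, \ldots).
\]
```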
Parity-check matrix of a convolutional code
[Figures: the sliding parity-check matrix built from the blocks H_2, H_1, H_0, and its truncations H^(0), H^(1), H^(2).]
Generator matrix of a convolutional code: G^(2) H^(2)T = 0
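A small sketch of the objects on the last few slides: building the truncated sliding matrices H^(j) and G^(j) from their blocks and checking G^(2) H^(2)T = 0. The block values are hypothetical (chosen over GF(2) so that the check goes through); the slides' own matrices are only in the figures.

```python
# Hypothetical parity-check blocks over GF(2) (not the matrices from the
# slides), used to illustrate the sliding matrices H^(j), G^(j) and the
# orthogonality G^(2) H^(2)^T = 0.
import numpy as np

n = 3                                   # rate (n-1)/n, memory 2
H = [np.array([[1, 1, 1]]),             # H_0 (last entry 1: parity position)
     np.array([[0, 1, 0]]),             # H_1 (last entry 0)
     np.array([[1, 0, 0]])]             # H_2 (last entry 0)

# Matching systematic generator blocks: G_0 = [I | h_0'], G_i = [0 | h_i'],
# where h_i' is the information part of H_i (valid over GF(2), where -1 = 1).
G = []
for i, Hi in enumerate(H):
    info = np.eye(n - 1, dtype=int) if i == 0 else np.zeros((n - 1, n - 1), dtype=int)
    G.append(np.hstack([info, Hi[0, :n - 1].reshape(-1, 1)]))


def sliding(blocks, j, lower):
    """(j+1) x (j+1) block matrix with blocks[i] on the i-th sub- (lower=True)
    or super-diagonal (lower=False); this is the truncated sliding matrix."""
    r, c = blocks[0].shape
    M = np.zeros(((j + 1) * r, (j + 1) * c), dtype=int)
    for t in range(j + 1):
        for i, B in enumerate(blocks):
            s = t - i if lower else t + i
            if 0 <= s <= j:
                M[t * r:(t + 1) * r, s * c:(s + 1) * c] = B
    return M


Hj = sliding(H, 2, lower=True)          # H^(2): H_0 on the block diagonal
Gj = sliding(G, 2, lower=False)         # G^(2): G_0 on the block diagonal
print((Gj @ Hj.T) % 2)                  # all zeros: G^(2) H^(2)^T = 0
```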
Proper minors and superregularity
[Figures, built up over several slides: definition of proper minors and of superregular matrices, ending with an example matrix over GF(2^m) with entries 0, 1, α, α².]
Our contributions
• s-superregularity
• Constructions of MDS codes with CDP = (2, 3, D), D = 4
• Efficient algorithm to search for MDS codes with CDP = (2, 3, 4, …, D), D ≥ 5
Proper minors and s-superregularity
[Figures, built up over several slides: definition of s-superregularity via proper minors, ending with an example matrix over GF(2^m) with entries 0, 1, α, α³.]
Superregularity and CDP
Known for k = 1: Gluesing-Luerssen et al. 2006, Gabidulin 1989
Binary superregular matrices?
• 1-superregularity
• 1×1: $(1)$ → (2,1) block code
• 2×2: $\begin{pmatrix}1 & 0\\ 1 & 1\end{pmatrix}$ → (2,1) conv. code, cdp = (2, 3)
• 3×3 not possible: $\begin{pmatrix}1 & 0 & 0\\ 1 & 1 & 0\\ ? & 1 & 1\end{pmatrix}$ → NO (2,1) conv. code with cdp = (2, 3, 4)
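The superregularity side of these claims can be verified by brute force. A minimal sketch, assuming the standard proper-minor condition for lower triangular matrices (rows i_1 < … < i_p, columns j_1 < … < j_p with j_l ≤ i_l for all l) and that 1-superregularity reduces to ordinary superregularity in this k = 1 setting:

```python
# Brute-force check of the two binary claims above, over GF(2).
from itertools import combinations, permutations, product


def det_gf2(M):
    """Determinant over GF(2) via the Leibniz formula (signs vanish mod 2)."""
    n = len(M)
    return sum(all(M[i][p[i]] for i in range(n)) for p in permutations(range(n))) % 2


def is_superregular_gf2(T):
    """All proper minors of the lower triangular matrix T are nonzero; a minor
    (rows i_1<...<i_p, cols j_1<...<j_p) is proper when j_l <= i_l for all l
    (the standard condition, assumed here)."""
    n = len(T)
    for p in range(1, n + 1):
        for rows in combinations(range(n), p):
            for cols in combinations(range(n), p):
                if all(col <= row for row, col in zip(rows, cols)):
                    sub = [[T[r][c] for c in cols] for r in rows]
                    if det_gf2(sub) == 0:
                        return False
    return True


# 2x2 all-ones lower triangular Toeplitz matrix: superregular.
print(is_superregular_gf2([[1, 0], [1, 1]]))                    # True

# No binary 3x3 lower triangular Toeplitz matrix is superregular.
print(any(is_superregular_gf2([[c, 0, 0], [a, c, 0], [b, a, c]])
          for c, a, b in product([0, 1], repeat=3)))            # False
```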
The problem addressed here
[Figure in slides.]
The problem addressed here: approach
Add coefficients r_{i,j}. How many layers r_{i,1}, …, r_{i,k} can be completed while maintaining s-superregularity?
If the layer r_{D,1}, …, r_{D,k} can be completed, maintaining superregularity, the corresponding code has column distances 2, 3, …, D + 2.
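The layer-by-layer idea can be written as a depth-first search. This is a structural sketch only, not the authors' algorithm: the s-superregularity test and the GF(2^m) arithmetic are abstracted into a caller-supplied predicate is_s_superregular.

```python
# Structural sketch of a layer-by-layer depth-first search; `is_s_superregular`
# (the test from the slides) is NOT implemented here and must be supplied,
# together with the list of field elements, by the caller.
from itertools import product


def extend_layers(layers, k, field, max_layers, is_s_superregular):
    """Try to append layers (r_{i,1}, ..., r_{i,k}) of field coefficients one
    at a time, keeping the partial matrix s-superregular, and return the
    deepest extension found (deeper layers = larger column distance D)."""
    if len(layers) == max_layers:
        return list(layers)
    best = list(layers)
    for candidate in product(field, repeat=k):
        trial = layers + [list(candidate)]
        if is_s_superregular(trial):            # prune as soon as the test fails
            deeper = extend_layers(trial, k, field, max_layers, is_s_superregular)
            if len(deeper) > len(best):
                best = deeper
    return best


# Example call with a placeholder predicate (always True) and GF(2) = {0, 1}:
# extend_layers([], k=2, field=[0, 1], max_layers=3,
#               is_s_superregular=lambda layers: True)
```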
Previous world records for 2^m ≥ 4
• Justesen & Hughes (1974)
• Gluesing-Luerssen et al., «Strongly MDS…», 2006
[Table of previous records in slides.]
New constructions: distance 3
Comparison with the Wyner-Ash code: [table in slides]
New constructions: distance 4
New constructions. Example 1: [figure in slides]
New constructions. Proof:
\[
  H'^{(2)} =
  \begin{pmatrix}
    1   & 1   & \cdots & 1   & 0   & 0   & \cdots & 0   & 0 & 0 & \cdots & 0 \\
    a_1 & a_2 & \cdots & a_k & 1   & 1   & \cdots & 1   & 0 & 0 & \cdots & 0 \\
    b_1 & b_2 & \cdots & b_k & a_1 & a_2 & \cdots & a_k & 1 & 1 & \cdots & 1
  \end{pmatrix}
\]
The (2^{m−1} − 1)/2^{m−1} construction
Proof: distance = 4, rate = (2^{m−1} − 1)/2^{m−1}
Computer search algorithm
[Algorithm details in figures, over several slides.]
Codes found by computer search
• Justesen & Hughes (1974)
• Gluesing-Luerssen et al., «Strongly MDS…», 2006 (implicit)
[Comparison table in slides.]
Polynomial notation for convolutional codes. Example 1: [figure in slides]
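For reference, the generic polynomial (delay-operator) notation; this is standard background stated here as an assumption, and the concrete coefficients of Example 1 appear only in the slide figure.

```latex
% Generic polynomial notation for a rate (n-1)/n convolutional code with
% memory L; the systematic form with a 1 in the parity position is a common
% convention assumed here, not necessarily the exact one used on the slide.
\[
  H(x) \;=\; H_0 + H_1 x + \cdots + H_L x^L ,
  \qquad
  c(x)\, H(x)^{\mathsf T} = 0
  \ \text{ for every codeword } c(x) = \sum_{t \ge 0} c^{(t)} x^t .
\]
\[
  \text{Systematic rate } \tfrac{n-1}{n}:\quad
  H(x) = \bigl( h_1(x), \ldots, h_{n-1}(x), \, 1 \bigr),
  \qquad h_i(x) \in \mathrm{GF}(2^m)[x].
\]
```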
Codes found by computer search
[Tables of the codes found, continued over several slides.]
Rareness
Codes found by computer search (continued)
[Further tables of codes found in the slides.]
Upper bounds
Conclusions
Questions? Comments?