
Decoding Challenge: Assessing the practical hardness of syndrome decoding for code-based cryptography
Matthieu Lequesne, Sorbonne Université, Inria Paris, team Cosmiq
February 27, 2020, "All you ever wanted to know about code-based crypto"


  1. Binary Syndrome Decoding Problem
  From now on, we focus on the binary case $q = 2$.
  [Figure: a parity-check matrix $H$ of size $(n-k) \times n$, a syndrome $s$ of length $n-k$, and an error $e$ of Hamming weight $w$.]
  Find $w$ columns of $H$ adding to $s$ (equivalently, find $e$ of Hamming weight $w$ such that $eH^T = s$).
  The next slides of this section are reproduced from Nicolas Sendrier's MOOC "Code Based Cryptography" with his authorization.
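
To make the objects concrete, here is a minimal Python sketch (my own illustration, not part of the slides) of a binary SD instance $(H, s, w)$ and of the check that a candidate set of $w$ columns of $H$ sums to $s$ modulo 2; the helper name `is_solution` and the toy sizes are arbitrary.

```python
# A minimal sketch (not from the slides): representing a binary SD instance
# (H, s, w) and checking that a candidate set of w columns of H sums to s.
import numpy as np

def is_solution(H: np.ndarray, s: np.ndarray, cols: list[int]) -> bool:
    """True iff the selected columns of H add up to s modulo 2."""
    return bool(np.array_equal(H[:, cols].sum(axis=1) % 2, s % 2))

# Toy instance: plant an error e of weight w = 3 and compute its syndrome.
rng = np.random.default_rng(0)
H = rng.integers(0, 2, size=(8, 16))      # (n - k) x n parity-check matrix
e = np.zeros(16, dtype=int); e[[2, 9, 13]] = 1
s = e @ H.T % 2                           # syndrome s = e H^T
print(is_solution(H, s, [2, 9, 13]))      # True
```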

  2-7. Number of solutions
  Fix $n$ and $k$, and let $w$ grow:
  ◮ for $w$ below the Gilbert-Varshamov radius $d_{GV}$, there is at most one solution;
  ◮ for larger $w$, there are $\binom{n}{w}/2^{n-k}$ solutions on average.
  The Gilbert-Varshamov radius $d_{GV}$ is defined by $\binom{n}{d_{GV}} = 2^{n-k}$.
  In cryptanalysis, we only consider situations where there is a solution.
  We expect approximately $\max\big(1, \binom{n}{w}/2^{n-k}\big)$ solutions.
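
A small Python sketch (illustration only, not from the slides) of these quantities: the Gilbert-Varshamov radius is taken here as the smallest $w$ with $\binom{n}{w} \ge 2^{n-k}$, and the expected number of solutions is $\max(1, \binom{n}{w}/2^{n-k})$.

```python
# A minimal sketch (not from the slides): Gilbert-Varshamov radius and
# expected number of solutions for given n, k, w.
from math import comb

def gv_radius(n: int, k: int) -> int:
    """Smallest w such that C(n, w) >= 2^(n - k), an approximation of d_GV."""
    target = 2 ** (n - k)
    w = 0
    while comb(n, w) < target:
        w += 1
    return w

def expected_solutions(n: int, k: int, w: int) -> float:
    """Expected number of solutions: max(1, C(n, w) / 2^(n - k))."""
    return max(1.0, comb(n, w) / 2 ** (n - k))

n, k = 450, 225            # an illustrative rate-1/2 instance (n = 450 appears later)
w = gv_radius(n, k)
print(w, expected_solutions(n, k, w))
```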

  8-10. Exhaustive search
  Problem: find $w$ columns of $H = (h_1, h_2, \dots, h_n)$ adding to $s$ (modulo 2).
  Enumerate all $w$-tuples $(j_1, j_2, \dots, j_w)$ such that $1 \le j_1 < j_2 < \dots < j_w \le n$.
  Check whether $s + h_{j_1} + h_{j_2} + \dots + h_{j_w} = 0$.
  Cost: about $\binom{n}{w}$ column operations.
  Remark: we obtain all solutions.
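
A direct Python sketch of this exhaustive search (my own toy code, feasible only for very small parameters): it enumerates all $w$-subsets of columns with `itertools.combinations` and keeps those summing to $s$ modulo 2.

```python
# A minimal sketch (my own illustration, not the authors' code): exhaustive
# search for w columns of a binary parity-check matrix H summing to s.
from itertools import combinations
import numpy as np

def exhaustive_decode(H: np.ndarray, s: np.ndarray, w: int):
    """Yield all index tuples (j_1, ..., j_w) whose columns sum to s mod 2."""
    n = H.shape[1]
    for cols in combinations(range(n), w):
        if not np.any((H[:, cols].sum(axis=1) + s) % 2):
            yield cols

# Toy usage on a random instance (cryptographic sizes are far out of reach).
rng = np.random.default_rng(0)
H = rng.integers(0, 2, size=(10, 20))
e = np.zeros(20, dtype=int); e[[1, 5, 7]] = 1
s = H @ e % 2
print(next(exhaustive_decode(H, s, 3)))
```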

  11-13. Birthday algorithm
  Problem: find $w$ columns of $H = (H_1 \mid H_2)$ adding to $s$ (modulo 2).
  Idea: split $H$ into two equal parts and enumerate the two following sets:
  $\mathcal{L}_1 = \{\, e_1 H_1^T : |e_1| = w/2 \,\}$ and $\mathcal{L}_2 = \{\, s + e_2 H_2^T : |e_2| = w/2 \,\}$.
  If $\mathcal{L}_1 \cap \mathcal{L}_2 \neq \emptyset$, we have solution(s): $s + e_1 H_1^T + e_2 H_2^T = 0$.
  Cost: requires about $2L + L^2/2^{n-k}$ column operations, where $L = |\mathcal{L}_1| = |\mathcal{L}_2|$.

  14-18. Birthday algorithm
  Compute $\mathcal{L}_1 \cap \mathcal{L}_2 = \{\, e_1 H_1^T : |e_1| = w/2 \,\} \cap \{\, s + e_2 H_2^T : |e_2| = w/2 \,\}$ as follows:
  for all $e_1$ of weight $w/2$: $x \leftarrow e_1 H_1^T$; $T[x] \leftarrow T[x] \cup \{e_1\}$
  for all $e_2$ of weight $w/2$: $x \leftarrow s + e_2 H_2^T$; for all $e_1 \in T[x]$: $I \leftarrow I \cup \{(e_1, e_2)\}$
  return $I$
  Total cost: $\binom{n/2}{w/2} + \binom{n/2}{w/2} + \binom{n/2}{w/2}^2 / 2^{n-k}$, i.e. $|\mathcal{L}_1| + |\mathcal{L}_2| + |\mathcal{L}_1|\cdot|\mathcal{L}_2| / 2^{n-k}$.
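
A Python sketch of this birthday decoder (my own illustration of the pseudocode above, not the authors' implementation): the table $T$ is a Python dict keyed by the partial syndrome, and, as in the slides, only solutions whose error splits evenly ($w/2$ positions on each half) are found.

```python
# A minimal sketch (illustration only) of the birthday / meet-in-the-middle
# decoder: split H = (H1 | H2), tabulate e1*H1^T, look up s + e2*H2^T.
from itertools import combinations
from collections import defaultdict
import numpy as np

def birthday_decode(H: np.ndarray, s: np.ndarray, w: int):
    r, n = H.shape
    H1, H2 = H[:, : n // 2], H[:, n // 2 :]
    table = defaultdict(list)
    # Left list: x = e1 * H1^T for all e1 of weight w/2 on the first half.
    for cols1 in combinations(range(n // 2), w // 2):
        x = tuple(H1[:, cols1].sum(axis=1) % 2)
        table[x].append(cols1)
    # Right list: look up s + e2 * H2^T for all e2 of weight w/2 on the second half.
    solutions = []
    for cols2 in combinations(range(n // 2), w // 2):
        x = tuple((H2[:, cols2].sum(axis=1) + s) % 2)
        for cols1 in table[x]:
            solutions.append(cols1 + tuple(n // 2 + j for j in cols2))
    return solutions

# Toy usage: plant an error with w/2 ones on each half and recover it.
rng = np.random.default_rng(0)
H = rng.integers(0, 2, (10, 20))
e = np.zeros(20, dtype=int); e[[1, 4, 12, 17]] = 1
s = e @ H.T % 2
print(birthday_decode(H, s, 4))
```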

  19-21. Birthday algorithm
  One particular error of Hamming weight $w$ splits evenly with probability
  $P = \binom{n/2}{w/2}^2 \big/ \binom{n}{w}$.
  We may have to repeat with $H$ divided in several different ways, or more generally by picking the two halves randomly.
  Repeat $1/P$ times to get most solutions. Cost: $\mathcal{O}\big(\sqrt{\binom{n}{w}}\,\big)$.
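
A quick numeric illustration of this probability (my own snippet): for an example rate-1/2 instance it prints $P = \binom{n/2}{w/2}^2/\binom{n}{w}$ and the expected number $1/P$ of random re-splits.

```python
# Illustration only: the even-split probability P = C(n/2, w/2)^2 / C(n, w)
# and the expected number 1/P of repetitions with fresh random splits.
from math import comb

def split_probability(n: int, w: int) -> float:
    return comb(n // 2, w // 2) ** 2 / comb(n, w)

n, w = 450, 50        # illustrative even parameters, roughly rate 1/2
P = split_probability(n, w)
print(f"P = {P:.3e}, expected repetitions 1/P = {1/P:.1f}")
```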

  22-24. The power of linear algebra
  Until here, we have not used linear algebra!
  For any invertible $U \in \{0,1\}^{(n-k)\times(n-k)}$ and any permutation matrix $P \in \{0,1\}^{n \times n}$:
  $eH^T = s \iff e'H'^T = s'$, where $H' \leftarrow UHP$, $s' \leftarrow sU^T$, $e' \leftarrow eP$.
  Proof: $e'H'^T = (eP)(UHP)^T = (eP)P^T H^T U^T = eH^T U^T = sU^T = s'$.
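
A small numerical sanity check of this equivalence (illustration only): it draws a random instance, a random permutation matrix $P$ and a random invertible $U$ over GF(2), and verifies $e'H'^T = s'$ with numpy.

```python
# A small numerical check (illustration only) of the equivalence above:
# with H' = U H P, s' = s U^T, e' = e P, we have e' H'^T = s' iff e H^T = s.
import numpy as np

rng = np.random.default_rng(1)
n, r = 12, 6
H = rng.integers(0, 2, (r, n))
e = rng.integers(0, 2, n)
s = e @ H.T % 2

# Random permutation matrix P and random invertible U over GF(2).
P = np.eye(n, dtype=int)[rng.permutation(n)]
while True:
    U = rng.integers(0, 2, (r, r))
    if round(abs(np.linalg.det(U))) % 2 == 1:   # odd determinant => invertible mod 2
        break

H2, s2, e2 = U @ H @ P % 2, s @ U.T % 2, e @ P % 2
assert np.array_equal(e2 @ H2.T % 2, s2)
```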

  25-28. Prange's algorithm
  Idea: perform a Gaussian elimination and hope that all the errors are in positions corresponding to the identity part!
  $H' = UHP = (\,\mathbf{1}\mid *\,)$, with the identity on the first $n-k$ columns, and $s' = sU^T$
  (possible if the first $n-k$ columns of $HP$ are independent).
  If the permuted error $e' = eP$ has all of its weight $w$ on the first $n-k$ positions and $0$ elsewhere, then $e' = (s', 0)$.

  29-32. Prange's algorithm
  Repeat:
  1. Pick a permutation matrix $P$.
  2. Compute $UHP = (\,\mathbf{1}\mid *\,)$.
  3. If $\mathrm{wt}(sU^T) = w$ then return $(sU^T, 0)\,P^{-1}$.
  Cost of one iteration: $K = n(n-k)$ column operations.
  Success probability: $P = \binom{n-k}{w} \big/ \binom{n}{w}$.
  Total cost $= K/P$.
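
A Python sketch of Prange's algorithm as described above (my own toy implementation, usable only for tiny instances): the helper `gauss_gf2` performs the Gaussian elimination over GF(2); when it fails (first $n-k$ permuted columns not independent) we simply pick a new permutation.

```python
# A minimal sketch (illustration only) of Prange's algorithm: permute the
# columns, reduce the first n-k columns of H to the identity, and hope that
# the permuted error is entirely supported on those n-k positions.
import numpy as np

def gauss_gf2(H, s):
    """Try to reduce the first r columns of (H | s) to the identity over GF(2).
    Returns (H', s') on success, or None if the first r columns are singular."""
    r, n = H.shape
    A = np.concatenate([H, s.reshape(-1, 1)], axis=1) % 2
    for i in range(r):
        pivot = next((j for j in range(i, r) if A[j, i] == 1), None)
        if pivot is None:
            return None
        A[[i, pivot]] = A[[pivot, i]]
        for j in range(r):
            if j != i and A[j, i] == 1:
                A[j] ^= A[i]
    return A[:, :n], A[:, n]

def prange(H, s, w, rng=np.random.default_rng()):
    r, n = H.shape
    while True:
        perm = rng.permutation(n)                  # step 1: pick a permutation
        red = gauss_gf2(H[:, perm], s)             # step 2: Gaussian elimination
        if red is None:
            continue
        _, s2 = red
        if s2.sum() == w:                          # step 3: wt(s U^T) = w ?
            e = np.zeros(n, dtype=int)
            e[perm[:r]] = s2                       # undo the permutation
            return e

# Toy usage: plant an error of weight 3 and recover a weight-3 solution.
rng = np.random.default_rng(0)
H = rng.integers(0, 2, (8, 16))
e = np.zeros(16, dtype=int); e[[3, 7, 11]] = 1
s = e @ H.T % 2
e_found = prange(H, s, 3)
assert np.array_equal(e_found @ H.T % 2, s) and e_found.sum() == 3
```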

  33-35. Stern and Dumer's algorithm
  After a permutation and a partial Gaussian elimination:
  $UHP = \begin{pmatrix} \mathbf{1} & H'' \\ 0 & H' \end{pmatrix}$, $\quad Us = \begin{pmatrix} s'' \\ s' \end{pmatrix}$,
  where the identity block has size $n-k-\ell$, $H''$ and $H'$ have $k+\ell$ columns, $H'$ has $\ell$ rows, and the permuted error has weight $w-p$ on the first $n-k-\ell$ positions and weight $p$ on the remaining $k+\ell$ (Step 2 works on $(H', s')$, Step 3 on $(H'', s'')$).
  Repeat:
  1. Permutation + partial Gaussian elimination.
  2. Find many $e'$ such that $|e'| = p$ and $H'e' = s'$.
  3. For all good $e'$, test $|s'' + H''e'| \le w - p$.
  Step 2 is Birthday Decoding (or whatever is best); Step 3 is (a kind of) Prange; total cost is minimized over $\ell$ and $p$.

  36-40. Stern and Dumer's algorithm
  Iteration cost:
  $K = n(n-k-\ell) \;+\; 2\sqrt{\binom{k+\ell}{p}} + \binom{k+\ell}{p}\big/2^\ell \;+\; \binom{k+\ell}{p}\big/2^\ell$,
  where the first term is the Gaussian elimination, the middle two terms are the Birthday decoding of $(H', s')$ at weight $p$, and the last term is the final check of the surviving candidates.
  Success probability: $P = \binom{k+\ell}{p}\binom{n-k-\ell}{w-p} \big/ \binom{n}{w}$.
  Total cost $= K/P$, minimized over $p$ and $\ell$.
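
A rough Python cost estimator (my own snippet, based on the formulas as reconstructed above, so the constants should be taken with a grain of salt): it evaluates $K/P$ for all integer $p$ and $\ell$ and reports the best pair.

```python
# A rough cost estimator (illustration only) following the formulas above:
# iteration cost K and success probability P for Stern/Dumer, minimized
# over the parameters p and l. Column operations only, constants approximate.
from math import comb, sqrt, log2

def dumer_cost(n: int, k: int, w: int):
    best = (float("inf"), 0, 0)
    for l in range(0, n - k):
        for p in range(0, min(w, k + l) + 1):
            if w - p > n - k - l:
                continue
            cands = comb(k + l, p) / 2 ** l          # expected surviving candidates
            K = n * (n - k - l) + 2 * sqrt(comb(k + l, p)) + 2 * cands
            P = comb(k + l, p) * comb(n - k - l, w - p) / comb(n, w)
            if P > 0 and K / P < best[0]:
                best = (K / P, p, l)
    return best

cost, p, l = dumer_cost(450, 225, 50)   # illustrative rate-1/2 instance near d_GV
print(f"~2^{log2(cost):.1f} column operations with p = {p}, l = {l}")
```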

  41. More advanced algorithms
  ◮ Improved Birthday Decoding: overlapping support.
  ◮ Representations.
  ◮ Recursive Birthday Decoding.
  ◮ Decoding One Out of Many.
  ◮ Nearest Neighbour approach.

  42-43. Complexity: theoretical asymptotic exponent
  The best algorithm solves SD(n, w, R) in $2^{c \cdot n}$ operations, with:
  1962: c = 0.121 [Pra62]
  1988: c = 0.117 [Ste88, Dum89]
  2011: c = 0.112 [MMT11]
  2012: c = 0.102 [BJMM12]
  2017: c = 0.095 [MO15, BM17]
  2018: c = 0.089 [BM18]
  for $w = d_{GV}$ and the worst choice of $k$.
  Practical complexity?
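
As an illustration of these exponents: at a code length of $n = 1000$ (with $w = d_{GV}$ and the worst-case rate), Prange's 1962 exponent gives roughly $2^{0.121 \cdot 1000} = 2^{121}$ operations, while the 2018 exponent gives roughly $2^{0.089 \cdot 1000} = 2^{89}$, a gain of about $2^{32}$; these are asymptotic estimates that ignore polynomial factors.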

  44. The Decoding Challenge

  45. decodingchallenge.org

  46-47. The Decoding Challenge
  Launched in August 2019 by Aragon, Lavauzelle and L.
  Goal:
  ◮ assess the practical complexity of problems in coding theory;
  ◮ motivate the implementation of ISD algorithms;
  ◮ increase the confidence in code-based crypto.
  Concept:
  ◮ 4 categories of challenges;
  ◮ instances of increasing size;
  ◮ a hall of fame.

  48. 4 categories of challenges
  2 generic problems:
  ◮ Syndrome Decoding: $k/n = 0.5$ and $w = d_{GV}$;
  ◮ Finding the Lowest Weight Codeword: $k/n = 0.5$ and $n$ of cryptographic size.
  2 problems based on schemes in the NIST competition:
  ◮ Goppa-McEliece: $k/n = 0.8$ and $w = (n-k)/\log_2(n)$;
  ◮ QC-MDPC: $k/n = 0.5$ and $w = \sqrt{n}$.
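
For illustration, a small Python helper (not official challenge code) that evaluates the target weights listed above for a given length $n$; the category labels and the choice $k = \lfloor Rn \rfloor$ are mine.

```python
# Illustration only: the target weights of the challenge categories listed
# above, for a given code length n (formulas from the slide, labels mine).
from math import comb, log2, isqrt

def gv_radius(n: int, k: int) -> int:
    """Smallest w with C(n, w) >= 2^(n-k), approximating d_GV."""
    w = 0
    while comb(n, w) < 2 ** (n - k):
        w += 1
    return w

def challenge_weights(n: int) -> dict:
    return {
        "syndrome decoding (R = 0.5)": gv_radius(n, n // 2),
        "Goppa-McEliece (R = 0.8)": round((n - round(0.8 * n)) / log2(n)),
        "QC-MDPC (R = 0.5)": isqrt(n),
    }

print(challenge_weights(1024))
```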

  49. Questions raised by implementation
  Based on previous work from Landais, Sendrier, Meurer and Hochbach, and recent work from Vasseur, Couvreur, Kunz and L.
  ◮ Choice of parameters: $p$, $\ell$, $\varepsilon$... must be integers!
  ◮ Random shuffle vs. Canteaut-Chabaud.
  ◮ Birthday algorithm: sort vs. hash table.
  ◮ Allowing overlap? Early abort? ...
  It's not just about asymptotic exponents anymore!

  50-51. Try the Challenge!
  decodingchallenge.org
  How to contribute?
  ◮ Solve some challenges!
  ◮ Talk about the project to other people.
  ◮ Propose this as a student project.
  ◮ Contact us if you want to help.
  Current leader of the Hall of Fame: Valentin Vasseur, $n = 450$ (for SD), $\simeq 2^{47}$ operations (Dumer).
  You dream of reading your name in a Hall of Fame? This is the chance of a lifetime!

  52. Future challenges
  We intend to propose other categories of challenges:
  ◮ rank-metric Syndrome Decoding;
  ◮ $q$-ary Syndrome Decoding in Hamming metric;
  ◮ $q$-ary Syndrome Decoding in Hamming metric with large weight.

  53. q-ary Syndrome Decoding

  54-56. Binary vs. ternary
  Decoding Challenge for R = 1/2. [Plots not reproduced in this transcript.]

  57. Binary vs. ternary
  Decoding Challenge for R = 1/5. [Plot not reproduced in this transcript.]
