Generalization of the Ball-Collision Algorithm


1. Generalization of the Ball-Collision Algorithm
Violetta Weger, joint work with Carmelo Interlando, Karan Khathuria, Nicole Rohrer and Joachim Rosenthal
University of Zurich
7th Code-Based Cryptography Workshop, 19 May 2019

2. Outline
1 Motivation
2 Introduction
3 Prange’s Algorithm
4 Improvements overview
5 Ball-collision Algorithm
6 New directions
7 Comparison of Complexities
8 Open questions

3.-4. Motivation
When proposing a code-based cryptosystem, both structural attacks and non-structural attacks have to be considered; among the non-structural attacks one has to consider Information Set Decoding (ISD).

5. ISD algorithms and the syndrome decoding problem
1978, Berlekamp, McEliece and van Tilborg: decoding a random linear code is NP-complete.
Problem (Syndrome decoding problem): Given a parity check matrix H of a (binary) code of length n and dimension k, a syndrome s = Hx^⊺ ∈ F_2^{n−k} and the error correction capacity t, we want to find e ∈ F_2^n of weight t such that s = He^⊺.
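
As a concrete illustration (not part of the slides), the following Python sketch sets up a toy instance of the syndrome decoding problem over F_2 and solves it by brute force over all weight-t error patterns; the function name brute_force_syndrome_decode and the choice of the [7,4] Hamming parity check matrix are assumptions made only for this example.

```python
import itertools
import numpy as np

# Toy illustration of the syndrome decoding problem over F_2 (not from the
# slides): brute-force search for an error vector e of weight t with H e^T = s.
def brute_force_syndrome_decode(H, s, t):
    n = H.shape[1]
    for support in itertools.combinations(range(n), t):
        e = np.zeros(n, dtype=int)
        e[list(support)] = 1
        if np.array_equal(H.dot(e) % 2, s):
            return e
    return None

# Example: [7,4] Hamming code parity check matrix, single error (t = 1).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
e_true = np.zeros(7, dtype=int); e_true[4] = 1
s = H.dot(e_true) % 2                        # syndrome s = H e^T
print(brute_force_syndrome_decode(H, s, 1))  # recovers e_true
```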

6. ISD algorithms and the syndrome decoding problem
• The syndrome decoding problem is equivalent to the decoding problem.
Problem (Decoding problem): Given a generator matrix G of a (binary) code of length n and dimension k, a corrupted codeword c = mG + e ∈ F_2^n and the error correction capacity t, we want to find e ∈ F_2^n of weight t.
• It is also equivalent to finding a minimum weight codeword, since in C + {0, c} the error vector e is now the minimum weight codeword.
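
A small numerical sanity check (my own sketch, not from the talk) of why the two formulations carry the same information: a codeword mG has zero syndrome, so the syndrome of the corrupted codeword c = mG + e equals the syndrome of the error e alone. The systematic pair G = (Id_k | A), H = (A^⊺ | Id_{n−k}) below is an assumed toy construction.

```python
import numpy as np

# Illustration: the syndrome of c = mG + e only depends on the error e.
rng = np.random.default_rng(0)
k, n = 4, 8
A = rng.integers(0, 2, size=(k, n - k))
G = np.concatenate([np.eye(k, dtype=int), A], axis=1)        # G = (Id_k | A)
H = np.concatenate([A.T, np.eye(n - k, dtype=int)], axis=1)  # H = (A^T | Id_{n-k})
assert not (G.dot(H.T) % 2).any()                            # G H^T = 0 over F_2

m = rng.integers(0, 2, size=k)
e = np.zeros(n, dtype=int); e[5] = 1                         # weight-1 error
c = (m.dot(G) + e) % 2                                       # corrupted codeword
print(np.array_equal(H.dot(c) % 2, H.dot(e) % 2))            # True: H c^T = H e^T
```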

7.-8. Information set
Notation: Let c ∈ F_q^n and A ∈ F_q^{k×n}, and let S ⊂ {1, ..., n}. We denote by c_S the restriction of c to the entries indexed by S and by A_S the columns of A indexed by S. For a code C ⊂ F_q^n we denote C_S = { c_S | c ∈ C }.
Definition (Information set): Let C ⊂ F_q^n be a code of dimension k. If I ⊂ {1, ..., n} of size k is such that |C| = |C_I|, then we call I an information set of C.

9.-10. Information set
Definition (Information set): Let G be the k × n generator matrix of C. If I ⊂ {1, ..., n} of size k is such that G_I is invertible, then I is an information set of C.
Definition (Information set): Let H be the (n−k) × n parity check matrix of C. If I ⊂ {1, ..., n} of size k is such that H_{I^c} is invertible, then I is an information set of C.
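
The generator-matrix characterization suggests a direct way to test candidate information sets in code. The sketch below (hypothetical helpers rank_f2 and is_information_set, not from the slides) checks whether G_I is invertible over F_2 by Gaussian elimination.

```python
import numpy as np

def rank_f2(M):
    # Gaussian elimination over F_2; returns the rank of M.
    M = M.copy() % 2
    rank = 0
    for col in range(M.shape[1]):
        pivots = np.nonzero(M[rank:, col])[0]
        if pivots.size == 0:
            continue
        piv = rank + pivots[0]
        M[[rank, piv]] = M[[piv, rank]]      # move pivot row up
        rows = np.nonzero(M[:, col])[0]
        rows = rows[rows != rank]
        M[rows] = (M[rows] + M[rank]) % 2    # clear the column
        rank += 1
        if rank == M.shape[0]:
            break
    return rank

def is_information_set(G, I):
    # I is an information set iff the k columns G_I are invertible over F_2.
    k = G.shape[0]
    return len(I) == k and rank_f2(G[:, list(I)]) == k

# Example: generator matrix of a small [6, 3] binary code.
G = np.array([[1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])
print(is_information_set(G, {0, 1, 2}))   # True  (identity block)
print(is_information_set(G, {3, 4, 5}))   # False (these columns are singular)
```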

11. Prange’s algorithm
1962, Prange proposes the first ISD algorithm.
Assumption: All t errors occur outside of the information set.
Input: H ∈ F_2^{(n−k)×n}, s ∈ F_2^{n−k}, t ∈ N.
Output: e ∈ F_2^n with wt(e) = t and He^⊺ = s.
1 Choose an information set I ⊂ {1, ..., n} of size k.
2 Find an invertible matrix U ∈ F_2^{(n−k)×(n−k)} such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}.
3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.
4 Else start over.

12.-14. Prange’s algorithm
1 Choose an information set I ⊂ {1, ..., n} of size k.
2 Find an invertible matrix U ∈ F_2^{(n−k)×(n−k)} such that (UH)_I = A and (UH)_{I^c} = Id_{n−k}.
3 If wt(Us) = t, then e_I = 0 and e_{I^c} = Us.
4 Else start over.
Let us assume for simplicity that I = {1, ..., k}. Then
UH = ( A  Id_{n−k} ),
hence
UHe^⊺ = ( A  Id_{n−k} ) ( 0  e_{I^c} )^⊺ = Us,
from which we get the condition e_{I^c} = Us.
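
Putting the steps together, here is a rough Python sketch of Prange’s algorithm as described above (an illustrative reconstruction with my own naming, not the speaker’s code): instead of computing U explicitly, it row-reduces (H_{I^c} | H_I | s) so that the columns indexed by I^c become the identity, which applies U to H and s at the same time.

```python
import numpy as np

def prange_isd(H, s, t, max_iter=10**5, rng=np.random.default_rng(0)):
    n_k, n = H.shape
    k = n - n_k
    for _ in range(max_iter):
        # 1. Pick a candidate information set I (size k) at random.
        perm = rng.permutation(n)
        I, Ic = perm[:k], perm[k:]
        # 2. Row-reduce [H_{I^c} | H_I | s] so the I^c-columns become the
        #    identity; this implicitly applies the matrix U to H and s.
        M = np.concatenate([H[:, Ic], H[:, I], s.reshape(-1, 1)], axis=1) % 2
        ok = True
        for col in range(n_k):
            pivot_rows = np.nonzero(M[col:, col])[0]
            if pivot_rows.size == 0:
                ok = False                       # H_{I^c} not invertible
                break
            piv = col + pivot_rows[0]
            M[[col, piv]] = M[[piv, col]]        # swap pivot row up
            rows = np.nonzero(M[:, col])[0]
            rows = rows[rows != col]
            M[rows] = (M[rows] + M[col]) % 2     # eliminate the column
        if not ok:
            continue
        Us = M[:, -1]                            # transformed syndrome Us
        # 3. If wt(Us) = t, all errors sit outside the information set.
        if Us.sum() == t:
            e = np.zeros(n, dtype=int)
            e[Ic] = Us                           # e_I = 0, e_{I^c} = Us
            return e
        # 4. Otherwise start over with a fresh information set.
    return None

# Tiny usage example: random [12, 6] binary code and a weight-2 error; the
# output is some e with H e^T = s and wt(e) = 2, not necessarily e_true.
rng = np.random.default_rng(1)
H = rng.integers(0, 2, size=(6, 12))
e_true = np.zeros(12, dtype=int); e_true[[2, 9]] = 1
s = H.dot(e_true) % 2
print(prange_isd(H, s, 2))
```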

15.-17. Prange’s algorithm
The cost of an ISD algorithm is given by the product of the cost of one iteration and the inverted success probability, i.e. the average number of iterations needed. The success probability is given by the weight distribution of the error vector.
Example (Success probability of Prange’s algorithm):
( n−k choose t ) ( n choose t )^{−1}.
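
To get a feel for the numbers, the following snippet (my own illustration; the parameters n = 1024, k = 524, t = 50 are just an example, roughly the original McEliece ones) evaluates this success probability and the resulting average number of iterations.

```python
from math import comb

# Success probability of one Prange iteration and the expected number of
# iterations (its inverse), for illustrative parameters.
def prange_success_probability(n, k, t):
    return comb(n - k, t) / comb(n, t)

n, k, t = 1024, 524, 50
p = prange_success_probability(n, k, t)
print(f"success probability per iteration: {p:.3e}")
print(f"expected number of iterations:     {1 / p:.3e}")
```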

18. Improvements Overview

19. Ball-collision Algorithm

20.-22. Ball-collision Algorithm
1 Choose an information set I.
2 Partition I into X_1 and X_2.
3 Partition Y into Y_1, Y_2, Y_3.
4 Bring H in systematic form.
Let us assume for simplicity that I = {1, ..., k}.

23.-24. Ball-collision Algorithm
1 Choose an information set I.
2 Partition I into X_1 and X_2.
3 Partition Y into Y_1, Y_2, Y_3.
4 Bring H in systematic form.
UHe^⊺ = ( A_1  Id_{ℓ_1+ℓ_2}  0 ; A_2  0  Id_{ℓ_3} ) ( e_1  e_2  e_3 )^⊺ = ( s_1  s_2 )^⊺ = Us.
We get the conditions
A_1 e_1 + e_2 = s_1,
A_2 e_1 + e_3 = s_2.

25.-27. Ball-collision Algorithm
Conditions:
A_1 e_1 + e_2 = s_1,
A_2 e_1 + e_3 = s_2.
Assumptions:
a e_1 has support in I = X_1 ∪ X_2 and weight 2v
b e_2 has support in Y_1 ∪ Y_2 and weight 2w
c e_3 has support in Y_3 and weight t − 2v − 2w

28. Ball-collision Algorithm
A_1 e_1 + e_2 = s_1,   (1)
A_2 e_1 + e_3 = s_2.   (2)
For condition (1): go through all choices of e_1 and e_2 and check with a collision whether (1) is satisfied.
For condition (2): define e_3 = s_2 − A_2 e_1 and check if e_3 has weight t − 2v − 2w.
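
Below is a hedged Python sketch of this collision step for a single iteration, assuming the systematic form from the previous slides has already been computed, so A_1, A_2, s_1, s_2 and the parameters v, w, t are given; the function name ball_collision_iteration and the even split of the weights over the two halves of the index sets are my own choices for illustration.

```python
import itertools
import numpy as np

def ball_collision_iteration(A1, A2, s1, s2, v, w, t):
    """Look for (e1, e2, e3) with A1 e1 + e2 = s1, A2 e1 + e3 = s2,
    where e1 has weight v on each half of the k information positions,
    e2 has weight w on each half of the ell redundancy positions Y1 ∪ Y2,
    and e3 has weight t - 2v - 2w."""
    ell, k = A1.shape
    kL, ellL = k // 2, ell // 2          # sizes of X1 and Y1

    def weight_vectors(length, weight):
        # All 0/1 vectors of the given length and Hamming weight.
        for supp in itertools.combinations(range(length), weight):
            x = np.zeros(length, dtype=int)
            x[list(supp)] = 1
            yield x

    def key_of(x):
        # Canonical hashable key for a vector over F_2.
        return (x % 2).astype(np.uint8).tobytes()

    # Left lists: A1^(1) e1^(1) + e2^(1) for all left halves.
    left = {}
    for e1L in weight_vectors(kL, v):
        base = A1[:, :kL].dot(e1L) % 2
        for e2L in weight_vectors(ellL, w):
            key = key_of(base + np.concatenate([e2L, np.zeros(ell - ellL, dtype=int)]))
            left.setdefault(key, []).append((e1L, e2L))

    # Right side: collide with s1 + A1^(2) e1^(2) + e2^(2) (condition (1)).
    for e1R in weight_vectors(k - kL, v):
        base = (s1 + A1[:, kL:].dot(e1R)) % 2
        for e2R in weight_vectors(ell - ellL, w):
            key = key_of(base + np.concatenate([np.zeros(ellL, dtype=int), e2R]))
            for e1L, e2L in left.get(key, []):
                e1 = np.concatenate([e1L, e1R])
                # Condition (2): e3 is determined; only its weight is checked.
                e3 = (s2 + A2.dot(e1)) % 2       # over F_2, -A2 e1 = A2 e1
                if e3.sum() == t - 2 * v - 2 * w:
                    e2 = np.concatenate([e2L, e2R])
                    return e1, e2, e3
    return None                                  # restart with a new I
```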

29. Ball-collision Algorithm
Success probability:
( ⌊k/2⌋ choose v ) ( ⌈k/2⌉ choose v ) ( ⌊ℓ/2⌋ choose w ) ( ⌈ℓ/2⌉ choose w ) ( n−k−ℓ choose t−2v−2w ) ( n choose t )^{−1}.
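
For completeness, the formula is easy to evaluate; the snippet below (my own illustration, with made-up parameters) computes it and recovers Prange’s success probability as the special case ℓ = v = w = 0.

```python
from math import comb

# Success probability of one ball-collision iteration, as on the slide.
def ball_collision_success(n, k, l, t, v, w):
    return (comb(k // 2, v) * comb((k + 1) // 2, v)
            * comb(l // 2, w) * comb((l + 1) // 2, w)
            * comb(n - k - l, t - 2 * v - 2 * w)) / comb(n, t)

n, k, t = 1024, 524, 50
print(ball_collision_success(n, k, 0, t, 0, 0))   # l = v = w = 0 gives Prange
print(ball_collision_success(n, k, 20, t, 2, 1))  # one non-trivial parameter choice
```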
