MDS Matrices with Lightweight Circuits
Sébastien Duval, Gaëtan Leurent (PowerPoint presentation)

  1. MDS Matrices with Lightweight Circuits
     Sébastien Duval, Gaëtan Leurent
     March 26, 2019

  2. SPN Ciphers (2/18)
     Outline: Introduction, Objective, Search Algorithm, Instantiating the Results, Comparison with the Literature.
     Shannon's criteria:
     1. Diffusion: every bit of the plaintext and key must affect every bit of the output. Usually implemented with linear functions.
     2. Confusion: the relation between plaintext and ciphertext must be intractable. Requires non-linear operations, often implemented with tables (S-boxes).
     Example: Rijndael/AES [Daemen Rijmen 1998].
     [Diagram: an SPN alternating round-key additions (K0, K1, K2), S-box layers (S), and linear layers (L).]

  3. Block Cipher Security Analysis (3/18)
     Differential attacks [Biham Shamir 91]:
     - The attacker exploits a pair (a, b) such that E_K(x) ⊕ E_K(x ⊕ a) = b with high probability.
     - The maximum of this probability over all (a, b) is bounded by (δ(S) / 2^n)^(B_d(L) − 1).
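The δ(S) in the bound above counts the worst-case differential transition of the S-box. As a minimal sketch, the following computes it for a toy 3-bit S-box S(x) = x³ over GF(8); this S-box is an illustrative choice for the example, not one used by any cipher in the talk.

```python
# Compute delta(S) = max over a != 0, b of #{x : S(x) ^ S(x ^ a) = b}
# for a toy 3-bit S-box S(x) = x^3 over GF(8) = F_2[x]/(x^3 + x + 1).

def gf8_mul(a, b, poly=0b1011):
    """Carry-less multiplication reduced modulo x^3 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b1000:
            a ^= poly
        b >>= 1
    return r

# S(x) = x^3 is a permutation of GF(8) since gcd(3, 2^3 - 1) = 1.
S = [gf8_mul(x, gf8_mul(x, x)) for x in range(8)]

def delta(S):
    """Maximum count of any differential transition (a, b), a != 0."""
    n = len(S)
    best = 0
    for a in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[S[x] ^ S[x ^ a]] += 1
        best = max(best, max(counts))
    return best

print(delta(S))   # 2: this S-box is APN, the best possible over F_2
```

A smaller δ(S), combined with a larger branch number B_d(L), tightens the bound on differential probabilities.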

  4-6. MDS Matrices (4/18)
     Differential branch number: B_d(L) = min_{x ≠ 0} { w(x) + w(L(x)) }, where w(x) is the number of non-zero n-bit words in x.
     Linear branch number: B_l(L) = min_{x ≠ 0} { w(x) + w(L^T(x)) }, for L a linear permutation on k words of n bits.
     Maximum branch number: k + 1. Matrices reaching it are equivalent to MDS codes.
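As a concrete check of the definition, the following sketch brute-forces the differential branch number of a small matrix over GF(4). The 3 × 3 matrix is a hypothetical example chosen for illustration, not one from the talk.

```python
# Brute-force the differential branch number B_d(L) = min_{x != 0}
# { w(x) + w(L(x)) } for a 3x3 matrix over GF(4) = F_2[x]/(x^2 + x + 1).

def gf4_mul(a, b):
    """Multiplication in GF(4), elements as 2-bit integers."""
    r = 0
    for i in range(2):
        if (b >> i) & 1:
            r ^= a << i
    if r & 0b100:              # reduce modulo x^2 + x + 1
        r ^= 0b111
    return r

def apply_matrix(M, x):
    """Matrix-vector product over GF(4); XOR is field addition."""
    return [eval_row(row, x) for row in M]

def eval_row(row, x):
    acc = 0
    for c, xi in zip(row, x):
        acc ^= gf4_mul(c, xi)
    return acc

def weight(v):
    """Number of non-zero words."""
    return sum(1 for w in v if w != 0)

def branch_number(M):
    k = len(M)
    best = 2 * k + 1
    for idx in range(1, 4 ** k):               # all non-zero inputs
        x = [(idx >> (2 * i)) & 3 for i in range(k)]
        best = min(best, weight(x) + weight(apply_matrix(M, x)))
    return best

M = [[1, 1, 2],
     [1, 2, 1],
     [2, 1, 1]]
print(branch_number(M))   # 4 = k + 1, so this M is MDS over GF(4)
```

Reaching k + 1 = 4 here is exactly the MDS condition from the slide.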

  7. Matrices and Characterisation (5/18)
     Usually defined over finite fields: coefficients in F_2[x]/P, with P a primitive polynomial and x a primitive element of F_{2^n}.
     AES MixColumns:
       [2 3 1 1]
       [1 2 3 1]
       [1 1 2 3]
       [3 1 1 2]
     with 2 ↔ x and 3 ↔ x + 1.
     Characterisation: L is MDS iff all its minors are non-zero.
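The minor-based characterisation can be checked directly. The sketch below tests AES MixColumns by computing every minor over GF(2^8) (the AES field, reduction polynomial x^8 + x^4 + x^3 + x + 1); the helper names are my own.

```python
# MDS test via the characterisation: L is MDS iff all minors are non-zero.
from itertools import combinations

def gf_mul(a, b, poly=0x11B):
    """Multiplication in GF(2^8) with the AES reduction polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def det(M):
    """Determinant over GF(2^8) by cofactor expansion (char. 2: no signs)."""
    if len(M) == 1:
        return M[0][0]
    d = 0
    for j, c in enumerate(M[0]):
        sub = [row[:j] + row[j + 1:] for row in M[1:]]
        d ^= gf_mul(c, det(sub))
    return d

def is_mds(M):
    k = len(M)
    for size in range(1, k + 1):
        for rows in combinations(range(k), size):
            for cols in combinations(range(k), size):
                minor = [[M[r][c] for c in cols] for r in rows]
                if det(minor) == 0:
                    return False
    return True

mix_columns = [[2, 3, 1, 1],
               [1, 2, 3, 1],
               [1, 1, 2, 3],
               [3, 1, 1, 2]]
print(is_mds(mix_columns))   # True
```

Note that a single zero entry (a 1 × 1 minor) already disqualifies a matrix, which is why the identity matrix fails immediately.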

  8. Previous Works (6/18)
     Recursive matrices [Guo et al. 2011]: find a lightweight matrix A such that A^i is MDS; implement A, then iterate it i times.
     Optimizing coefficients:
     - Structured matrices: restrict to a small subspace containing many MDS matrices.
     - More general than finite fields: inputs are binary vectors and matrix coefficients are n × n binary matrices, allowing operations less costly than multiplication in a finite field.

  9. Cost Evaluation (7/18)
     "Real cost": the number of operations in the best implementation.
     Xor count (naive cost): the Hamming weight of the binary matrix; intermediate values cannot be reused.
     Allowing intermediate values:
     - Local optimisation (Lighter [Jean et al. 2017]): cost of a matrix multiplication = number of XORs + cost of the multiplication by each coefficient.
     - Global optimisation:
       - Hardware synthesis: straight-line programs [Kranz et al. 2018], with heuristics to implement binary matrices.
       - Our approach: the number of operations in the best implementation using operations on words.

  10. Metrics Comparison (8/18)
     Example: multiply (x0, x1, x2) by
       [3 2 2]
       [2 3 2]
       [2 2 3]
     Xor count: 6 XORs, 6 multiplications by 2, 3 multiplications by 3.
     Our approach: 5 XORs, 1 multiplication by 2.
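The "5 XORs + 1 multiplication by 2" count comes from sharing an intermediate value: since 3·x = 2·x ⊕ x, every output row equals x_i ⊕ 2·(x0 ⊕ x1 ⊕ x2), so the word-wise sum and its doubling are computed once and reused. A minimal sketch, instantiated in GF(2^4) with the reduction polynomial x^4 + x + 1 (an arbitrary choice for illustration):

```python
# Shared-intermediate implementation of the circulant (3,2,2) matrix
# versus the naive row-by-row evaluation.

def xtime(a, poly=0b10011, n=4):
    """Multiplication by 2 (i.e. by x) in GF(2^n)."""
    a <<= 1
    if a >> n:
        a ^= poly
    return a & ((1 << n) - 1)

def circulant_322_fast(x0, x1, x2):
    """5 XORs and a single xtime, reusing the word-wise sum."""
    t = xtime(x0 ^ x1 ^ x2)         # 2 XORs + 1 multiplication by 2
    return x0 ^ t, x1 ^ t, x2 ^ t   # 3 XORs

def circulant_322_naive(x0, x1, x2):
    """Row-by-row: 6 XORs, 6 mults by 2, 3 mults by 3."""
    def m3(a):                       # 3*a = 2*a ^ a
        return xtime(a) ^ a
    return (m3(x0) ^ xtime(x1) ^ xtime(x2),
            xtime(x0) ^ m3(x1) ^ xtime(x2),
            xtime(x0) ^ xtime(x1) ^ m3(x2))

# Both implementations agree on every input in GF(2^4)^3:
assert all(circulant_322_fast(a, b, c) == circulant_322_naive(a, b, c)
           for a in range(16) for b in range(16) for c in range(16))
print("fast and naive circuits agree")
```

The fast circuit is exactly the kind of global optimisation that a per-coefficient xor count cannot see.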

  11-12. Formal Matrices (9/18)
     Optimise in 2 steps:
     1. Find M(α) for α an undefined linear mapping (not necessarily a finite field element); the coefficients are then polynomials in α.
     2. Instantiate with the best choice of α.
     Example:
       [α+1  α    α  ]
       [α    α+1  α  ]
       [α    α    α+1]
     Characterisation of formally MDS matrices:
     - Objective: find M(α) such that there exists A with M(A) MDS.
     - If a minor of M(α) is the zero polynomial, this is impossible.
     - Otherwise, a suitable A always exists, so the characterisation can be carried out on M(α) itself.

  13. Search Space (10/18)
     Search over circuits, using only word-wise operations:
     - word-wise XOR
     - α (a generalization of multiplication)
     - copy
     r registers: one register per word (3 for a 3 × 3 matrix), plus at least one extra register to allow more complex operations.

  14. Implementation: Main Idea (11/18)
     Tree-based Dijkstra search:
     - Node = matrix = sequence of operations.
     - Lightest circuit = shortest path to an MDS matrix.
     - Each spawned node is tested for the MDS property.
     Search results:
     - k = 3: fast (seconds).
     - k = 4: long (hours).
     - k = 5: out of reach.
     - Yields a collection of MDS matrices with trade-offs between cost and depth (latency).
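To make the search idea concrete, here is a toy version for k = 2 words and one extra register: a unit-cost BFS (Dijkstra with all operations weighing 1) over register states, where each register holds a row of formal polynomials in α, and the goal is any state whose first k registers form a formally MDS matrix. This is an illustrative sketch with simplified assumptions (unit costs, outputs fixed to the first k registers), not the authors' implementation.

```python
# Toy tree search for a lightweight formally-MDS 2x2 circuit.
from collections import deque

def pmul(a, b):
    """Carry-less multiplication in F_2[alpha]."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a, b = a << 1, b >> 1
    return r

def is_mds2(r0, r1):
    """Formal MDS test for 2x2: all entries and the determinant non-zero."""
    det = pmul(r0[0], r1[1]) ^ pmul(r0[1], r1[0])
    return all(r0) and all(r1) and det != 0

def neighbours(state, n):
    """Apply each word-wise operation: XOR, copy, multiplication by alpha."""
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            xor = list(state)                      # r_i ^= r_j
            xor[i] = tuple(a ^ b for a, b in zip(state[i], state[j]))
            yield tuple(xor), f"r{i} ^= r{j}"
            cp = list(state)                       # r_i = r_j
            cp[i] = state[j]
            yield tuple(cp), f"r{i} = r{j}"
        sc = list(state)                           # r_i = alpha * r_i
        sc[i] = tuple(pmul(0b10, c) for c in state[i])
        yield tuple(sc), f"r{i} *= a"

def search(k=2, extra=1):
    """BFS: shortest operation sequence reaching a formally MDS state."""
    n = k + extra
    start = tuple(tuple(int(i == j) for j in range(k)) for i in range(n))
    seen = {start}
    queue = deque([(start, [])])
    while queue:
        state, prog = queue.popleft()
        if is_mds2(state[0], state[1]):
            return prog
        for nxt, op in neighbours(state, n):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, prog + [op]))
    return None

prog = search()
print(len(prog), prog)   # a shortest circuit: 3 operations
```

Even in this toy setting the state space explodes quickly with k, which is why the full search needs the A* guidance described next.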

  15. Scheme of the Search (12/18)
     [Diagram: the search tree; each node holds register states over the words x0, ..., x3, branching on α and XOR operations.]

  16-21. Optimization: A* (13/18)
     Idea of A*: a guided Dijkstra, where weight = weight from the origin + estimated weight to the objective.
     Our estimate, a heuristic for how far a matrix is from MDS:
     - A column containing a 0 cannot be part of an MDS matrix.
     - Linearly dependent columns cannot be part of an MDS matrix.
     - Estimate: let m be the rank of the matrix without the columns containing a 0; at least k − m word-wise XORs are needed to reach an MDS matrix.
     Result: much faster.
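The heuristic h(M) = k − m can be sketched directly: discard every column containing a 0, compute the rank m of what remains, and return k − m. The version below instantiates α = 2 in GF(4) for concreteness; the matrices and helper names are illustrative assumptions.

```python
# A* heuristic sketch: lower bound on the word-wise XORs still needed.

GF4_MUL = [[0, 0, 0, 0],       # multiplication table of GF(4)
           [0, 1, 2, 3],
           [0, 2, 3, 1],
           [0, 3, 1, 2]]
GF4_INV = {1: 1, 2: 3, 3: 2}

def rank_gf4(cols):
    """Rank of a set of column vectors over GF(4), by elimination."""
    cols = [list(c) for c in cols]
    n = len(cols[0]) if cols else 0
    rank = 0
    for row in range(n):
        piv = next((c for c in cols if c[row] != 0), None)
        if piv is None:
            continue
        cols.remove(piv)
        inv = GF4_INV[piv[row]]
        piv = [GF4_MUL[inv][v] for v in piv]       # normalize pivot
        for c in cols:
            f = c[row]
            for i in range(n):
                c[i] ^= GF4_MUL[f][piv[i]]          # eliminate
        rank += 1
    return rank

def heuristic(M):
    """h(M) = k - rank of M after dropping columns that contain a 0."""
    k = len(M)
    cols = [[M[r][c] for r in range(k)] for c in range(len(M[0]))]
    cols = [c for c in cols if all(c)]              # a 0 disqualifies a column
    return k - rank_gf4(cols)

# The identity (start state): every column contains a 0, so h = k.
print(heuristic([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # 3
```

Because the estimate never overstates the remaining work, it guides the search without discarding optimal circuits.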

  22. Methodology of the Instantiation (14/18)
     The idea:
     1. Input: a formal matrix M(α) that is MDS.
     2. Output: M(A) MDS, with A a linear mapping (the lightest we can find).

  23. Characterisation of MDS Instantiations (15/18)
     MDS test, intuitive approach:
     - Choose a linear mapping A.
     - Evaluate M(A).
     - Check that all minors are non-singular.
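A sketch of this intuitive test: pick a concrete binary matrix A for α, evaluate each formal minor of M(α) at A, and check that every result is an invertible matrix over F_2. Here A is the 2 × 2 companion matrix of x² + x + 1 (a cheap choice, one XOR to implement), applied to the 3 × 3 formal matrix from the earlier slide; all names are my own.

```python
# Instantiation test: minors of M(alpha), evaluated at a binary matrix A,
# must all be non-singular over F_2.
from itertools import combinations

def pmul(a, b):                      # carry-less mult in F_2[alpha]
    r = 0
    while b:
        if b & 1:
            r ^= a
        a, b = a << 1, b >> 1
    return r

def pdet(M):                         # determinant over F_2[alpha] (char. 2)
    if len(M) == 1:
        return M[0][0]
    d = 0
    for j, c in enumerate(M[0]):
        d ^= pmul(c, pdet([row[:j] + row[j + 1:] for row in M[1:]]))
    return d

def mat_mul(X, Y):
    n = len(X)
    return tuple(tuple(sum(X[i][t] & Y[t][j] for t in range(n)) & 1
                       for j in range(n)) for i in range(n))

def mat_add(X, Y):
    return tuple(tuple(x ^ y for x, y in zip(rx, ry))
                 for rx, ry in zip(X, Y))

def evaluate(poly, A):
    """Evaluate a polynomial in alpha (bitmask over F_2) at the matrix A."""
    n = len(A)
    power = tuple(tuple(int(i == j) for j in range(n)) for i in range(n))
    result = tuple(tuple(0 for _ in range(n)) for _ in range(n))
    i = 0
    while poly >> i:
        if (poly >> i) & 1:
            result = mat_add(result, power)
        power = mat_mul(power, A)
        i += 1
    return result

def invertible_2x2(M):               # A is 2x2 here, so minors are 2x2
    return (M[0][0] & M[1][1]) ^ (M[0][1] & M[1][0]) == 1

def instantiation_is_mds(M_formal, A):
    k = len(M_formal)
    for s in range(1, k + 1):
        for rows in combinations(range(k), s):
            for cols in combinations(range(k), s):
                minor = pdet([[M_formal[r][c] for c in cols] for r in rows])
                if not invertible_2x2(evaluate(minor, A)):
                    return False
    return True

ALPHA, ONE = 0b10, 0b01
M = [[ALPHA ^ ONE, ALPHA, ALPHA],
     [ALPHA, ALPHA ^ ONE, ALPHA],
     [ALPHA, ALPHA, ALPHA ^ ONE]]
A = ((0, 1), (1, 1))                 # companion matrix of x^2 + x + 1
print(instantiation_is_mds(M, A))    # True
```

Evaluating the formal minors, rather than expanding M(A) into one large binary matrix, keeps the test at the word level.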
