Computing Information Flow Using Symbolic Model-Checking
Rohit Chadha (University of Missouri, Columbia, Missouri, USA), Umang Mathur (Indian Institute of Technology Bombay, Mumbai), Stefan Schwoon (LSV, ENS Cachan, France)
December 17, 2014


  1. Computing Information Flow Using Symbolic Model-Checking. Rohit Chadha, Umang Mathur, Stefan Schwoon. December 17, 2014

  2. Outline
◮ Introduction
◮ Preliminaries
◮ Summary Calculation
◮ Computing Information Leakage: Symbolic Algorithms
◮ Moped-QLeak Demo
◮ Conclusions and Future Work
◮ Thank You

  3-6. Introduction
◮ Quantifying information leakage: inferring information about inputs by observing public outputs
◮ No leakage ⇒ outputs independent of inputs
◮ Full leakage ⇒ unique input corresponding to a given output
◮ Comparing leakage across programs: less leakage is desirable

  7-10. Measuring Information Leakage
Several metrics: min-entropy, Shannon entropy, etc.
1. Min-entropy leakage measures the vulnerability of the secret input to being guessed correctly in a single attempt by the adversary:
   ME_U(P) = log Σ_{o ∈ O} max_{s ∈ S} μ(O = o | S = s)
2. Shannon-entropy leakage measures the expected number of guesses required to correctly guess the secret input:
   SE_U(P) = log|S| − (1/|S|) Σ_{o ∈ O} |P⁻¹(o)| log|P⁻¹(o)|
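For a deterministic program with a uniformly distributed input, both quantities can be computed directly from the preimage sizes |P⁻¹(o)|. The sketch below is our own illustration of the two definitions (the function name and the uniform-input assumption are not from the talk); for such programs the min-entropy sum collapses to the number of reachable outputs.

    import math
    from collections import Counter

    def leakage_deterministic(program, inputs):
        """Min-entropy and Shannon-entropy leakage of a deterministic `program`
        under a uniform distribution on the finite iterable `inputs`."""
        inputs = list(inputs)
        n = len(inputs)
        # |P^-1(o)| for every reachable output o
        preimage = Counter(program(s) for s in inputs)

        # Min-entropy: for deterministic P, sum_o max_s mu(o|s) = #reachable outputs
        me = math.log2(len(preimage))

        # Shannon: log|S| - (1/|S|) * sum_o |P^-1(o)| * log|P^-1(o)|
        se = math.log2(n) - sum(c * math.log2(c) for c in preimage.values()) / n
        return me, se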

  11-15. Example
Consider the following example:

    def example(input):
        output = input % 8
        return output

What would be the information leaked by the above program
◮ using min-entropy?
◮ using Shannon entropy?
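As a check with the sketch above, assume for instance a 5-bit unsigned input, i.e. 32 equally likely values (the input range is not fixed by the slide). Every output o ∈ {0, ..., 7} then has a preimage of size 4, so the min-entropy leakage is log 8 = 3 bits and the Shannon leakage is log 32 − log 4 = 3 bits; leakage_deterministic(lambda s: s % 8, range(32)) returns (3.0, 3.0).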

  16-22. Dining Cryptographers
◮ Cryptographers A, B and C dine out
◮ Payment is made by
  ◮ one of A, B or C, or
  ◮ the NSA
◮ Determine whether the NSA paid or not, without revealing information about the cryptographers

  23-30. Dining Cryptographers: Protocol
Two-stage protocol:
1. Every two cryptographers establish a shared one-bit secret by tossing a coin
2. Each cryptographer publicly announces a bit, which is
  ◮ the XOR of their shared bits, if they did not pay
  ◮ ¬(XOR of their shared bits), otherwise
[Figure: Stage 1 (left), the shared coin tosses (bits 1, 0, 1); Stage 2 (right), the announcements ¬XOR(0, 1) = 0, XOR(1, 1) = 0, XOR(0, 1) = 1]
XOR(Announcement_A, Announcement_B, Announcement_C) = 0 iff the NSA paid for the dinner
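A minimal simulation of the three-cryptographer protocol (our own illustration, not part of the talk's tooling), checking that the XOR of the three announcements is 0 exactly when the NSA paid:

    import random

    def dining_cryptographers(payer):
        """payer is 'A', 'B', 'C' or 'NSA'; returns the XOR of the announcements."""
        names = ['A', 'B', 'C']
        # Stage 1: each pair of cryptographers shares a secret coin toss.
        coin = {frozenset(pair): random.randint(0, 1)
                for pair in [('A', 'B'), ('B', 'C'), ('C', 'A')]}
        result = 0
        for i, name in enumerate(names):
            left, right = names[i - 1], names[(i + 1) % 3]
            # Stage 2: announce the XOR of the two shared bits, flipped if you paid.
            bit = coin[frozenset((left, name))] ^ coin[frozenset((name, right))]
            if name == payer:
                bit ^= 1
            result ^= bit
        return result

    # 0 iff the NSA paid; each shared coin cancels because it is XORed in twice.
    assert all(dining_cryptographers('NSA') == 0 for _ in range(100))
    assert all(dining_cryptographers('A') == 1 for _ in range(100))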

  31-37. Probabilistic Boolean Programs
◮ Global variables G: input and output
◮ Local variables: internal calculations
◮ Program statements: transform global and local variables
◮ For a program P, F_P : 2^G → 2^G ∪ {⊥}
◮ F_P(ḡ₀) = ⊥ iff P does not terminate
◮ Summary: joint probability distribution μ
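As a concrete, explicit (non-symbolic) picture of what a summary is, the snippet below writes out μ for a one-variable program that flips its global bit with probability 1/2; the uniform prior on the input and the dictionary encoding are our assumptions for illustration only.

    # Summary of a tiny probabilistic boolean program over one global bit x:
    #     with probability 0.5: x = !x, else: skip
    # encoded explicitly as mu[(input valuation, output valuation)] = probability,
    # assuming a uniform prior on the input bit.
    mu = {}
    for x_in in (0, 1):
        p_in = 0.5                            # uniform prior on x
        mu[(x_in, 1 - x_in)] = p_in * 0.5     # branch that flips x
        mu[(x_in, x_in)] = p_in * 0.5         # branch that keeps x
    print(mu)   # {(0, 1): 0.25, (0, 0): 0.25, (1, 0): 0.25, (1, 1): 0.25}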

  38-43. Algebraic Decision Diagrams
◮ Set of variables V
◮ Algebraic set M (M = [0, 1] for probabilistic statements; M = {0, 1} gives BDDs)
◮ ADD : 2^V → M
◮ Efficient reduced representations, similar to BDDs
[Figure: an ADD over the variables x, y, z with terminal values 0, 1 and 0.5 (top) and its reduced form (bottom)]
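The reduction works as for BDDs: structurally identical sub-diagrams are shared and redundant tests are dropped. A small sketch of that idea (our own illustration with an assumed fixed variable order, not the ADD library used by the tool):

    # Reduced-ADD sketch: a node is a terminal value from M or a (var, low, high)
    # test. A unique table shares identical nodes; tests with low == high are dropped.
    class ADDBuilder:
        def __init__(self):
            self.unique = {}                 # node description -> node id
            self.nodes = []                  # node id -> node description

        def _intern(self, desc):
            if desc not in self.unique:
                self.unique[desc] = len(self.nodes)
                self.nodes.append(desc)
            return self.unique[desc]

        def terminal(self, value):
            return self._intern(('term', value))

        def node(self, var, low, high):
            if low == high:                  # redundant test: both branches equal
                return low
            return self._intern(('node', var, low, high))

    b = ADDBuilder()
    zero, one, half = b.terminal(0.0), b.terminal(1.0), b.terminal(0.5)
    z = b.node('z', zero, one)       # shared wherever it is reused below
    y = b.node('y', z, z)            # redundant test: returns z itself
    root = b.node('x', z, half)
    print(len(b.nodes))              # only 5 distinct nodes are ever stored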

  44-48. Computing Summaries: Fixed Point Iteration
◮ Each program statement l is mapped to a distribution μ_l
◮ Can be represented efficiently as MTBDDs
[Figure: MTBDD for the statement x = !x, relating the variable x to its primed copy x']
◮ Compose statements
◮ Arrive at a fixed point (summary μ)
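A rough, explicit illustration of the compose-and-iterate idea (dense matrices over the 2^|V| valuations instead of MTBDDs, and a numerical stopping criterion; both are simplifications we introduce): sequencing two statements is matrix multiplication, and the summary is the point where further composition no longer changes the result.

    import numpy as np

    # Each statement is a transition matrix over the 2^|V| valuations of the
    # boolean variables (here one variable x, so valuations 0 and 1). In the
    # tool these are MTBDDs/ADDs; dense arrays are used only to illustrate.
    flip = np.array([[0.0, 1.0],      # x = !x
                     [1.0, 0.0]])
    coin = np.array([[0.5, 0.5],      # x = random bit (probability 1/2 each)
                     [0.5, 0.5]])

    def compose(*stmts):
        """Summary of the statements executed in sequence."""
        summary = np.eye(len(stmts[0]))
        for m in stmts:
            summary = summary @ m
        return summary

    def iterate_to_fixed_point(body, tol=1e-12):
        """Keep composing `body` until the summary stops changing."""
        summary = np.eye(len(body))
        while True:
            nxt = summary @ body
            if np.max(np.abs(nxt - summary)) < tol:
                return nxt
            summary = nxt

    print(compose(flip, coin))            # summary of 'x = !x; x = random bit'
    print(iterate_to_fixed_point(coin))   # converges after one step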

  49-51. Min Entropy: Symbolic Algorithm
For a program P, with
◮ input set S (uniform distribution),
◮ output set O, and,
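For reference, the min-entropy leakage can be read off a joint summary μ as the log of the ratio of posterior to prior vulnerability. The sketch below is our explicit illustration of that quantity only; it is not the talk's symbolic algorithm, which operates on the ADD representation of μ.

    import math

    def min_entropy_leakage(mu):
        """Min-entropy leakage from an explicit joint summary mu[(s, o)] = Pr[S = s, O = o]."""
        inputs = {s for s, _ in mu}
        outputs = {o for _, o in mu}
        prior_vuln = max(sum(mu.get((s, o), 0.0) for o in outputs) for s in inputs)
        post_vuln = sum(max(mu.get((s, o), 0.0) for s in inputs) for o in outputs)
        return math.log2(post_vuln / prior_vuln)

    # Uniform 2-bit secret, output = secret mod 2: exactly one bit is leaked.
    mu = {(s, s % 2): 0.25 for s in range(4)}
    print(min_entropy_leakage(mu))   # 1.0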
