Probabilistic couplings for cryptography and privacy
  1. Probabilistic couplings for cryptography and privacy
Gilles Barthe, IMDEA Software Institute, Madrid, Spain
September 13, 2016

  2. Relational properties
Properties about two runs of the same program:
- Assume the inputs are related by Ψ
- Want to prove the outputs are related by Φ

  5. Examples
Monotonicity
- Ψ: in1 ≤ in2
- Φ: out1 ≤ out2
- “Bigger inputs give bigger outputs”
Stability
- Ψ: inp1 ∼ inp2
- Φ: out1 ∼ out2
- “If inputs are similar, then outputs are similar”
Non-interference
- Ψ: lowinp1 = lowinp2
- Φ: lowout1 = lowout2
- “If low inputs are equal, then low outputs are equal”
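As a concrete illustration, monotonicity can be checked as a property of two runs: pick input pairs related by Ψ and compare the outputs. A minimal sketch, where `prog` is a made-up stand-in for "the program" (not anything from the talk):

```python
# Hypothetical program under test; any deterministic numeric function works.
def prog(x: int) -> int:
    return 3 * x + 1

# Monotonicity as a relational property: whenever the inputs satisfy
# Psi (in1 <= in2), the two runs' outputs must satisfy Phi (out1 <= out2).
def check_monotone(pairs):
    return all(prog(in1) <= prog(in2) for (in1, in2) in pairs if in1 <= in2)

related_inputs = [(0, 1), (-5, 2), (3, 3)]
monotone = check_monotone(related_inputs)
```

This only tests the property on the listed pairs; the logics later in the talk prove it for all related inputs.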

  9. Probabilistic relational properties
Monotonicity
- Ψ: in1 ≤ in2
- Φ: Pr[out1 ≥ k] ≤ Pr[out2 ≥ k]
Stability
- Ψ: in1 ∼ in2
- Φ: Pr[out1 = k] ∼ Pr[out2 = k]
Non-interference
- Ψ: lowinp1 = lowinp2
- Φ: Pr[lowout1 = k] = Pr[lowout2 = k]
Richer properties
- Indistinguishability, differential privacy

  11. Probabilistic couplings
- Used by mathematicians for proving relational properties
- Applications: Markov chains, probabilistic processes
Idea
- Place the two processes in the same probability space
- Coordinate the sampling
Why is this interesting?
- Proving relational probabilistic properties reduces to proving non-relational, non-probabilistic properties
- Compositional

  14. Introducing probabilistic couplings
Basic ingredients
- Given: two distributions X1, X2 over a set A
- Produce: a joint distribution Y over A × A
- Projection onto the first component is X1
- Projection onto the second component is X2
Definition. Given two distributions X1, X2 over a set A, a coupling Y is a distribution over A × A such that π1(Y) = X1 and π2(Y) = X2, where
  π1(Y)(a1) = Σ_{a2} Y(a1, a2)
(and symmetrically for π2).
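For finite distributions the definition is directly executable. A minimal sketch, representing distributions as dicts from outcomes to probabilities and a candidate coupling as a dict over pairs (all names here are illustrative):

```python
from collections import defaultdict

def pi1(Y):
    """First projection: pi1(Y)(a1) = sum over a2 of Y(a1, a2)."""
    m = defaultdict(float)
    for (a1, _a2), p in Y.items():
        m[a1] += p
    return dict(m)

def pi2(Y):
    """Second projection: pi2(Y)(a2) = sum over a1 of Y(a1, a2)."""
    m = defaultdict(float)
    for (_a1, a2), p in Y.items():
        m[a2] += p
    return dict(m)

def is_coupling(Y, X1, X2, tol=1e-9):
    """Check pi1(Y) = X1 and pi2(Y) = X2 up to floating-point tolerance."""
    def close(d1, d2):
        keys = set(d1) | set(d2)
        return all(abs(d1.get(k, 0.0) - d2.get(k, 0.0)) < tol for k in keys)
    return close(pi1(Y), X1) and close(pi2(Y), X2)

fair = {0: 0.5, 1: 0.5}
diagonal = {(0, 0): 0.5, (1, 1): 0.5}   # the "x1 = x2" coupling of two fair coins
coupled = is_coupling(diagonal, fair, fair)
```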

  15. Fair coin toss
- One way to coordinate: require x1 = x2
- A different way: require x1 = ¬x2
- Yet another way: the product distribution
- The choice of coupling depends on the application
- Couplings always exist
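The three couplings of a fair coin mentioned above can be written out explicitly and checked to have the right marginals; a small sketch:

```python
fair = {0: 0.5, 1: 0.5}

identity = {(0, 0): 0.5, (1, 1): 0.5}                      # require x1 = x2
negation = {(0, 1): 0.5, (1, 0): 0.5}                      # require x1 = ¬x2
product = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}   # sample independently

def marginals(Y):
    """Return (left marginal, right marginal) of a joint distribution."""
    left, right = {}, {}
    for (a1, a2), p in Y.items():
        left[a1] = left.get(a1, 0.0) + p
        right[a2] = right.get(a2, 0.0) + p
    return left, right

# All three joints have the fair coin as both marginals, so all three
# are couplings of the fair coin with itself.
all_are_couplings = all(
    marginals(Y) == (fair, fair) for Y in (identity, negation, product)
)
```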

  16. Couplings vs liftings
Let µ1, µ2 ∈ Distr(A), µ ∈ Distr(A × A), and R ⊆ A × A. Then
  µ ◭_R ⟨µ1 & µ2⟩ ≜ π1(µ) = µ1 ∧ π2(µ) = µ2 ∧ Pr_{y←µ}[y ∈ R] = 1
Different couplings yield liftings for different relations.

  18. Convergence of random walks
Simple random walk on the integers
- Start at some position p
- Each step, flip a fair coin: x ←$ flip
- Heads (probability 1/2): p ← p + 1
- Tails (probability 1/2): p ← p − 1

  21. Coupling the walks to meet
Case p1 = p2: the walks have met
- Arrange the samplings so that x1 = x2
- Continue to have p1 = p2
Case p1 ≠ p2: the walks have not met
- Arrange the samplings so that x1 = ¬x2
- The walks make mirror moves
Under this coupling, once the walks meet, they move together.
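The mirror coupling can be simulated directly. A sketch (variable names are illustrative): share the coin once the walks have met, negate it otherwise, and observe that the walks never separate after meeting and that the even gap stays even:

```python
import random

def coupled_step(p1, p2, rng):
    """One coupled step of the two walks."""
    x1 = rng.random() < 0.5              # walk 1 flips a fair coin
    x2 = x1 if p1 == p2 else (not x1)    # met: share the coin; else mirror it
    p1 += 1 if x1 else -1
    p2 += 1 if x2 else -1
    return p1, p2

rng = random.Random(7)
p1, p2 = 0, 4        # an even gap, which mirror moves can close
met = False
invariant_held = True
for _ in range(50_000):
    if p1 == p2:
        met = True
    elif met:
        invariant_held = False   # would mean the walks separated after meeting
    p1, p2 = coupled_step(p1, p2, rng)

# Shared steps leave the gap unchanged; mirror steps change it by ±2,
# so the gap's parity is preserved throughout.
gap_is_even = (p1 - p2) % 2 == 0
```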

  24. Why is this interesting?
Memorylessness: the positions converge as we take more steps.
Coupling bounds the distance between distributions
- Once the walks meet, they stay equal
- The distance is at most the probability that the walks don't meet
Theorem. If Y is a coupling of two distributions (X1, X2), then
  ‖X1 − X2‖_TV ≜ (1/2) Σ_{a∈A} |X1(a) − X2(a)| ≤ Pr_{(y1,y2)∼Y}[y1 ≠ y2].
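On finite distributions the theorem can be illustrated numerically. The sketch below builds a greedy "maximal" coupling (a standard construction, not from the talk): it puts as much mass as possible on the diagonal, which makes the bound tight; any other coupling only gives a larger mismatch probability.

```python
def tv(X1, X2):
    """Total variation distance between two finite distributions (dicts)."""
    keys = set(X1) | set(X2)
    return 0.5 * sum(abs(X1.get(k, 0.0) - X2.get(k, 0.0)) for k in keys)

def maximal_coupling(X1, X2):
    """Greedy maximal coupling: shared mass on the diagonal, rest paired off."""
    keys = sorted(set(X1) | set(X2))
    Y = {}
    r1 = {k: X1.get(k, 0.0) for k in keys}   # leftover mass after the diagonal
    r2 = {k: X2.get(k, 0.0) for k in keys}
    for k in keys:                           # shared mass goes on the diagonal
        m = min(r1[k], r2[k])
        if m > 0:
            Y[(k, k)] = m
            r1[k] -= m
            r2[k] -= m
    left = [[k, p] for k, p in r1.items() if p > 1e-12]
    right = [[k, p] for k, p in r2.items() if p > 1e-12]
    i = j = 0
    while i < len(left) and j < len(right):  # pair leftovers off the diagonal
        m = min(left[i][1], right[j][1])
        key = (left[i][0], right[j][0])
        Y[key] = Y.get(key, 0.0) + m
        left[i][1] -= m
        right[j][1] -= m
        if left[i][1] <= 1e-12:
            i += 1
        if right[j][1] <= 1e-12:
            j += 1
    return Y

X1 = {0: 0.5, 1: 0.5}   # fair coin
X2 = {0: 0.3, 1: 0.7}   # biased coin
Y = maximal_coupling(X1, X2)
mismatch = sum(p for (a, b), p in Y.items() if a != b)
```

Here `mismatch` equals the total variation distance 0.2, witnessing that the inequality in the theorem can be an equality.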


  26. probabilistic Relational Hoare Logic (pRHL)
⊢ {P} c1 ∼ c2 {Q} iff for all memories m1, m2 there exists µ such that
  P(m1 ⊎ m2) ⇒ µ ◭_Q ⟨⟦c1⟧ m1 & ⟦c2⟧ m2⟩
where
  µ ◭_R ⟨µ1 & µ2⟩ ≜ π1(µ) = µ1 ∧ π2(µ) = µ2 ∧ supp(µ) ⊆ R
Fundamental lemma of pRHL. If Q ⊨ E1 ⇒ E2, then
  Pr_{⟦c1⟧ m1}[E1] ≤ Pr_{⟦c2⟧ m2}[E2]

  27. Core rules
Sequence:
  {Φ} c1 ∼ c2 {Θ}    {Θ} c1' ∼ c2' {Ψ}
  ─────────────────────────────────────
       {Φ} c1; c1' ∼ c2; c2' {Ψ}
Conditional:
  {Φ ∧ b1 ∧ b2} c1 ∼ c2 {Ψ}    {Φ ∧ ¬b1 ∧ ¬b2} c1' ∼ c2' {Ψ}
  ─────────────────────────────────────────────────────────────────
  {Φ ∧ b1 = b2} if b1 then c1 else c1' ∼ if b2 then c2 else c2' {Ψ}
While:
        {Φ ∧ b1 ∧ b2} c1 ∼ c2 {Φ ∧ b1 = b2}
  ─────────────────────────────────────────────────────────────
  {Φ ∧ b1 = b2} while b1 do c1 ∼ while b2 do c2 {Φ ∧ ¬b1 ∧ ¬b2}

  28. Loops
- Benton: same number of iterations
- EasyCrypt (≤ 2015): one-sided rules
- EasyCrypt (2016): asynchronous loop rule ⇒ relatively complete, subsumes one-sided rules
Asynchronous loop rule:
  Ψ ⇒ p0 ⊕ p1 ⊕ p2    Ψ ∧ p0 ⇒ e1 ∧ e2    Ψ ∧ p1 ⇒ e1    Ψ ∧ p2 ⇒ e2
  while e1 ∧ p1 do c1 ⇓    while e2 ∧ p2 do c2 ⇓
  {Ψ ∧ p1} c1 ∼ skip {Ψ}    {Ψ ∧ p2} skip ∼ c2 {Ψ}    {Ψ ∧ p0} c1 ∼ c2 {Ψ}
  ─────────────────────────────────────────────────────
  {Ψ} while e1 do c1 ∼ while e2 do c2 {Ψ ∧ ¬e1 ∧ ¬e2}
Example (the loops run out of step, yet compute the same value):
  x ← 0; i ← 0; while i ≤ N do (x += i; i++)
  y ← 0; j ← 1; while j ≤ N do (y += j; j++)
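The example's two loops, transcribed to Python: they run asynchronously (i starts at 0, j at 1, so the iteration counts differ), but the extra i = 0 iteration contributes nothing to the sum, so both programs compute 0 + 1 + … + N:

```python
N = 10

# Left program: sums i = 0 .. N
x, i = 0, 0
while i <= N:
    x += i
    i += 1

# Right program: sums j = 1 .. N
y, j = 0, 1
while j <= N:
    y += j
    j += 1
```

The asynchronous rule lets the first iteration of the left loop be related to `skip` on the right, after which the loops proceed in lockstep.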

  29. Rule for random assignment
General rule:
       µ ◭_Q ⟨µ1 & µ2⟩
  ─────────────────────────────
  ⊢ {⊤} x1 ←$ µ1 ∼ x2 ←$ µ2 {Q}
Specialized rule:
  f : T → T bijective    ∀v ∈ T. µ1(v) = µ2(f v)
  ────────────────────────────────────────────────
  ⊢ {∀v, Q[v/x1, f v/x2]} x1 ←$ µ1 ∼ x2 ←$ µ2 {Q}
Notes
- The bijection f specifies how to coordinate the samples
- Side condition: the marginals are preserved under f
- Assume the samples are coupled when proving the postcondition Q

  30. Proofs as (product) programs: xpRHL
- Every pRHL derivation yields a product program
- Different derivations yield different programs
- Can be modelled by a proof system ⊢ {Φ} c1 ∼ c2 {Ψ} ⇝ c
Fundamental lemma of xpRHL. If
- ⊢ {Φ} c1 ∼ c2 {Ψ ⇒ x1 = x2} ⇝ c, and
- {Φ̄} c {Pr[¬Ψ] ≤ ε}
then Φ̄(m1 ⊎ m2) implies
  |Pr_{⟦c1⟧ m1}[E(x1)] − Pr_{⟦c2⟧ m2}[E(x2)]| ≤ ε

  31. Dynkin's card trick (shift coupling)
Single process:
  p ← s; l ← [p];
  while p < N do
    n ←$ [1, 10];
    p ← p + n;
    l ← p :: l;
  return p
Coupled product program:
  p1 ← s1; p2 ← s2; l1 ← [p1]; l2 ← [p2];
  while p1 < N ∨ p2 < N do
    if p1 = p2 then
      n ←$ [1, 10]; p1 ← p1 + n; p2 ← p2 + n;
      l1 ← p1 :: l1; l2 ← p2 :: l2;
    else if p1 < p2 then
      n1 ←$ [1, 10]; p1 ← p1 + n1; l1 ← p1 :: l1;
    else
      n2 ←$ [1, 10]; p2 ← p2 + n2; l2 ← p2 :: l2;
  return (p1, p2)
Convergence. If s1, s2 ∈ [1, 10] and N > 10, then Δ(p1^final, p2^final) ≤ (9/10)^(N/5 − 2).
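A simulation sketch of the coupled product program (tracking only positions, not the history lists): jump sizes are uniform on [1, 10]; the trailing walk moves alone until the walks land on the same square, after which they share every jump. Since the walks stay equal once they meet, final equality records whether they met.

```python
import random

def coupled_walks(s1, s2, N, rng):
    """Run the coupled product program; return the final positions."""
    p1, p2 = s1, s2
    while p1 < N or p2 < N:
        if p1 == p2:                    # met: share the jump
            n = rng.randint(1, 10)
            p1 += n
            p2 += n
        elif p1 < p2:                   # walk 1 trails: it jumps alone
            p1 += rng.randint(1, 10)
        else:                           # walk 2 trails: it jumps alone
            p2 += rng.randint(1, 10)
    return p1, p2

rng = random.Random(0)
N, trials = 100, 2000
meets = 0
for _ in range(trials):
    a, b = coupled_walks(3, 7, N, rng)
    meets += (a == b)

# The slide's bound: the walks fail to meet with probability at most
# (9/10)^(N/5 - 2), which is about 0.15 for N = 100.
miss_bound = (9 / 10) ** (N / 5 - 2)

p1f, p2f = coupled_walks(3, 7, N, random.Random(1))
```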
