Relational reasoning via probabilistic coupling

Gilles Barthe, Thomas Espitau, Benjamin Grégoire, Justin Hsu, Léo Stefanesco, Pierre-Yves Strub
IMDEA Software, ENS Cachan, ENS Lyon, Inria, University of Pennsylvania

November 28, 2015
Relational properties

Properties about two runs of the same program
◮ Assume the inputs are related by Ψ
◮ Want to prove the outputs are related by Φ
Examples

Monotonicity
◮ Ψ : in1 ≤ in2
◮ Φ : out1 ≤ out2
◮ “Bigger inputs give bigger outputs”

Non-interference
◮ Ψ : low1 = low2
◮ Φ : out1 = out2
◮ “If the low-security inputs are the same, then the outputs are the same”
Probabilistic relational properties

Richer properties
◮ Differential privacy
◮ Cryptographic indistinguishability

Verification tool: pRHL [BGZ-B]
◮ Imperative while-language + a command for random sampling
◮ Deterministic input, randomized output
◮ Hoare-style logic
Inspiration from probability theory

Probabilistic couplings
◮ Used by mathematicians for proving relational properties
◮ Applications: Markov chains, probabilistic processes

Idea
◮ Place the two processes in the same probability space
◮ Coordinate the sampling
Our results

Main observation
The logic pRHL internalizes coupling

Consequences
◮ Constructing a pRHL proof → constructing a coupling
◮ Can verify classic examples of couplings from mathematics with the proof assistant EasyCrypt (built on pRHL)
The plan

Today
◮ Introducing probabilistic couplings
◮ Introducing the relational logic pRHL
◮ Example: convergence of random walks
Probabilistic couplings
Introduction to probabilistic couplings

Basic ingredients
◮ Given: two distributions X1, X2 over a set A
◮ Produce: a joint distribution Y over A × A
  – The distribution over the first component is X1
  – The distribution over the second component is X2

Definition
Given two distributions X1, X2 over a set A, a coupling Y is a distribution over A × A such that π1(Y) = X1 and π2(Y) = X2.
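As a concrete sanity check (my own illustration, not from the talk), the following Python sketch builds the simplest coupling of two finite distributions — the independent product — and verifies both marginal conditions from the definition. All function names here are invented for this example.

```python
from itertools import product

def marginals(coupling):
    """Compute the two marginal distributions of a joint distribution,
    given as a dict mapping (a, b) pairs to probabilities."""
    m1, m2 = {}, {}
    for (a, b), p in coupling.items():
        m1[a] = m1.get(a, 0.0) + p
        m2[b] = m2.get(b, 0.0) + p
    return m1, m2

def independent_coupling(x1, x2):
    """The product distribution is always a valid coupling of x1 and x2."""
    return {(a, b): p * q for (a, p), (b, q) in product(x1.items(), x2.items())}

# Two distributions over {0, 1}: a fair coin and a biased coin.
x1 = {0: 0.5, 1: 0.5}
x2 = {0: 0.25, 1: 0.75}

y = independent_coupling(x1, x2)
m1, m2 = marginals(y)
assert all(abs(m1[a] - x1[a]) < 1e-9 for a in x1)  # π1(Y) = X1
assert all(abs(m2[a] - x2[a]) < 1e-9 for a in x2)  # π2(Y) = X2
```

The product coupling always exists but is rarely the interesting one; the point of the slides that follow is choosing a *coordinated* coupling.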
Example: mirrored random walks

Simple random walk on the integers
◮ Start at position p = 0
◮ Each step, flip a coin x ←$ flip
◮ Heads: p ← p + 1
◮ Tails: p ← p − 1

Figure: Simple random walk (each step moves left or right with probability 1/2)
Coupling the walks to meet

Case p1 = p2: the walks have met
◮ Arrange the samplings so that x1 = x2
◮ Continue to have p1 = p2

Case p1 ≠ p2: the walks have not met
◮ Arrange the samplings so that x1 = ¬x2
◮ The walks make mirror moves

Under the coupling, once the walks meet, they move together
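A small simulation (again my own sketch) makes the mirror coupling concrete: while the positions differ, the second walk takes the negated flip; once they coincide, both walks share the same flip and stay equal forever. Note that ¬x1 is still a fair flip, so each walk individually remains a simple random walk.

```python
import random

def mirror_coupled_walks(p1, p2, steps, seed=0):
    """Run two random walks under the mirror coupling.
    While p1 != p2, the second walk uses the negated flip (mirror moves);
    once they meet, both walks use the same flip and stay equal."""
    rng = random.Random(seed)
    met = False
    for _ in range(steps):
        x1 = rng.random() < 0.5            # flip for the first walk
        x2 = x1 if p1 == p2 else not x1    # coupled flip for the second walk
        if p1 == p2:
            met = True
        p1 += 1 if x1 else -1
        p2 += 1 if x2 else -1
        if met:
            assert p1 == p2                # invariant: once met, they move together
    return p1, p2

# Start the walks an even distance apart so that meeting is possible.
p1, p2 = mirror_coupled_walks(0, 4, 1000)
```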
Why is this interesting?

Goal: memorylessness
◮ Start two random walks at w and w + 2k
◮ To show: the position distributions converge as we take more steps

Coupling bounds the distance between distributions
◮ Once the walks meet, they stay equal
◮ The distance is at most the probability that the walks don’t meet

Theorem
If Y is a coupling of two distributions (X1, X2), then

  ‖X1 − X2‖_TV = (1/2) Σ_{a ∈ A} |X1(a) − X2(a)| ≤ Pr_{(y1, y2) ∼ Y} [y1 ≠ y2].
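To check the theorem on a tiny example (an illustration of mine, not from the slides): compute the total variation distance between two small distributions and compare it with the disequality probability of a coupling, here the independent product.

```python
def tv_distance(x1, x2):
    """Total variation distance: half the L1 distance between the mass functions."""
    support = set(x1) | set(x2)
    return 0.5 * sum(abs(x1.get(a, 0.0) - x2.get(a, 0.0)) for a in support)

def pr_not_equal(coupling):
    """Probability that the two coupled samples differ."""
    return sum(p for (a, b), p in coupling.items() if a != b)

x1 = {0: 0.5, 1: 0.5}
x2 = {0: 0.25, 1: 0.75}

# The independent-product coupling of x1 and x2.
y = {(a, b): x1[a] * x2[b] for a in x1 for b in x2}

# Coupling inequality: TV(x1, x2) <= Pr[y1 != y2] for ANY coupling y.
assert tv_distance(x1, x2) <= pr_not_equal(y) + 1e-9
```

A well-chosen coupling gives a tighter bound than the product coupling; an optimal one attains the TV distance exactly.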
The logic pRHL
The program logic pRHL

Probabilistic Relational Hoare Logic
◮ A Hoare-style logic for probabilistic relational properties
◮ Proposed by Barthe, Grégoire, Zanella-Béguelin
◮ Implemented in the EasyCrypt proof assistant for crypto proofs
Language and judgments

The pWhile imperative language

  c ::= x ← e | x ←$ d | if e then c else c | while e do c | skip | c; c

Basic pRHL judgments

  ⊨ c1 ∼ c2 : Ψ ⇒ Φ

◮ Ψ and Φ are formulas over labeled program variables x1, x2
◮ Ψ is the precondition, Φ is the postcondition
Interpreting the judgment

  ⊨ c1 ∼ c2 : Ψ ⇒ Φ

Interpreting pre- and postconditions
◮ Ψ is interpreted as a relation on two memories
◮ Φ is interpreted as a relation Φ† on distributions over memories

Definition (Couplings in disguise!)
If Φ is a relation on A, the lifted relation Φ† is a relation on Distr(A) where μ1 Φ† μ2 if there exists μ ∈ Distr(A × A) with
◮ supp(μ) ⊆ Φ; and
◮ π1(μ) = μ1 and π2(μ) = μ2.
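The lifting adds one condition on top of the coupling definition: the witness's support must lie inside the relation. A sketch (my own; the helper name is invented) checks a witness for the lifting of ≤ on integers.

```python
def witnesses_lifting(mu, mu1, mu2, phi):
    """Check that the joint distribution mu witnesses mu1 Φ† mu2:
    its support lies inside the relation phi, and its marginals are mu1, mu2."""
    if any(not phi(a, b) for (a, b), p in mu.items() if p > 0):
        return False
    m1, m2 = {}, {}
    for (a, b), p in mu.items():
        m1[a] = m1.get(a, 0.0) + p
        m2[b] = m2.get(b, 0.0) + p
    ok1 = all(abs(m1.get(a, 0.0) - mu1[a]) < 1e-9 for a in mu1)
    ok2 = all(abs(m2.get(b, 0.0) - mu2[b]) < 1e-9 for b in mu2)
    return ok1 and ok2

mu1 = {0: 0.5, 1: 0.5}            # uniform over {0, 1}
mu2 = {1: 0.5, 2: 0.5}            # uniform over {1, 2}
mu  = {(0, 1): 0.5, (1, 2): 0.5}  # couple each sample with its successor

assert witnesses_lifting(mu, mu1, mu2, lambda a, b: a <= b)  # mu1 (≤)† mu2
```

When Φ is equality, supp(μ) ⊆ Φ forces y1 = y2 with probability 1, so μ1 =† μ2 iff μ1 = μ2.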
Proof rules

The key rule: Sampling

          f ∈ T →(1-1) T        ∀v ∈ T. d1(v) = d2(f v)
  Sample ──────────────────────────────────────────────────────
          ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : (∀v, Φ[v/x1, f(v)/x2]) ⇒ Φ

Notes
◮ The bijection f specifies how to coordinate the samples
◮ Side condition: the marginals are preserved under f
◮ Assume the samples are coupled when proving the postcondition Φ
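For the mirror coupling of fair coin flips, the bijection is boolean negation. This sketch (illustrative, not EasyCrypt syntax) checks the rule's side condition d1(v) = d2(f v) and shows the coupling the rule induces: each sample v for x1 is paired with f(v) for x2.

```python
# The fair coin distribution and the negation bijection on {True, False}.
flip = {True: 0.5, False: 0.5}
f = lambda v: not v

# Side condition of the Sample rule: d1(v) = d2(f v) for every v.
assert all(flip[v] == flip[f(v)] for v in flip)

# The induced coupling pairs v with f(v), so the coupled outcomes
# are exactly (True, False) and (False, True) — the mirror moves.
coupling = {(v, f(v)): flip[v] for v in flip}
assert sum(coupling.values()) == 1.0
```

Choosing f = identity instead yields the "same flip" coupling used once the walks have met; both choices satisfy the side condition because flip is uniform.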
Examples
Example: mirroring random walks in pRHL

The code

  pos ← start;            // Start position
  i ← 0;
  H ← [];                 // Ghost code
  while i < N do
    b ←$ flip;
    H ← b :: H;           // Ghost code
    if b then pos ← pos + 1;
    else pos ← pos − 1;
    fi
    i ← i + 1;
  end
  return pos              // Final position

Goal: couple the two walks via mirroring
Record the history

H stores the history of flips
◮ Σ(H) is the net distance that the first process moves to the right
◮ Meet(H) holds if there is a prefix H′ of H with Σ(H′) = k
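The two ghost predicates are easy to state in ordinary code. In this sketch (my own; the names net_right and meet are invented) the history is in sampling order, with heads as True. Under the mirror coupling, when the first walk has moved k net steps right, the second has moved k net steps left, so walks started 2k apart have met — which is why Meet tests Σ(H′) = k.

```python
def net_right(history):
    """Σ(H): net distance moved to the right; heads (True) is +1, tails is -1."""
    return sum(1 if b else -1 for b in history)

def meet(history, k):
    """Meet(H): some prefix of H has net displacement k, i.e. the mirrored
    walks, started 2k apart, have already met at some point."""
    return any(net_right(history[:i]) == k for i in range(len(history) + 1))

# Note: the slide's code conses onto H, storing flips most-recent-first;
# reverse H before applying these helpers to such a history.
H = [True, True, False, True]   # in sampling order
assert net_right(H) == 2
assert meet(H, 2) and not meet(H, 3)
```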
Specify the coupling

The sampling rule

          f ∈ T →(1-1) T        ∀v ∈ T. d1(v) = d2(f v)
  Sample ──────────────────────────────────────────────────────
          ⊨ x1 ←$ d1 ∼ x2 ←$ d2 : (∀v, Φ[v/x1, f(v)/x2]) ⇒ Φ