Computer-Aided Privacy Proofs. César Kunz, joint work with Gilles Barthe, Benjamin Grégoire, and Santiago Zanella-Béguelin. Provable Privacy Workshop 2012, 2012.07.10.
Privacy for Statistical Databases
Maximize Privacy vs. Maximize Utility: conflicting requirements. Sanitizing queries requires striking a good balance.
Differential Privacy [Dwork et al. 06]
Fix a (symmetric) adjacency relation Φ on databases and a privacy budget ε. A randomized algorithm K : D → R (called a mechanism) is ε-differentially private iff for all D1, D2 such that Φ(D1, D2):
∀ S ⊆ R. Pr[K(D1) ∈ S] ≤ exp(ε) × Pr[K(D2) ∈ S]
Differential Privacy [Dwork et al. 06]
Fix a (symmetric) adjacency relation Φ on databases and a privacy budget ε. A randomized algorithm K : D → R (called a mechanism) is (ε, δ)-differentially private iff for all D1, D2 such that Φ(D1, D2):
∀ S ⊆ R. Pr[K(D1) ∈ S] ≤ exp(ε) × Pr[K(D2) ∈ S] + δ
Still an information-theoretic definition.
Achieving Differential Privacy
Consider a numerical query f : D → R. Define the sensitivity of f as
Δ(f) = max { |f(D1) − f(D2)| : Φ(D1, D2) }
The mechanism K(D) = f(D) + Lap(Δ(f)/ε) is ε-differentially private, with
Pr[K(D) = x] ∝ exp(−|f(D) − x| · ε / Δ(f))
The Exponential Mechanism generalizes this to arbitrary domains.
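As a concrete illustration of the Laplace mechanism above, here is a minimal Python sketch; the counting query, databases, and parameter values are hypothetical stand-ins chosen only for the example, not part of the original slides.

```python
import random

def laplace(scale):
    # The difference of two i.i.d. Exponential(1/scale) samples is
    # Laplace-distributed with mean 0 and the given scale.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def count_query(db):
    # A simple counting query f : D -> R over 0/1 records.
    # Its sensitivity is 1: adjacent databases (differing in one record)
    # change the count by at most 1.
    return sum(db)

def laplace_mechanism(db, f, sensitivity, epsilon):
    # K(D) = f(D) + Lap(sensitivity / epsilon) is epsilon-differentially
    # private for any query f with the given sensitivity.
    return f(db) + laplace(sensitivity / epsilon)

# Two adjacent databases (they differ in exactly one record).
d1 = [1, 0, 1, 1, 0]
d2 = [1, 0, 1, 1, 1]
print(laplace_mechanism(d1, count_query, sensitivity=1.0, epsilon=0.5))
print(laplace_mechanism(d2, count_query, sensitivity=1.0, epsilon=0.5))
```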
Computer-Aided Crypto Proofs
A game-based proof relates a sequence of games G0, G1, ..., Gn (probabilistic programs that may call adversary procedures A, B, ...), chaining probability bounds across the hops:
Pr[G0 : E0] ≤ h1(Pr[G1 : E1]) ≤ ... ≤ hn(Pr[Gn : En])
Computer-aided (computational) crypto proofs are a success story: the CertiCrypt/EasyCrypt provers, with many examples (Cramer-Shoup, OAEP, FDH, ZK-PoK, Boneh-Franklin IBE, Merkle-Damgård, ZAEP, AKE, ...), and a best paper award at CRYPTO'11.
Q: Can we extend these techniques to reason about privacy?
A: Yes, in this talk we will see how.
Probabilistic While Language
C ::= skip                    nop
    | C; C                    sequence
    | V ← E                   assignment
    | V $← D                  random sampling
    | if E then C else C      conditional
    | while E do C            while loop
    | V ← P(E, ..., E)        procedure call
x $← d samples the value of x according to the distribution d.
The denotation of a program c is a function from an initial state to a (sub-)distribution over final states: ⟦c⟧ : M → Distr(M). Programs that do not terminate with probability 1 generate sub-distributions with total probability mass < 1.
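To make the denotational reading concrete, the following Python sketch (the state representation and the example program are illustrative assumptions, not EasyCrypt syntax) spells out the denotation of a tiny pWhile program as a map from an initial state to a distribution over final states.

```python
from fractions import Fraction

def flip_then_branch(m):
    # Denotation of the tiny pWhile program
    #   x $<- {0, 1}; if x = 1 then y <- y + 1 else skip
    # as a function from an initial state m = (y,) to a (sub-)distribution
    # over final states (x, y), represented as a map {state: probability}.
    (y,) = m
    half = Fraction(1, 2)
    return {
        (1, y + 1): half,  # the fair coin came up 1, so y is incremented
        (0, y): half,      # the coin came up 0, so y is unchanged
    }

dist = flip_then_branch((0,))
assert sum(dist.values()) == 1  # total mass 1: this program always terminates
print(dist)                     # {(1, 1): Fraction(1, 2), (0, 0): Fraction(1, 2)}
```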
Relational Hoare Logic
Hoare Logic. Judgments: {P} c {Q}. Assertions: P, Q are predicates over program states. Validity: if (c, m) ⇓ m′ and m ⊨ P, then m′ ⊨ Q.
Relational Hoare Logic (RHL). Judgments: ⊢ c1 ∼ c2 : P ⇒ Q. Assertions: P, Q are relations over program states. Validity: if (c1, m1) ⇓ m′1 and (c2, m2) ⇓ m′2, and (m1, m2) ⊨ P, then (m′1, m′2) ⊨ Q.
Probabilistic Relational Hoare Logic (pRHL)
Judgments: ⊢ c1 ∼ c2 : P ⇒ Q, where P, Q are binary relations over states (like in the deterministic case).
Validity: if (m1, m2) ⊨ P, then (⟦c1⟧ m1, ⟦c2⟧ m2) ⊨ Q♯, where Q♯ is the lifting of Q to a relation over distributions.
Inequalities about probabilities can be inferred from valid pRHL judgments: if (m1, m2) ⊨ P, and (m′1, m′2) ⊨ Q implies (m′1 ⊨ A ⇒ m′2 ⊨ B), then Pr[c1, m1 : A] ≤ Pr[c2, m2 : B].
Other forms of inequalities can be captured using relational logic (e.g. the Fundamental Lemma).
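For reference, the lifting Q♯ used above is the usual coupling-based one; a LaTeX rendering of the standard definition (stated here from the literature, not from the slide itself):

```latex
% Coupling-based lifting of a relation Q on states to a relation Q^# on
% distributions (standard pRHL definition, stated here for reference):
\mu_1 \; Q^{\sharp} \; \mu_2
  \;\iff\;
  \exists\, \mu \in \mathrm{Distr}(M \times M).\;
     \pi_1(\mu) = \mu_1 \;\wedge\; \pi_2(\mu) = \mu_2 \;\wedge\;
     \mathrm{supp}(\mu) \subseteq Q
```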
Approximate Probabilistic Relational Hoare Logic
Judgments: ⊢ c1 ∼α,δ c2 : P ⇒ Q
Validity: requires a novel generalization of the pRHL lifting.
What can be inferred from a valid judgment? If (m1, m2) ⊨ P and (m′1, m′2) ⊨ Q implies (m′1 ⊨ A ⇒ m′2 ⊨ B), then
Pr[c1, m1 : A] ≤ α × Pr[c2, m2 : B] + δ
Exactly what we need to encode DP!
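To spell out the encoding hinted at here: taking both programs to be the mechanism K, the precondition to be the adjacency relation Φ, and the postcondition to be equality of results, the inferred inequality instantiates exactly to (ε, δ)-differential privacy. A LaTeX sketch, with notation approximated from the judgments above:

```latex
% Encoding (epsilon, delta)-differential privacy as an apRHL judgment about
% the mechanism K run against itself (notation approximated from the slides):
\vdash K \sim_{\exp(\epsilon),\,\delta} K
   \;:\; \Phi(D\langle 1\rangle, D\langle 2\rangle)
   \;\Rightarrow\; \mathit{res}\langle 1\rangle = \mathit{res}\langle 2\rangle
% Instantiating A = B = (res \in S) in the inference above then yields,
% for all adjacent D_1, D_2 and all S \subseteq R:
\Pr[K(D_1) \in S] \;\le\; \exp(\epsilon) \cdot \Pr[K(D_2) \in S] + \delta
```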
Example: Private 2-Party Computation
Two hospitals hold records of a patient's recent blood tests and want to check whether the patient is asking for similar tests, i.e. to compute the Hamming distance h(a, b) between their bit vectors.
John Doe at hospital A (vector a): LDL 1, HDL 1, HIV 0, GLU 1, LEU 1
John Doe at hospital B (vector b): LDL 0, HDL 0, HIV 1, GLU 1, LEU 1
Example: Private 2-Party Computation
Using additive homomorphic encryption (e.g. Paillier):
B sends E(b_i) for i = 1 ... n.
A computes c_i ← (a_i ? E(1 − b_i) : E(b_i)), so that c_i encrypts a_i XOR b_i, then h_A ← Σ_i c_i + noise_A, and sends h_A.
B decrypts: h̃_A ← D(h_A) (= h(a, b) + noise_A), adds its own noise h_B ← h̃_A + noise_B, and sends h_B.
A removes its noise: h̃_B ← h_B − noise_A (= h(a, b) + noise_B).
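Below is a runnable Python sketch of this message flow; the additively homomorphic scheme is an insecure toy stand-in (a real deployment would use Paillier), noise truncation is ignored as on the next slide, and all function names, vectors, and parameters are illustrative assumptions.

```python
import random

def laplace(scale):
    # Difference of two i.i.d. Exponential(1/scale) samples is Laplace(scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

class ToyAdditiveHE:
    # Insecure stand-in for an additively homomorphic scheme such as Paillier:
    # "ciphertexts" are plaintexts, so adding ciphertexts adds plaintexts.
    def enc(self, x):
        return x
    def dec(self, c):
        return c

def round_B_encrypt(he, b):
    # B sends E(b_i) for i = 1..n.
    return [he.enc(bi) for bi in b]

def round_A_combine(he, a, enc_b, epsilon):
    # A computes c_i encrypting a_i XOR b_i: E(1 - b_i) if a_i = 1, else E(b_i)
    # (with Paillier, E(1 - b_i) is derived homomorphically from E(b_i)),
    # then the noisy encrypted sum h_A = sum_i c_i + noise_A.
    c = [he.enc(1) - cb if ai == 1 else cb for ai, cb in zip(a, enc_b)]
    noise_A = laplace(1.0 / epsilon)
    h_A = sum(c) + he.enc(noise_A)
    return h_A, noise_A

def round_B_decrypt_and_blind(he, h_A, epsilon):
    # B decrypts (obtaining h(a,b) + noise_A) and adds its own noise.
    noise_B = laplace(1.0 / epsilon)
    return he.dec(h_A) + noise_B

def round_A_unblind(h_B, noise_A):
    # A removes its own noise, ending with h(a,b) + noise_B.
    return h_B - noise_A

# The two record vectors from the hospitals example.
a = [1, 1, 0, 1, 1]
b = [0, 0, 1, 1, 1]
epsilon = 0.5

he = ToyAdditiveHE()
enc_b = round_B_encrypt(he, b)
h_A, noise_A = round_A_combine(he, a, enc_b, epsilon)
h_B = round_B_decrypt_and_blind(he, h_A, epsilon)
print(round_A_unblind(h_B, noise_A))  # noisy Hamming distance h(a, b) + noise_B
```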
Private Hamming Distance Computation
Proofs are in the semi-honest (honest-but-curious) model; some subtleties are ignored (e.g. truncated noise).
Theorem. If the underlying homomorphic scheme is IND-CPA secure, then the protocol is ε-SIM-CDP w.r.t. party A when noise_B = Lap(1/ε). The protocol is ε-DP w.r.t. party B when noise_A = Lap(1/ε).
EasyCrypt: Automated crypto (and privacy) proofs
Architecture (diagram): a ProofGeneral/Emacs frontend and a shell drive the EasyCrypt toplevel, which discharges proof obligations via the Why3 API to the Why3 software verification platform, which in turn dispatches them to SMT solvers and automated provers (Alt-Ergo, CVC3, Z3, Yices, Vampire, E-Prover, SPASS) and to the interactive prover Coq.
http://easycrypt.gforge.inria.fr