  1. Differential Privacy and Applications Marco Gaboardi Boston University

  2. Recap

  3. Fundamental Law of Information Reconstruction: the release of too many overly accurate statistics leads to privacy violations.

  4. (ε, δ)-Differential Privacy. Definition: Given ε, δ ≥ 0, a probabilistic query Q : Xⁿ → R is (ε, δ)-differentially private iff for all adjacent databases b₁, b₂ and for every S ⊆ R: Pr[Q(b₁) ∈ S] ≤ exp(ε) · Pr[Q(b₂) ∈ S] + δ.

  5. Laplace Mechanism. Pseudo-code for the Laplace Mechanism:
     function LapMech(D, q, ε)
       Y ←$ Lap(Δq/ε)
       return q(D) + Y
     end function
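A minimal Python sketch of the mechanism above, assuming NumPy and an illustrative counting query; the function and variable names are not from the slides.

```python
import numpy as np

def laplace_mechanism(data, query, sensitivity, epsilon, rng=None):
    # Release query(data) with Laplace noise of scale sensitivity/epsilon,
    # as in LapMech(D, q, epsilon) above.
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return query(data) + noise

# Illustrative use: a counting query (sensitivity 1) released with epsilon = 0.5.
db = [0, 1, 1, 0, 1, 1, 1]
noisy_count = laplace_mechanism(db, lambda d: sum(d), sensitivity=1.0, epsilon=0.5)
```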

  6. Laplace Mechanism.
     Theorem (Privacy of the Laplace Mechanism): the Laplace mechanism is ε-differentially private.
     Theorem (Accuracy): for all β ∈ (0, 1], let r = LapMech(D, q, ε). Then
       Pr[ |q(D) − r| ≥ (Δq/ε) · ln(1/β) ] ≤ β.
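A quick empirical sanity check of the accuracy bound, under assumed illustrative parameters (sensitivity 1, ε = 0.5, β = 0.05); it only simulates the Laplace noise, since |q(D) − r| is exactly the noise magnitude.

```python
import numpy as np

# Empirical check of Pr[|q(D) - r| >= (dq/eps) * ln(1/beta)] <= beta
# for the Laplace mechanism (illustrative parameters).
rng = np.random.default_rng(0)
sensitivity, epsilon, beta = 1.0, 0.5, 0.05
threshold = (sensitivity / epsilon) * np.log(1.0 / beta)

errors = np.abs(rng.laplace(scale=sensitivity / epsilon, size=100_000))
print((errors >= threshold).mean())  # should be close to beta = 0.05
```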

  7. Sequential Composition. If Q₁ is ε₁-DP, Q₂ is ε₂-DP, …, Qₙ is εₙ-DP, and each is run (with fresh noise) on the same database D, then the overall process is (ε₁ + ε₂ + … + εₙ)-DP.
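A small sketch of sequential composition in Python: two independently noised counting queries over the same data, with the privacy budgets adding up. The data, queries, and budgets are illustrative assumptions.

```python
import numpy as np

# Two differentially private releases computed on the SAME database:
# by sequential composition the overall process is (eps1 + eps2)-DP.
rng = np.random.default_rng()
db = [18, 25, 31, 42, 57, 63, 70]          # illustrative records (e.g. ages)

eps1, eps2 = 0.3, 0.2                      # per-query budgets
noisy_over_40 = sum(x > 40 for x in db) + rng.laplace(scale=1.0 / eps1)
noisy_over_60 = sum(x > 60 for x in db) + rng.laplace(scale=1.0 / eps2)

total_budget = eps1 + eps2                 # overall cost: 0.5
```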

  8. Parallel Composition. If Q is ε-DP and Q' is ε-DP, and Q is run on D while Q' is run on a disjoint database D' (the whole dataset is D ⊎ D'), then the overall process is ε-DP.
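By contrast, a sketch of parallel composition: an ε-DP count is run once per disjoint partition of the data, and because each individual appears in only one partition the overall cost stays ε. Again, the data and the partitioning are illustrative.

```python
import numpy as np

# Parallel composition: each record falls in exactly one partition, so running
# an epsilon-DP count on every partition costs epsilon overall, not 2*epsilon.
rng = np.random.default_rng()
ages = [18, 25, 31, 42, 57, 63, 70]
partitions = {
    "under_40": [a for a in ages if a < 40],
    "40_and_over": [a for a in ages if a >= 40],
}

epsilon = 0.5
noisy_counts = {
    name: len(part) + rng.laplace(scale=1.0 / epsilon)
    for name, part in partitions.items()
}
```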

  9. PINQ - McSherry'08
     • Private LINQ (a library/API for queries in C#).
     • Designed with composition in mind.
     • The first language for differential privacy.

  10. An alternative approach: Fuzz: Compositional Reasoning about Sensitivity (Pierce et al.'10)
     • Based on a semantic model of metric spaces and non-expansive functions.
     • The user specifies the sensitivity of some basic primitives, justified by the semantic model.
     • The tool implements a type checker permitting static checking of the sensitivity of a program (based on a calculus for sensitivities derived from linear logic).
     • It requires only limited reasoning about probabilities.

  11. Verification tools. Do we have good (semi-)decision procedures for (ε, δ)-indistinguishability? Verification tools = expert-provided annotations + (semi-)decision procedures (SMT solvers, ITP).

  12. Approximate Probabilistic Coupling. An (ε, δ)-coupling ν₁ C^(ε,δ)(S) ν₂ of two probability distributions ν₁ over A and ν₂ over B, with respect to the relation S ⊆ A×B, is a pair of probability distributions ν_L, ν_R over A×B such that:
     1. the left marginal of ν_L is ν₁ and the right marginal of ν_R is ν₂,
     2. the support of ν_L and ν_R is contained in S,
     3. max( max_E ν_L(E) - exp(ε)·ν_R(E), max_E ν_R(E) - exp(ε)·ν_L(E) ) ≤ δ.
     [Barthe et al. 12]
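The same three conditions, restated as displayed math; π₁ and π₂ below denote the left and right marginal operations (that notation is an assumption, not from the slide).

```latex
% (epsilon, delta)-coupling  nu_1  C^{(epsilon,delta)}(S)  nu_2, conditions restated:
\begin{align*}
  &\text{(1)}\quad \pi_1(\nu_L) = \nu_1, \qquad \pi_2(\nu_R) = \nu_2,\\
  &\text{(2)}\quad \operatorname{supp}(\nu_L) \subseteq S, \qquad \operatorname{supp}(\nu_R) \subseteq S,\\
  &\text{(3)}\quad \max\Big(\max_{E \subseteq A\times B}\big(\nu_L(E) - e^{\varepsilon}\nu_R(E)\big),\;
     \max_{E \subseteq A\times B}\big(\nu_R(E) - e^{\varepsilon}\nu_L(E)\big)\Big) \le \delta.
\end{align*}
```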

  13. We will use a simplification: judgments of the form ⊢_{ε,δ} c : P ⇒ Q, where ε, δ are the privacy parameters, c is the program, P is the precondition (a relation over memories), and Q is the postcondition (a relation over memories).

  14. Approximate Probabilistic Coupling for DP. Q is (ε, δ)-differentially private iff Q(D) C^(ε,δ)(=) Q(D') for all D and D' differing in one individual.

  15. Example of coupling.
     Pre: 0 ≤ k + input₁ - input₂ ≤ k'
       output = input + Lap(1/ε)
     Post: [output₁ + k = output₂], and we pay k'·ε
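Written in the judgment form of slide 13, this coupling reads roughly as follows (a sketch; the exact rule format used by the verification tool may differ).

```latex
% Coupling example from slide 15 as a judgment: shifting one run's Laplace
% sample by k costs k' * epsilon.
\[
\vdash_{k'\varepsilon,\,0}\;
  \mathit{output} \leftarrow \mathit{input} + \mathrm{Lap}(1/\varepsilon)
  \;:\;
  \big(0 \le k + \mathit{input}_1 - \mathit{input}_2 \le k'\big)
  \Rightarrow
  \big(\mathit{output}_1 + k = \mathit{output}_2\big)
\]
```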

  16. Report Noisy Max. Suppose that each one of us can vote for one star, (a), (b), (c), or (d), and we want to report which star receives the most votes.

  17. Report Noisy Max. Algorithm: we compute the histogram of votes, add Laplace noise to each score, and then select the maximal noised score. We can even add one-sided Laplace noise. [Slide figure: noisy histogram over the candidates (a), (b), (c), (d).]
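A minimal Python sketch of this algorithm with two-sided Laplace noise; the vote data and the noise scale 2/ε (matching the parameter used for ROSNM later) are illustrative assumptions.

```python
import numpy as np

def report_noisy_max(scores, epsilon, rng=None):
    # Add Laplace noise to each score and return the index of the largest
    # noised score (scale 2/epsilon, mirroring the slides' parameterization).
    rng = rng or np.random.default_rng()
    noisy = [s + rng.laplace(scale=2.0 / epsilon) for s in scores]
    return int(np.argmax(noisy))

# Illustrative histogram of votes for stars (a)-(d).
votes = ["a", "b", "b", "c", "b", "d", "a"]
histogram = [votes.count(star) for star in "abcd"]
winner = report_noisy_max(histogram, epsilon=0.5)
```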

  18. Report Noisy Max - intuition. We have a list of 1-sensitive queries, and we need to coordinate the noises across the two runs:
       q₁(D) + noise   vs   q₁(D') + noise
       q₂(D) + noise   vs   q₂(D') + noise
       …
       q_k(D) + noise  vs   q_k(D') + noise
     where D and D' are databases differing in one individual. We can prove this algorithm ε-differentially private. [Slide figure: bar chart of noisy answers Q1-Q8 on D and D'.]

  19. Report One-sided Noisy Max. Instead of the classic Report Noisy Max, we consider a version where we add noise from a one-sided Laplace. Composition doesn't apply, since adding one-sided Laplace noise is not differentially private.
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
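A sketch of ROSNM in Python. The one-sided Laplace Lap+(2/ε) is modeled here with an exponential distribution of scale 2/ε; that mapping, and the example queries, are assumptions about how the slides' notation translates to a concrete sampler.

```python
import numpy as np

def rosnm(queries, db, epsilon, rng=None):
    # Report One-sided Noisy Max: return the (1-based) index of the query
    # whose one-sided-noised answer is largest, following the pseudocode above.
    rng = rng or np.random.default_rng()
    best, max_idx = 0.0, None
    for i, q in enumerate(queries, start=1):
        cur = q(db) + rng.exponential(scale=2.0 / epsilon)  # Lap+(2/eps), assumed
        if cur > best or i == 1:
            max_idx, best = i, cur
    return max_idx

# Illustrative 1-sensitive counting queries over the votes from slide 16.
votes = ["a", "b", "b", "c", "b", "d", "a"]
queries = [lambda d, star=star: sum(v == star for v in d) for star in "abcd"]
winner = rosnm(queries, votes, epsilon=0.5)
```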

  20. Pointwise rule - simplified. If for every s ∈ O:
       Pre: b₁ ~₁ b₂
         program
       Post: [out₁ = s => out₂ = s], and paid ε,
     then:
       Pre: b₁ ~₁ b₂
         program
       Post: [out₁ = out₂], and paid ε.
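The same rule in the judgment notation of slide 13 (a sketch; side conditions of the full rule are omitted).

```latex
% Pointwise rule (simplified), in judgment form:
\[
\frac{\forall s \in O.\quad
      \vdash_{\varepsilon,\,0}\ c \;:\; b_1 \sim_1 b_2 \Rightarrow
      (\mathit{out}_1 = s \Rightarrow \mathit{out}_2 = s)}
     {\vdash_{\varepsilon,\,0}\ c \;:\; b_1 \sim_1 b_2 \Rightarrow
      \mathit{out}_1 = \mathit{out}_2}
\]
```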

  21. Report One-sided Noisy Max
     [b₁ ~₁ b₂, ∀i. ∀d₁ ~₁ d₂. |q_i(d₁) - q_i(d₂)| ≤ 1, …]
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = max₂], and paid ε

  22. Report One-sided Noisy Max
     [b₁ ~₁ b₂, ∀i. ∀d₁ ~₁ d₂. |q_i(d₁) - q_i(d₂)| ≤ 1, …]
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     By applying the pointwise rule we get a different post. Notice that we focus on a single, general s.

  23. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
     [b₁ ~₁ b₂, ∀i. ∀d₁ ~₁ d₂. |q_i(d₁) - q_i(d₂)| ≤ 1, …]
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     Playing the verification game.

  24. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
     [b₁ ~₁ b₂, …]
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     Playing the verification game.

  25. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
       [b₁ ~₁ b₂, …]
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     We can now proceed by cases.

  26. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
       [b₁ ~₁ b₂, i₁ < s => … /\ i₁ ≥ s => … /\ i₁ = i₂]
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     And use different properties.

  27. Invariant
     ( i₁ < s => (max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1) )
     /\ ( i₁ ≥ s => ((max₁ = max₂ = s /\ best₁ + 1 = best₂) \/ max₁ ≠ s) )
     /\ i₁ = i₂

  28. Invariant. The first conjunct, i₁ < s => (max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1), describes the situation before we encounter s.

  29. Invariant. The second conjunct, i₁ ≥ s => ((max₁ = max₂ = s /\ best₁ + 1 = best₂) \/ max₁ ≠ s), describes the situation after we encounter s.

  30. Invariant. When we encounter s we switch from the first conjunct to the second.

  31. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
       [i₁ < s => max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1]
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     Let us consider case by case.

  32. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
       [i₁ < s => max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1]
         cur = q_i(b) + Lap+(2/ε);
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     Which rule shall we apply?

  33. Laplace+ rule 1
     Pre: true
       output = input + Lap+(ε)
     Post: [output₁ - output₂ = input₁ - input₂], and we paid 0
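In the judgment form of slide 13, this rule reads as follows (again a sketch of the rule shape, not the tool's exact syntax): the coupling keeps the difference of the outputs equal to the difference of the inputs at zero cost.

```latex
% Laplace+ rule 1 as a judgment:
\[
\vdash_{0,\,0}\;
  \mathit{output} \leftarrow \mathit{input} + \mathrm{Lap}^{+}(\varepsilon)
  \;:\;
  \mathrm{true} \Rightarrow
  \big(\mathit{output}_1 - \mathit{output}_2 = \mathit{input}_1 - \mathit{input}_2\big)
\]
```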

  34. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
       [i₁ < s => max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1 /\ cur₁ - cur₂ = q_i(b₁) - q_i(b₂)], paid 0
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     Let's apply the rule.

  35. Report One-sided Noisy Max
     ROSNM(q₁,…,q_k : list data → R, b : list data, ε : R) : nat
       i = 1;
       best = 0;
       while (i ≤ k) {
         cur = q_i(b) + Lap+(2/ε);
       [i₁ < s => max₁ < s /\ max₂ < s /\ |best₁ - best₂| ≤ 1 /\ |cur₁ - cur₂| ≤ 1], paid 0
         if (cur > best \/ i = 1) { max = i; best = cur; }
         i = i + 1;
       }
       return max;
     [max₁ = s => max₂ = s], and paid ε
     And rewrite…
