
Parallel Homomorphic Encryption — Seny Kamara (Microsoft Research), Mariana Raykova (IBM Research) — PowerPoint PPT presentation



  1. Parallel Homomorphic Encryption Seny Kamara – Microsoft Research Mariana Raykova – IBM Research

  2. Big Data — The scale of data we create is growing rapidly. Walmart: 2.5 petabytes of transaction data per day. Jets: 10 terabytes of sensor data per 30 minutes of flight. Large Hadron Collider: 40 terabytes per second. How do we process this data? It is too much for any single machine (even a supercomputer), so we use clusters of machines.

  3. Cluster Computing — distributing data, synchronization, fault tolerance, parallel algorithms.

  4. MapReduce [Dean-Ghemawat04] — A framework: distributed file system, fault tolerance, synchronization. A model for parallel computation: makes it easy to design parallel algorithms. The standard for processing Big Data.

  5. MapReduce [Dean-Ghemawat04] — A MapReduce program is a pair of functions: Map(k_i, v_i) → (ik_1, iv_1), …, (ik_t, iv_t) and Reduce(ik_i, S_i) → out_i. The framework applies Map to each input pair (k, v), groups all intermediate pairs (ik, iv) sharing the same key ik into a set S, and passes each (ik, S) to Reduce, which produces an output out.

  6. MapReduce [Dean-Ghemawat04] — Example MapReduce algorithm (word count): Map(id, File) emits a pair (w, count) for each word w in the file, e.g. (w_1, 3), …, (w_n, 8). Reduce(w, S) sums the counts for one word, e.g. Reduce(w_1, {3, 0}) → (w_1, 3) and Reduce(w_2, {4, 1}) → (w_2, 5).
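The word-count algorithm above can be simulated in a few lines of plain Python. This is an illustrative single-process stand-in, not Hadoop's API: a dictionary plays the role of the framework's shuffle/group step, and all function names are mine.

```python
from collections import defaultdict

def map_fn(doc_id, text):
    """Map(k, v): emit (word, 1) for every word in the file."""
    for word in text.split():
        yield word, 1

def reduce_fn(word, counts):
    """Reduce(ik, S): sum the counts for one intermediate key."""
    return word, sum(counts)

def map_reduce(records, map_fn, reduce_fn):
    """Tiny single-process stand-in for the MapReduce framework:
    run the mappers, group intermediate pairs by key, run the reducers."""
    groups = defaultdict(list)
    for k, v in records:
        for ik, iv in map_fn(k, v):
            groups[ik].append(iv)
    return dict(reduce_fn(ik, s) for ik, s in groups.items())

docs = [("d1", "to be or not to be"), ("d2", "to do")]
print(map_reduce(docs, map_fn, reduce_fn))
# {'to': 3, 'be': 2, 'or': 1, 'not': 1, 'do': 1}
```

In a real cluster the mappers and reducers run on different machines and the grouping happens in the shuffle phase; the data flow is the same.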

  7. MapReduce — Many MapReduce algorithms. IR: counts, searching, sorting, PageRank, HITS, … ML: PCA, neural networks, regression, support vector machines, … Graphs: BFS, DFS, PageRank, minimum spanning tree, …

  8. The Big Data Stack — Analytics languages: Pig, … Databases (SQL & NoSQL): HBase, Hive, Hadapt, … MapReduce frameworks: Hadoop, MapR, Hortonworks, Cloudera, … Cloud-based MapReduce: Amazon Elastic MapReduce, Azure HDInsight.

  9. What if I don’t trust the Cloud?

  10. MapReduce on Encrypted Data? Use homomorphic encryption! The client encrypts its data and the cluster computes on it homomorphically. Questions: Can homomorphic evaluation be done in parallel? Can it be done on a standard MapReduce cluster?

  11. Parallel Homomorphic Encryption — Standard homomorphic encryption: HE = (Gen, Enc, Eval, Dec), with Gen(1^k), Enc(K, m), Eval(f, c_1, …, c_n) ≈ MapReduce algorithm, Dec(K, c). PHE = (Gen, Enc, Parse, Map, Reduce, Merge, Dec): Parse(c) generates (encrypted) key-value pairs for the mappers; Map(k, v) homomorphically evaluates the map algorithm; Reduce(ik, S) homomorphically evaluates the reduce algorithm.
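To make the PHE syntax concrete, here is a deliberately toy instantiation that computes the sum of the inputs, using one-time additive pads (Enc of m_i is m_i + K_i mod P) as the "homomorphic encryption". The modulus, the routing, and all function names are assumptions chosen for illustration; this only shows how Parse/Map/Reduce/Merge fit the MapReduce shape, and is not the (secure, general) scheme from the talk.

```python
import random

P = 2**31 - 1  # toy modulus (an assumption, not from the talk)

def gen(n):
    """Gen: one additive pad per input element."""
    return [random.randrange(P) for _ in range(n)]

def enc(K, ms):
    """Enc: one-time additive pads; addition of ciphertexts
    corresponds to addition of plaintexts (mod P)."""
    return [(m + k) % P for m, k in zip(ms, K)]

def parse(cs, num_mappers=2):
    """Parse: split the ciphertexts into key-value pairs for mappers."""
    return [(i % num_mappers, c) for i, c in enumerate(cs)]

def map_fn(k, v):
    return [(0, v)]          # route every ciphertext to one reducer

def reduce_fn(ik, S):
    return sum(S) % P        # homomorphic addition of ciphertexts

def merge(outs):
    return sum(outs) % P

def dec(K, c):
    return (c - sum(K)) % P  # remove the combined pad

ms = [3, 1, 4, 1, 5]
K = gen(len(ms))
pairs = parse(enc(K, ms))
groups = {}
for k, v in pairs:
    for ik, iv in map_fn(k, v):
        groups.setdefault(ik, []).append(iv)
out = merge([reduce_fn(ik, S) for ik, S in groups.items()])
assert dec(K, out) == sum(ms) % P   # the sum was computed under encryption
```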

  12. Security — CPA-security: the adversary cannot learn any information about the message from the ciphertext. Note: here, single-input security is enough.

  13. Constructions

  14. A High-Level Framework — PHE = randomized reductions + homomorphic encryption. Randomized reductions [Beaver-Feigenbaum90, Beaver-Feigenbaum-Killian-Rogaway97]: (Scatter, Recon) is a randomized reduction from f to g if Scatter(x) produces shares s_1, s_2, s_3, …, each worker evaluates g on one share, and Recon applied to the results yields f(x).

  15. A High-Level Framework — The client runs Scatter(x) to get shares s_1, s_2, s_3, …; worker i computes g(s_i); the client runs Recon on g(s_1), g(s_2), g(s_3), … to recover f(x). Problem #1: the cloud operates all the workers. Problem #2: Recon can be expensive.

  16. Solutions — For Problem #1: randomized reductions that remain secure with t = n (all shares held by the cloud), for univariate and for multivariate polynomials. For Problem #2: outsource Recon by making it simple enough to be evaluated with a single multiplication.

  17. Reduction for Univariate Polynomials — Scatter_q(x): set n = 2q + 1; sample α = (α_1, …, α_n), all distinct, at random in F_q; choose a random degree-2 permutation polynomial p_x such that p_x(0) = x; set s = (s_1, …, s_n) = (p_x(α_1), …, p_x(α_n)); output s and st = α. Recon_q(st, y_1, …, y_n): interpolate the polynomial Q through the points (α_1, y_1), …, (α_n, y_n); output Q(0).
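A minimal sketch of Scatter/Recon over a prime field, checking correctness only: workers evaluate a degree-q polynomial g on the shares, and Lagrange interpolation through the n = 2q+1 points recovers g(x). The security requirement that p be a permutation polynomial is ignored here, and the prime and helper names are assumptions, not from the talk.

```python
import random

P = 2**61 - 1  # a large prime; the field is Z_P in this sketch

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients low-to-high) at x, mod P."""
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % P
    return y

def scatter(x, q):
    """Split x into n = 2q+1 shares: a random degree-2 polynomial p
    with p(0) = x, evaluated at n distinct random points."""
    n = 2 * q + 1
    alphas = random.sample(range(1, P), n)   # distinct evaluation points
    p = [x % P, random.randrange(P), random.randrange(1, P)]  # p(0) = x
    shares = [poly_eval(p, a) for a in alphas]
    return shares, alphas                    # st = alphas

def recon(st, ys):
    """Lagrange-interpolate Q through (alpha_i, y_i); return Q(0)."""
    total = 0
    for i, (ai, yi) in enumerate(zip(st, ys)):
        num, den = 1, 1
        for j, aj in enumerate(st):
            if j != i:
                num = num * (-aj) % P
                den = den * (ai - aj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

# Example: workers evaluate g(z) = z^3 + 2z + 5 on each share.
g = [5, 2, 0, 1]                            # degree q = 3
x = 123456789
shares, st = scatter(x, q=3)
ys = [poly_eval(g, s) for s in shares]      # done by the workers
assert recon(st, ys) == poly_eval(g, x)     # client recovers g(x)
```

The degree count explains n = 2q + 1: the workers' outputs lie on Q = g ∘ p, a polynomial of degree 2q, which is determined by 2q + 1 points.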

  18. Reduction for Univariate Polynomials — Correctness: the secret sharing is "homomorphic": interpolating the values Q(p_x(α_1)), …, Q(p_x(α_n)) and evaluating at 0 yields Q(p_x(0)) = Q(x). Security: the sharing polynomials are permutations and the evaluation points α_i are uniform, so the shares are independent of the secret.

  19. A General MR-Parallel HE Scheme — The client runs Scatter on its input to obtain shares s_1, …, s_n and the state st, and stores each share on the cluster together with an encryption of the state, as pairs (s_i, Enc(st)). [Diagram: shares s_1, …, s_n, each paired with Enc(st), distributed across the cluster's input splits.]

  20. A General MR-Parallel HE Scheme — Mappers: each mapper takes a pair (k, [s_i, Enc(st)]) and outputs (k, [Enc(g(s_i)), Enc(st)]), i.e., it homomorphically evaluates g on its share, encrypts the result, and keeps the reducer key. [Diagram: e.g., (3, [s_1, Enc(st)]) → (3, [Enc(g(s_1)), Enc(st)]); (3, [s_2, Enc(st)]) → (3, [Enc(g(s_2)), Enc(st)]); (1, [s_1, Enc(st)]) → (1, [Enc(g(s_1)), Enc(st)]).]

  21. A General MR-Parallel HE Scheme — Reducers: the reducer for key 3 receives [Enc(g(s_1)), Enc(st), Enc(g(s_2)), Enc(st), Enc(g(s_3)), Enc(st)] and homomorphically outputs (3, Enc(Recon(st, g(s_1), g(s_2), g(s_3)))).

  22. Additional Results — A randomized reduction for multivariate polynomials in a small number of variables, based on the multi-dimensional noisy curve reconstruction assumption from [Ishai-Kushilevitz-Ostrovsky-Sahai06]. More efficient direct MR-PHE constructions for univariate and multivariate polynomials. Applications: database search (e.g., keyword search, OR queries).

  23. Thanks!
