Secure Computation with Sublinear Cost
Mike Rosulek
Collaborators: Arash Afshar, Zhangxiang Hu, Payman Mohassel

  1. Title slide: Secure Computation with Sublinear Cost. Mike Rosulek, with collaborators Arash Afshar, Zhangxiang Hu, and Payman Mohassel.

  2. Secure 2-party computation: one party holds x, the other holds y, and both learn the output f(x, y). Examples:
     ▶ Run proprietary classifier x on private data y
     ▶ Evaluate statistics on combined medical records x & y
     ▶ · · ·

  3. Fundamental limits: if f(x, y) doesn't depend on some bits of y, the protocol might never touch those bits, but then the pattern of untouched bits itself reveals information about x. Example:
     ▶ y = genetic database
     ▶ x = DNA markers
     ▶ f(x, y) = diagnosis
     ⇒ In general, security demands that all of the data is touched.
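The leakage argument behind this slide can be made concrete with a toy example (my illustration, not part of the talk): for f(x, y) = y[x], a protocol that reads only the needed cell of y reveals x through its access pattern, while an oblivious version touches every cell.

```python
# Toy illustration (my example, not from the talk): for f(x, y) = y[x],
# reading only cell y[x] leaks x to whoever holds y and watches the
# accesses, so a secure protocol must touch every cell of y.

def naive_lookup(x, y, access_log):
    """Reads only the needed cell; the owner of y sees which one."""
    access_log.append(x)
    return y[x]

def oblivious_lookup(x, y, access_log):
    """Touches every cell of y; the access pattern is independent of x."""
    result = None
    for i in range(len(y)):
        access_log.append(i)
        if i == x:
            result = y[i]
    return result
```

The oblivious version pays a cost linear in |y|, which is exactly the lower bound the slide states.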

  4. Limits of standard techniques: "to securely evaluate f, first express f as a boolean circuit, then ..."

  5. What we're up against:
     1: Security requires protocol cost at least linear in the size of the inputs (in general!).
     2: General-purpose 2PC scales with the size of the circuit representation, which is always at least linear in input size.

  6. In this talk:
     1: Instead of circuits, use a representation that can actually be sublinear in size.
     2: The protocol must "touch every bit", but amortize this cost across many executions.

  7. RAM programs: a CPU with small internal state interacts with a large memory M by issuing instructions:
     ▶ (read, ℓ1) → M[ℓ1]
     ▶ (read, ℓ2) → M[ℓ2]
     ▶ (write, ℓ3, x) → M[ℓ3] ← x; ok
     A RAM program need not touch every bit of memory.
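The CPU/memory interaction on this slide can be sketched as a small interpreter loop. This is my own sketch, assuming a hypothetical next-instruction function `step` that maps (state, last read value) to (new state, memory operation); it is not a construction from the talk.

```python
# Minimal sketch of a RAM-machine execution loop. The next-instruction
# function `step` is a hypothetical interface: it takes (state, value
# returned by the previous memory op) and returns (new_state, op).

def run_ram(step, memory, state, max_steps=10_000):
    """Run a RAM program, tracking which memory cells it touches."""
    touched = set()
    value = None                     # result of the previous memory op
    for _ in range(max_steps):
        state, op = step(state, value)
        if op[0] == "read":          # ("read", location)
            _, loc = op
            touched.add(loc)
            value = memory[loc]
        elif op[0] == "write":       # ("write", location, x)
            _, loc, x = op
            touched.add(loc)
            memory[loc] = x
            value = "ok"
        elif op[0] == "halt":        # ("halt", output)
            return op[1], touched
    raise RuntimeError("step limit exceeded")
```

A program that, say, binary-searches a sorted memory of n cells touches only O(log n) of them, which is the sublinearity the slide is pointing at.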

  8. Idea: securely evaluate a RAM program. Basic outline:
     ▶ Imagine both parties' inputs stored in a large memory
     ▶ Imagine they could evaluate the CPU-next-instruction function: (CPU state) → (new state, read ℓ), then (new state, M[ℓ]) → (new state, ...)
     ▶ Use a (traditional) 2PC protocol to realize CPU-next-instruction
     Cost = (size of next-instruction function) × (number of instructions)
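A back-of-envelope comparison shows why this cost formula is attractive. The numbers below are purely illustrative assumptions of mine, not figures from the talk:

```python
# Illustrative cost comparison (made-up sizes, not from the talk).
mem_bits = 2**30        # memory holding both parties' inputs (~1 Gbit)
cpu_circuit = 10_000    # gates in the CPU next-instruction circuit
steps = 1_000           # RAM running time, e.g. ~log(n) for a lookup

# RAM-based 2PC: (size of next-instruction function) x (number of steps)
ram_2pc_cost = cpu_circuit * steps

# Circuit-based 2PC: the circuit is at least linear in the input size
circuit_2pc_cost = mem_bits
```

For a fast RAM program the product `cpu_circuit * steps` can be far below `mem_bits`, i.e. sublinear in the data; for a slow one it can of course exceed it.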

  9. What can go wrong?
     ▶ Internal state is public ⇒ secret-share the state! Each party holds a share of the CPU state and of the new state.
     ▶ Calvin sees all of the memory ⇒ encrypt the memory, and augment CPU-next-instruction with encryption/decryption.
     ▶ Memory access pattern (read ℓ, write ℓ, ...) is public! Calvin must learn the locations so he knows what to do! ???
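The first fix on this slide, secret-sharing the CPU state, is classically done with XOR shares. A minimal sketch of that standard technique (my illustration, not the talk's specific construction):

```python
# XOR secret sharing of the CPU state: each share alone is a uniformly
# random string, so neither party learns anything from its own share.
import secrets

def share(state: bytes):
    """Split `state` into two XOR shares."""
    r = secrets.token_bytes(len(state))
    return r, bytes(a ^ b for a, b in zip(state, r))

def reconstruct(s1: bytes, s2: bytes) -> bytes:
    """XOR the two shares back together to recover the state."""
    return bytes(a ^ b for a, b in zip(s1, s2))
```

Inside the protocol the shares are never reconstructed in the clear; the 2PC evaluation of CPU-next-instruction consumes one share from each party and outputs fresh shares of the new state.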
