Secure Computation with Sublinear Cost

Mike Rosulek

Collaborators: Arash Afshar / Zhangxiang Hu / Payman Mohassel
Secure 2-party computation

[Figure: two parties hold private inputs x and y; each learns f(x, y)]

Examples:
▶ Run proprietary classifier x on private data y
▶ Evaluate statistics on combined medical records x & y
▶ · · ·
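To pin down the goal, here is a minimal Python sketch of the ideal behaviour a secure 2PC protocol must emulate: a trusted party takes both inputs and hands back only f(x, y). The example function and inputs are illustrative assumptions, not from the talk.

    def ideal_2pc(f, x, y):
        """Trusted-party model: both sides learn f(x, y) and nothing else."""
        out = f(x, y)
        return out, out                      # (output for x's owner, output for y's owner)

    # Illustrative f: how many medical records the two parties have in common.
    f = lambda x, y: len(set(x) & set(y))
    print(ideal_2pc(f, {"id1", "id7"}, {"id7", "id9"}))   # -> (1, 1)

A secure protocol must leak no more to either party than this trusted party would.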
Fundamental Limits

[Figure: f(x, y) doesn’t depend on some bits of y, so a naive protocol would never touch them]

Example:
▶ y = genetic database
▶ x = DNA markers
▶ f(x, y) = diagnosis

If the protocol skipped the bits of y that this particular query doesn’t need, the skipped positions themselves would leak information about the inputs.

⇒ In general, security demands that all of the data is touched.
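A tiny illustration of this point, using a hypothetical naive lookup protocol: if only the entry selected by x is ever read, the database holder learns x from the access pattern alone. The table contents are made up.

    # Hypothetical naive protocol: only the entry selected by x is read,
    # so the database holder sees exactly which entry was fetched.
    database = {"marker_A": "diagnosis 1", "marker_B": "diagnosis 2", "marker_C": "diagnosis 3"}
    observed_reads = []                      # what the database holder can see

    def naive_lookup(x):
        observed_reads.append(x)
        return database[x]

    naive_lookup("marker_B")
    print(observed_reads)                    # ['marker_B'] -- the query x has leaked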
Limits of Standard Techniques

“to securely evaluate f, first express f as a boolean circuit, then ...”
What We’re Up Against

1: Security requires protocol cost at least linear in size of inputs (in general!)

2: General-purpose 2PC scales with size of circuit representation, which is always at least linear in input size.
In this talk:

1: Instead of circuits, use a representation that can actually be sublinear in size.

2: Protocol must “touch every bit”, but amortize this cost across many executions.
RAM programs

[Figure: a CPU with small internal state exchanges instructions with a large memory M]

    cpu → memory: read, ℓ1          memory → cpu: M[ℓ1]
    cpu → memory: read, ℓ2          memory → cpu: M[ℓ2]
    cpu → memory: write, ℓ3, x      memory sets M[ℓ3] ← x and replies ok

A RAM program need not touch every bit of memory.
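A minimal Python sketch of the RAM abstraction above: a CPU with small internal state issues one read/write instruction at a time against a large memory. The binary-search “program” and the instruction encoding are illustrative choices, not from the talk; the point is that only about log2(n) of the n cells get touched.

    def binary_search_cpu(state, response):
        """One CPU step: (small state, last memory response) -> (new state, next instruction)."""
        lo, hi, target, pending = state
        if response is not None:             # fold in the value just read from M[pending]
            if response == target:
                return ("found", pending), ("halt",)
            if response < target:
                lo = pending + 1
            else:
                hi = pending - 1
        if lo > hi:
            return ("not found", None), ("halt",)
        mid = (lo + hi) // 2
        return (lo, hi, target, mid), ("read", mid)

    def run_ram(cpu_step, state, memory):
        """Drive the CPU against memory until it halts; record which cells were touched."""
        response, touched = None, set()
        while True:
            state, instr = cpu_step(state, response)
            if instr[0] == "halt":
                return state, touched
            if instr[0] == "read":
                touched.add(instr[1])
                response = memory[instr[1]]
            elif instr[0] == "write":
                _, loc, val = instr
                touched.add(loc)
                memory[loc] = val
                response = "ok"

    M = list(range(0, 2000, 2))                                   # sorted "memory" of 1000 cells
    result, touched = run_ram(binary_search_cpu, (0, len(M) - 1, 1234, None), M)
    print(result, len(touched), "of", len(M), "cells touched")    # ('found', 617) 10 of 1000 ...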
Idea: securely evaluate RAM

[Figure: each step, the CPU-next-instruction function maps the CPU state to (new state, read ℓ); the memory returns M[ℓ] for the next step]

Basic outline:
▶ Imagine both parties’ inputs stored in large memory
▶ Imagine they could evaluate CPU-next-instruction function
▶ Use (traditional) 2PC protocol to realize CPU-next-instruction

Cost = (size of next-instruction function) × (number of instructions)
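A sketch of the outline above. Here secure_eval is a placeholder standing in for a generic circuit-based 2PC (e.g., garbled circuits), and the circuit-size figure is a made-up assumption; the only point is the cost accounting from the slide: one small 2PC per RAM step, so cost = (size of next-instruction function) × (number of instructions).

    def secure_eval(circuit, *inputs):
        """Placeholder for a generic 2PC protocol; in a real protocol the
        inputs would be secret-shared/garbled between the two parties."""
        return circuit(*inputs)

    def secure_ram_eval(cpu_step_circuit, step_circuit_size, init_state, memory, max_steps):
        state, response, cost = init_state, None, 0
        for _ in range(max_steps):
            # One small 2PC per RAM step realizes the CPU-next-instruction function.
            state, instr = secure_eval(cpu_step_circuit, state, response)
            cost += step_circuit_size        # cost = (step circuit size) x (number of steps)
            if instr[0] == "halt":
                break
            if instr[0] == "read":
                response = memory[instr[1]]
            elif instr[0] == "write":
                memory[instr[1]] = instr[2]
                response = "ok"
        return state, cost

    # Reusing binary_search_cpu and M from the previous sketch:
    result, cost = secure_ram_eval(binary_search_cpu, 10_000, (0, len(M) - 1, 1234, None), M, 50)
    print(result, cost)                      # cost grows with the number of steps, not with |M|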
What can go wrong?

[Figure: the CPU step is now a 2PC on secret-shared state; Calvin holds the encrypted memory and answers the read/write requests]

▶ Internal state is public
  ⇒ Secret-share the state!

▶ Calvin sees all of the memory
  ⇒ Encrypt the memory, augment CPU-next-instruction with encryption/decryption.

▶ Memory access pattern (read ℓ, write ℓ, . . . ) is public!
  ⇒ ??? Calvin must learn these so he knows what to do!
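A minimal Python sketch of the first two fixes, assuming plain XOR secret sharing and a toy location-keyed pad in place of real authenticated encryption; this is illustrative, not the talk’s concrete construction. The third problem remains: even with shared state and encrypted memory, the pattern of which locations are read and written is still visible.

    import hashlib, os

    def share(secret: bytes):
        """XOR-secret-share: each share on its own is uniformly random."""
        r = os.urandom(len(secret))
        return r, bytes(a ^ b for a, b in zip(r, secret))

    def reconstruct(s1: bytes, s2: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(s1, s2))

    def pad(key: bytes, loc: int, n: int) -> bytes:
        """Toy location-dependent pad -- a stand-in for real encryption."""
        return hashlib.sha256(key + loc.to_bytes(8, "big")).digest()[:n]

    def enc(key, loc, pt):  return bytes(a ^ b for a, b in zip(pt, pad(key, loc, len(pt))))
    def dec(key, loc, ct):  return enc(key, loc, ct)          # the XOR pad is its own inverse

    # Fix 1: the CPU state is secret-shared, so neither party sees it in the clear.
    state = b"lo=500,hi=999"
    share_a, share_b = share(state)
    assert reconstruct(share_a, share_b) == state

    # Fix 2: the memory holder stores only ciphertexts and serves them on request.
    key = os.urandom(16)
    memory = {i: enc(key, i, bytes([i]) * 4) for i in range(8)}
    assert dec(key, 5, memory[5]) == bytes([5]) * 4

    # Still leaked: *which* locations are read/written (the access pattern) --
    # exactly the remaining problem flagged above.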