


  1. Proof-Carrying Data: secure computation on untrusted execution platforms. Eran Tromer, joint work with Alessandro Chiesa, Eli Ben-Sasson, and Daniel Genkin. Technion Cryptoday, June 16, 2011.

  2. Motivation

  3. Motivation: INTEGRITY and CONFIDENTIALITY
     • SOFTWARE: bugs, trojans
       – Defenses: software engineering (review, tests), formal verification, static analysis, language type safety, dynamic analysis, reference monitors

  4. Motivation: INTEGRITY and CONFIDENTIALITY
     • SOFTWARE: bugs, trojans (defenses: software engineering, formal verification, static analysis, language type safety, dynamic analysis, reference monitors)
     • NETWORK: lack of trust

  5. Motivation: INTEGRITY and CONFIDENTIALITY
     • SOFTWARE: bugs, trojans
     • NETWORK: lack of trust
     • ENVIRONMENT: physical tampering (integrity); side-channels: EM, power, acoustic (confidentiality)

  6. Motivation (continued)
     • SOFTWARE: bugs, trojans
     • NETWORK: lack of trust
     • ENVIRONMENT: physical tampering; side-channels

  7. Motivation (continued)
     • SOFTWARE: bugs, trojans
     • NETWORK: lack of trust
     • ENVIRONMENT: physical tampering; side-channels
     • PLATFORM: cosmic rays, hardware bugs, hardware trojans, IT supply chain

  8. Information technology supply chain: headlines
     • (May 9, 2008) "F.B.I. Says the Military Had Bogus Computer Gear"
     • (October 6, 2008) "Chinese counterfeit chips causing military hardware crashes"
     • (May 6, 2010) "A Saudi man was sentenced [...] to four years in prison for selling counterfeit computer parts to the Marine Corps for use in Iraq and Afghanistan."
     (Also on the slide: DARPA Trust in ICs; Argonne APS.) Assurance? Validation? Certification?

  9. Motivation (continued)
     • SOFTWARE: bugs, trojans
     • NETWORK: lack of trust
     • ENVIRONMENT: physical tampering; side-channels
     • PLATFORM: cosmic rays, hardware bugs, hardware trojans, IT supply chain

  10. Motivation (continued)
      • SOFTWARE: bugs, trojans
      • NETWORK: lack of trust
      • ENVIRONMENT: physical tampering; side-channels
      • PLATFORM:
        – Integrity: cosmic rays, hardware bugs, hardware trojans, IT supply chain
        – Confidentiality: fault analysis, architectural side-channels (e.g., cache attacks)

  11. Information Leakage in Third-Party Compute Clouds [Ristenpart Tromer Shacham Savage '09]. Demonstrated, using Amazon EC2 as a case study:
      • Cloud cartography: mapping the structure of the "cloud" and locating a target on the map
      • Placement vulnerabilities: an attacker can place his VM on the same physical machine as a target VM (40% success for a few dollars)
      • Cross-VM exfiltration: once VMs are co-resident, information can be exfiltrated across the VM boundary:
        – Covert channels
        – Load traffic analysis
        – Keystrokes

  12. Motivation, with the columns relabeled CORRECTNESS and SECRECY
      • SOFTWARE: bugs, trojans
      • NETWORK: lack of trust
      • ENVIRONMENT: physical tampering; side-channels
      • PLATFORM:
        – Correctness: cosmic rays, hardware bugs, hardware trojans, IT supply chain
        – Secrecy: fault analysis, architectural side-channels (e.g., cache attacks)

  13. High-level goal

  14. Proof-Carrying Data overview

  15. Proof-Carrying Data: an example

  16. Toy example (3-party correctness). Alice holds x and F and computes y ← F(x); Bob holds G, receives y, and computes z ← G(y); Carol receives z. Is "z = G(F(x))" true?
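The slide's toy pipeline can be written out directly. F and G below are hypothetical placeholders (the slide leaves them abstract), chosen only to make the example runnable:

```python
# Toy 3-party pipeline from the slide. F and G are hypothetical
# stand-ins; the deck never fixes concrete functions.
def F(x):
    return x + 1   # Alice's function (placeholder)

def G(y):
    return y * 2   # Bob's function (placeholder)

x = 20        # Alice's input
y = F(x)      # Alice sends y to Bob
z = G(y)      # Bob sends z to Carol
# Carol's question: is "z = G(F(x))" actually true?
print(z == G(F(x)))  # True
```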

  17. Toy example: trivial solution. Carol recomputes z' ← G(F(x)) and checks z' = z. Carol can recompute everything, but:
      • Uselessly expensive
      • Requires Carol to fully know x, F, G
        – We will want to represent these via short hashes/signatures

  18. Toy example: secure multiparty computation [GMW87][BGW88][CCD88]. But:
      • the computational blowup is polynomial in the whole computation, not in the local computation
      • the computation (F and G) must be chosen in advance
      • it does not preserve the communication graph: parties must be fixed in advance, otherwise...

  19. Toy example: secure multiparty computation [GMW87][BGW88][CCD88], continued. With many potential receivers (Carol #1, Carol #2, Carol #3, ...), every party must pre-emptively talk to everyone on the Internet!

  20. Toy example: computationally-sound (CS) proofs [Micali 94]. Bob computes z = G(F(x)) and π ← prove("z=G(F(x))"), then sends z along with π; Carol verifies. Bob can generate a proof string that is:
      • Tiny (polylogarithmic in his own computation)
      • Efficiently verifiable by Carol
      However, now Bob recomputes everything...
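The message flow of the CS-proof variant can be sketched as follows. Note that `prove` and `verify` here are non-cryptographic stand-ins: the "proof" just carries (x, z) and the verifier re-executes the pipeline, whereas a real CS proof is polylogarithmic in size and far cheaper to verify than the original computation. F and G are hypothetical placeholders.

```python
# Interface sketch only: real CS proofs are succinct cryptographic
# objects; this toy verifier recomputes, which a real one would not.
def F(x): return x + 1   # placeholder
def G(y): return y * 2   # placeholder

def prove(x, z):
    # Bob: claim "z = G(F(x))" (he must recompute the whole pipeline)
    return {"claim": "z=G(F(x))", "x": x, "z": z}

def verify(pi):
    # Carol's check (stand-in: re-executes F and G)
    return pi["z"] == G(F(pi["x"]))

x = 20
z = G(F(x))
pi = prove(x, z)
print(verify(pi))  # True
```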

  21. Toy example: Proof-Carrying Data [Chiesa Tromer 09], following Incrementally-Verifiable Computation [Valiant 08]. Alice sends y with a proof π_y that "y = F(x)"; Bob sends z with a proof π_z that "z = G(y), and I got a valid proof that y = F(x)"; Carol verifies π_z. Each party prepares a proof string for the next one. Each proof is:
      • Tiny (polylogarithmic in the party's own computation)
      • Efficiently verifiable by the next party
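The key difference from the CS-proof slide is that each party proves only its local step plus "the incoming proof was valid". A minimal sketch of that flow, with hypothetical placeholder proof objects (real PCD proofs are succinct arguments, not re-executable transcripts):

```python
# Sketch of the PCD message flow. The "proof" dictionaries and the
# re-executing verifier are stand-ins for real succinct proofs.
def F(x): return x + 1   # placeholder
def G(y): return y * 2   # placeholder

FNS = {"F": F, "G": G}

def verify(pi):
    # Stand-in verifier: re-runs only the ONE local step in the proof.
    if pi is None:
        return True
    return pi["prev_ok"] and FNS[pi["fn"]](pi["in"]) == pi["out"]

def step(fn_name, msg, prev_pi):
    # A party checks the incoming proof, computes its local step,
    # and emits a proof covering both facts.
    out = FNS[fn_name](msg)
    pi = {"fn": fn_name, "in": msg, "out": out, "prev_ok": verify(prev_pi)}
    return out, pi

x = 20
y, pi_y = step("F", x, None)   # Alice: y = F(x), proof for "y = F(x)"
z, pi_z = step("G", y, pi_y)   # Bob: z = G(y), and pi_y was valid
print(verify(pi_z))            # Carol's cheap check: True
```

No single party ever redoes the whole computation; each verifies one proof and proves one step.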

  22. Generalizing: the Proof-Carrying Data framework

  23. Generalizing: distributed computations. Distributed computation: parties exchange messages and perform computation.

  24. Generalizing: arbitrary interactions
      • Arbitrary interactions – the communication graph over time is any directed acyclic graph

  25. Generalizing: arbitrary interactions
      • Computation and graph are determined on the fly – by each party's local inputs: human inputs, randomness, program

  26. Generalizing: arbitrary interactions
      • Computation and graph are determined on the fly – by each party's local inputs: human inputs, randomness, program
      How to define correctness of dynamic distributed computation?

  27. C-compliance. The system designer specifies his notion of correctness via a compliance predicate C(in, code, out) that must be locally fulfilled at every node. Here code comprises the program, human inputs, and randomness, and C outputs accept/reject. A distributed computation is C-compliant when every node's (in, code, out) satisfies C.

  28. Examples of C-compliance. Correctness is a compliance predicate C(in, code, out) that must be locally fulfilled at every node. Some examples:
      • C = "the output is the result of correctly computing a prescribed program"
      • C = "the output is the result of correctly executing some program signed by the sysadmin"
      • C = "the output is the result of correctly executing some type-safe program" or "... program with a valid formal proof"
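The first example predicate can be modeled concretely. In this sketch the "prescribed program" is a plain Python function and all names are illustrative; a real compliance predicate would operate on message and program encodings:

```python
# Toy compliance predicate: C(in, code, out) accepts iff `out` is the
# result of correctly running the prescribed program on `in`.
PRESCRIBED = lambda v: v + 1   # hypothetical prescribed program

def C(msg_in, code, msg_out):
    # accept only the prescribed code, executed correctly
    return code is PRESCRIBED and code(msg_in) == msg_out

print(C(4, PRESCRIBED, 5))          # True: compliant node
print(C(4, PRESCRIBED, 7))          # False: wrong output
print(C(4, (lambda v: v + 2), 6))   # False: unapproved code
```

The other examples swap the identity check for "code carries a sysadmin signature" or "code type-checks / carries a formal proof".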

  29. Dynamically augment computation with proof strings. In PCD, messages sent between parties are augmented with concise proof strings attesting to their "compliance". The distributed computation evolves as before, except that each party also generates on the fly a proof string to attach to each output message.

  30. Extra setup ("model"). Every node has access to a simple, fixed, stateless trusted functionality: the Signed-Input-and-Randomness (SIR) oracle.

  31. Extra setup ("model"). Every node has access to a simple, fixed, stateless trusted functionality: essentially, a signature card. The Signed-Input-and-Randomness (SIR) oracle takes an input x and a length s, samples a random string r ← {0,1}^s, and returns r together with σ ← SIGN_SK(x, r), a signature on (x, r) under the card's signing key SK; the corresponding verification key VK is public.
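The SIR oracle's interface can be sketched as below. HMAC stands in for the signature scheme purely to keep the sketch self-contained: a real SIR card would use a public-key scheme so that anyone holding VK can verify, whereas the HMAC "verifier" here must share SK. The key and lengths are illustrative.

```python
# Sketch of the Signed-Input-and-Randomness (SIR) oracle.
# HMAC-SHA256 is a stand-in for SIGN_SK; not a public-key signature.
import hmac, hashlib, secrets

SK = b"demo-signing-key"  # hypothetical device key

def SIR(x: bytes, s: int):
    r = secrets.token_bytes(s // 8)                    # fresh s-bit randomness
    sigma = hmac.new(SK, x + r, hashlib.sha256).digest()  # "signature" on (x, r)
    return r, sigma

def check(x: bytes, r: bytes, sigma: bytes):
    # With HMAC the checker needs SK; a real scheme verifies under VK.
    expected = hmac.new(SK, x + r, hashlib.sha256).digest()
    return hmac.compare_digest(sigma, expected)

r, sigma = SIR(b"input message", 128)
print(check(b"input message", r, sigma))  # True
```

Because the oracle is stateless and fixed, it is a minimal trusted anchor: nodes cannot forge "this input and randomness were seen by the card".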

  32. (Some) envisioned applications

  33. Application: correctness and integrity of the IT supply chain
      • Consider a system as a collection of components with specified functionalities:
        – Chips on a motherboard
        – Servers in a datacenter
        – Software modules
      • C(in, code, out) checks whether the component's specification holds
      • Proofs are attached across component boundaries
      • If a proof fails, the computation is locally aborted → integrity, attribution

  34. Application: fault- and leakage-resilient Information Flow Control

  35. Application: fault- and leakage-resilient Information Flow Control
      • The computation gets "secret" / "non-secret" inputs; "non-secret" inputs are signed as such
      • Any output labeled "non-secret" must be independent of the secrets
      • The system perimeter is controlled and all output can be checked (but internal computation can be leaky/faulty)
      • C allows only:
        – Non-secret inputs: initial inputs must be signed as "non-secret"
        – IFC-compliant computation: subsequent computation respects Information Flow Control rules and follows a fixed schedule
      • A censor at the system's perimeter inspects all outputs:
        – Verifies the proof on every outgoing message
        – Releases only non-secret data
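The censor's release rule above can be sketched in a few lines. `verify_proof` is a hypothetical placeholder for real PCD proof verification, and the message tuples are an assumed wire format for illustration:

```python
# Sketch of the perimeter censor: release a message only if its proof
# verifies AND it is labeled non-secret. verify_proof is a stand-in.
def verify_proof(proof):
    return proof == "valid"   # placeholder for PCD verification

def censor(outgoing):
    released = []
    for msg, label, proof in outgoing:
        if verify_proof(proof) and label == "non-secret":
            released.append(msg)
    return released

queue = [("report",       "non-secret", "valid"),
         ("key material", "secret",     "valid"),
         ("tampered",     "non-secret", "bogus")]
print(censor(queue))  # ['report']
```

Only the perimeter check needs to be trusted; leaky or faulty internal nodes cannot get a secret past a failing proof or label.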
