CS573 Data Privacy and Security Cryptographic Primitives and Secure Multiparty Computation Li Xiong
Outline • Cryptographic primitives • Symmetric Encryption • Public Key Encryption • Secure Multiparty Computation • Problem and security definitions • General constructions • Specialized protocols
Basic notation • Plaintext (m): – the original message • Ciphertext (c): – the coded message • Secret key (k): – info used in the cipher, known only to sender/receiver • Encryption function E_k(m): – performs substitutions/transformations on the plaintext • Decryption function D_k(c): – inverse of the encryption algorithm • Efficiency: – functions E_k and D_k should have efficient algorithms • Consistency: – decrypting the ciphertext yields the plaintext: D_k(E_k(m)) = m
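To make the notation concrete, here is a minimal Python sketch of a (completely insecure) repeating-key XOR cipher. It only illustrates the consistency property D_k(E_k(m)) = m; the cipher itself is a toy chosen for brevity, not anything used in practice.

```python
# Toy cipher to illustrate E_k / D_k and the consistency property.
# NOT a secure encryption scheme.
from itertools import cycle

def E(k: bytes, m: bytes) -> bytes:
    """Encrypt: XOR the plaintext with the key repeated to the plaintext's length."""
    return bytes(mb ^ kb for mb, kb in zip(m, cycle(k)))

def D(k: bytes, c: bytes) -> bytes:
    """Decrypt: XOR is its own inverse, so the same operation undoes E."""
    return E(k, c)

k = b"secret-key"
m = b"attack at dawn"
c = E(k, m)              # ciphertext
assert D(k, c) == m      # consistency: D_k(E_k(m)) = m
```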
Operational model of encryption [Diagram: plaintext m enters E with encryption key k, producing ciphertext E_k(m); an attacker observes the channel; D with decryption key k' recovers D_k'(E_k(m)) = m] • Kerckhoffs' assumption: – attacker knows E and D – attacker doesn't know the (decryption) key • attacker's goal: – to systematically recover plaintext from ciphertext – to deduce the (decryption) key • attack models: – ciphertext-only (COA) – known-plaintext (KPA) – (adaptive) chosen-plaintext (CPA) – (adaptive) chosen-ciphertext (CCA)
Symmetric Encryption • or conventional / secret-key / single-key • sender and recipient share a common key • Scenario: – Alice wants to send a message (plaintext P) to Bob – The communication channel is insecure and can be eavesdropped on – If Alice and Bob have previously agreed on a symmetric encryption scheme and a secret key K, the message can be sent encrypted (ciphertext C)
Symmetric Key Cryptography [Diagram: plaintext message m → encryption algorithm with key K_A-B → ciphertext c = K_A-B(m) → decryption algorithm with key K_A-B → plaintext m = K_A-B(K_A-B(m))] • symmetric key crypto: Bob and Alice share the same (symmetric) key K_A-B
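As one concrete illustration of a shared-key scheme (not necessarily the one used in the course), the sketch below uses Fernet, an AES-based construction from the third-party `cryptography` package; the choice of that package and scheme is an assumption made here for the example.

```python
# Alice and Bob share the symmetric key K_A-B out of band,
# then use it for both encryption and decryption.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the shared symmetric key K_A-B
alice = Fernet(key)
bob = Fernet(key)

c = alice.encrypt(b"meet me at noon")   # ciphertext sent over the insecure channel
m = bob.decrypt(c)                      # Bob recovers the plaintext with the same key
assert m == b"meet me at noon"
```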
Outline • Cryptographic primitives • Symmetric Encryption • Public Key Encryption • Secure Multiparty Computations • Problem and security definitions • General constructions
Private-Key Cryptography • traditional private/secret/single-key cryptography uses one key • sender and receiver must share the same key • needs a secure channel for key distribution • impossible for two parties with no prior relationship • if this key is disclosed, communications are compromised • it is also symmetric: the parties are equal • hence it does not protect the sender from the receiver forging a message and claiming it was sent by the sender
Public-Key Cryptography • uses two keys – a public & a private key • asymmetric since parties are not equal • complements rather than replaces private key crypto – neither more secure than private key (security depends on the key size for both) – nor do they replace private key schemes (they are too slow to do so)
Public-Key Cryptography • public-key/two-key/asymmetric cryptography involves the use of two keys: – a public key, which may be known by anybody, and can be used to encrypt messages and verify signatures – a private key, known only to the recipient, used to decrypt messages and sign (create) signatures • Encryption: c = E_pk(m) • Decryption: m = D_sk(c) • is asymmetric because – those who encrypt messages or verify signatures cannot decrypt messages or create signatures
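A hedged sketch of c = E_pk(m) and m = D_sk(c): the example below uses RSA with OAEP padding from the third-party `cryptography` package, which is just one possible realization chosen for illustration.

```python
# Anyone holding the public key pk can encrypt;
# only the holder of the private key sk can decrypt.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

sk = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # private key
pk = sk.public_key()                                                 # public key

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

c = pk.encrypt(b"hello", oaep)   # c = E_pk(m)
m = sk.decrypt(c, oaep)          # m = D_sk(c)
assert m == b"hello"
```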
Public-Key Cryptography
Public-Key Characteristics • Public-key algorithms rely on two keys with the following characteristics: – it is computationally infeasible to find the decryption key knowing only the algorithm and the encryption key – it is computationally easy to encrypt/decrypt messages when the relevant (en/decryption) key is known – either of the two related keys can be used for encryption, with the other used for decryption (in some schemes) – many can encrypt, only one can decrypt
RSA (Rivest, Shamir, Adleman, 1978)
• Basis: intractability of integer factoring
• Setup: select large primes p, q; n = pq; φ(n) = (p-1)(q-1); select e relatively prime to φ(n); compute d such that ed mod φ(n) = 1
• Keys: public key K_E = (n, e); private key K_D = d
• Encryption: for plaintext m, c = m^e mod n
• Decryption: m = c^d mod n
• Example: p = 7, q = 17; n = 7*17 = 119; φ(n) = 6*16 = 96; e = 5; d = 77; public key K_E = (119, 5); private key K_D = 77; plaintext m = 19; c = 19^5 mod 119 = 66; decryption: m = 66^77 mod 119 = 19
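The worked example above can be checked directly with a few lines of textbook RSA in Python. This is a toy run of the slide's parameters only; real RSA needs large primes and padding and should never be implemented this way in practice.

```python
# Textbook RSA with the slide's toy parameters (illustration only).
p, q = 7, 17
n = p * q                 # n = 119
phi = (p - 1) * (q - 1)   # phi(n) = 96
e = 5                     # public exponent, relatively prime to phi(n)
d = pow(e, -1, phi)       # modular inverse (Python 3.8+): d = 77, since e*d mod phi(n) = 1

m = 19
c = pow(m, e, n)          # encryption: c = m^e mod n = 66
assert c == 66
assert pow(c, d, n) == m  # decryption: m = c^d mod n = 19
```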
Outline • Cryptographic primitives • Symmetric Encryption • Public Key Encryption • Secure Multiparty Computations • Problem and security definitions • General constructions
Motivation • General framework for describing computation between parties who do not trust each other • Example: elections – N parties, each one has a “Yes” or “No” vote – Goal: determine whether the majority voted “Yes”, but no voter should learn how other people voted • Example: auctions – Each bidder makes an offer • Offer should be committing! (can’t change it later) – Goal: determine whose offer won without revealing losing offers
More Examples • Example: distributed computation/data mining/machine learning – Two companies want to perform computation/learning over their datasets without revealing them • Compute the intersection of two lists of names • Distributed learning • Example: private queries on secure database – Evaluate a query on the database without revealing the query to the database owner and without revealing data to the querier – Many variations
Secure Multiparty Computation • A set of parties with private inputs wish to compute some joint function of their inputs. • Parties wish to preserve some security properties, e.g., privacy and correctness. • Security must be preserved in the face of adversarial behavior by some of the participants, or by an external party.
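For intuition, the sketch below shows one very simple special case of such a joint computation: summing the private votes from the election example via additive secret sharing, in the semi-honest model. The choice of modulus and the fact that all parties are simulated in one process are assumptions made purely for illustration.

```python
# N parties add their private "Yes"=1 / "No"=0 votes via additive secret
# sharing mod a public prime; no single share reveals any individual vote.
import secrets

P = 2**61 - 1  # public modulus, assumed large enough to hold the sum

def share(vote: int, n_parties: int) -> list[int]:
    """Split one vote into n_parties additive shares that sum to vote mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((vote - sum(shares)) % P)
    return shares

votes = [1, 0, 1, 1]                                  # private inputs
all_shares = [share(v, len(votes)) for v in votes]    # vote i is split across all parties

# Party j locally adds the j-th share of every vote; combining the partial
# sums reveals only the total (here 3), not how anyone voted.
partial = [sum(col) % P for col in zip(*all_shares)]
total = sum(partial) % P
assert total == sum(votes)
```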
Yao’s Millionaire Problem • Two millionaires, Alice and Bob, want to know which of them is richer without revealing their actual wealth. • This is an instance of a more general problem: given two private numbers a and b, determine which is larger without revealing the actual values of a and b.
How to Define Security? • Must be mathematically rigorous • Must capture all realistic attacks that a malicious participant may try to stage • Should be “abstract” – Based on the desired “functionality” of the protocol, not a specific protocol – Goal: define security for an entire class of protocols
Functionality • K mutually distrustful parties want to jointly carry out some task • Model this task as a function f: ({0,1}*)^K → ({0,1}*)^K, i.e., K inputs (one per party) and K outputs; each input is a bitstring • Assume that this functionality is computable in probabilistic polynomial time
Defining Security • The real/ideal model paradigm for defining security [GMW,GL,Be,MR,Ca] : – Ideal model: parties send inputs to a trusted party, who computes the function for them – Real model: parties run a real protocol with no trusted help • A protocol is secure if any attack on a real protocol can be carried out in the ideal model
Ideal Model • Intuitively, we want the protocol to behave “as if” a trusted third party collected the parties’ inputs and computed the desired functionality – Computation in the ideal model is secure by definition! [Figure: party A holds x_1 and party B holds x_2; the trusted party returns f_1(x_1, x_2) to A and f_2(x_1, x_2) to B]
More Formally • A protocol is secure if it emulates an ideal setting where the parties hand their inputs to a “trusted party,” who locally computes the desired outputs and hands them back to the parties [Goldreich-Micali-Wigderson 1987] [Figure: A hands x_1 and B hands x_2 to the trusted party and they receive f_1(x_1, x_2) and f_2(x_1, x_2), respectively]
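The toy code below only makes the ideal model concrete, using the millionaires' functionality as f: a hypothetical `trusted_party` function plays the trusted third party and hands both parties f_1 = f_2 = (is A at least as rich as B?). It is a mental model of the ideal world, not a protocol; the real protocol must achieve the same outputs without any such third party.

```python
# Ideal-model "trusted party" for the millionaires' functionality.
def trusted_party(x1: int, x2: int) -> tuple[bool, bool]:
    result = x1 >= x2          # the desired functionality f
    return result, result      # f_1(x1, x2) handed to A, f_2(x1, x2) handed to B

f1, f2 = trusted_party(10_000_000, 7_500_000)
assert f1 and f2               # A and B learn only who is richer, nothing more
```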
Real world • No trusted third party • Participants run some protocol amongst themselves without any help • Despite that, a secure protocol should emulate the ideal setting • The real protocol run by the participants is secure if – no adversary can do more harm in the real execution than in an execution that takes place in the ideal world
Adversary Models • Some of the protocol participants may be corrupt – If all were honest, we would not need secure multiparty computation • Semi-honest (aka passive; honest-but-curious) – Follows the protocol, but tries to learn more from received messages than he would learn in the ideal model • Malicious – Deviates from the protocol in arbitrary ways, lies about his inputs, may quit at any point • For now, we will focus on semi-honest adversaries and two-party protocols
Properties of the Definition • How do we argue that the real protocol “emulates” the ideal protocol? • Correctness – All honest participants should receive the correct result of evaluating the function f • Because a trusted third party would compute f correctly • Privacy – All corrupt participants should learn no more from the protocol than what they would learn in the ideal model – What does a corrupt participant learn in the ideal model? • His input (obviously) and the result of evaluating f