Lattice Cryptography: Towards Fully Homomorphic Encryption (Lecture 20)
Recall Learning With Errors
LWE (decision version): (A, As + e) ≈ (A, r), where A ∈ Z_q^{m×n} is a random matrix, s is uniform, e has "small" entries from a Gaussian distribution, and r is uniform.
Average-case solution for LWE ⇒ worst-case solution for GapSVP (for an appropriate choice of parameters).
Learning With Errors
Writing M = [A | b] with b = As + e, the vector z = (-s, 1) satisfies Mz = b - As = e, which has small entries, while [A | b] ≈ [A | r] by the decision LWE assumption.
Learning With Errors
i.e., LWE gives a pseudorandom matrix M ∈ Z_q^{m×n'} and a non-zero z ∈ Z_q^{n'} such that the entries of Mz are all small (n' = n+1), while M ≈ a uniformly random matrix.
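To make this concrete, here is a minimal numerical sketch in Python/NumPy (the parameters are illustrative assumptions only, far too small for security): it builds M = [A | b] from an LWE sample and checks that z = (-s, 1) maps M to the small error vector e.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 2**13, 16, 64                             # toy parameters (illustrative, not secure)

A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)
e = rng.integers(-4, 5, size=m)                     # "small" error (stand-in for a discrete Gaussian)
b = (A @ s + e) % q

M = np.concatenate([A, b.reshape(m, 1)], axis=1)    # M in Z_q^{m x n'}, n' = n + 1
z = np.concatenate([-s % q, [1]])                   # z = (-s, 1), non-zero

Mz = (M @ z) % q
Mz = np.where(Mz > q // 2, Mz - q, Mz)              # lift to the centered range (-q/2, q/2]
assert np.array_equal(Mz, e)                        # M z = e: all entries are small
print("max |entry of M z| =", np.abs(Mz).max())
```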
Recall PKE from LWE
[Diagram: the ciphertext M^T a + m, with z^T = (-s^T, 1) annihilating M^T = [A^T; b^T] up to the small error e^T; the last coordinate of m carries v.]
Recall PKE from LWE
Ciphertext = M^T a + m, where the vector m encodes the message and a ∈ {0,1}^m.
Decrypting: z^T(M^T a + m) = e^T a + z^T m, where e^T a is small. To allow decoding from this for, say, μ ∈ {0,1}, let z^T m = v ≈ μ(q/2).
CPA security: M^T a is pseudorandom.
Claim: If M ∈ Z_q^{m×n'} is truly random, a ∈ {0,1}^m \ {0^m}, and m >> n' log q, then M^T a is very close to being uniform.
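A toy end-to-end sketch of this PKE (parameter choices and the helper names encrypt/decrypt are assumptions for illustration, not the lecture's notation): the public key is M = [A | b], a ciphertext is M^T a + m with m = (0, ..., 0, μ⌊q/2⌋), and decryption applies z = (-s, 1) and rounds.

```python
import numpy as np

rng = np.random.default_rng(1)
q, n, m = 2**13, 16, 512                            # toy parameters (not secure)

# Key generation: public key M = [A | b], secret key z = (-s, 1) with z^T M^T = e^T
A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)
e = rng.integers(-2, 3, size=m)
M = np.concatenate([A, ((A @ s + e) % q).reshape(m, 1)], axis=1)
z = np.concatenate([-s % q, [1]])

def encrypt(mu):
    a = rng.integers(0, 2, size=m)                  # a uniform in {0,1}^m
    msg = np.zeros(n + 1, dtype=np.int64)
    msg[-1] = mu * (q // 2)                         # so that z^T msg = mu * (q/2)
    return (M.T @ a + msg) % q                      # ciphertext = M^T a + m

def decrypt(c):
    v = int((z @ c) % q)                            # = e^T a + mu * (q/2)  (mod q)
    return 1 if q // 4 < v < 3 * q // 4 else 0      # decode by rounding to 0 or q/2

for mu in (0, 1, 1, 0):
    assert decrypt(encrypt(mu)) == mu
print("PKE decryption OK")
```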
Randomness Extraction
Entries in a are not uniformly random over Z_q^m, but concentrated on a small subset {0,1}^m. We need M^T a to be uniform over Z_q^{n'}.
This follows from two more generally useful facts:
(1) H_M(a) = M^T a is a 2-Universal Hash Function (for non-zero a).
(2) If H is a 2-UHF, then it is a good randomness extractor.
If m >> n' log q, the entropy of a (m bits) is significantly more than that of a uniform vector in Z_q^{n'}, and a good randomness extractor will produce an almost uniform output.
Universal Hashing
Combinatorial hash family H: adversary A outputs (x, y); then h ← H; h(x) = h(y) with negligible probability.
Example table of a small hash family:
  x | h_1(x) h_2(x) h_3(x) h_4(x)
  0 |   0      0      1      1
  1 |   0      1      0      1
  2 |   1      0      0      1
Even better: 2-Universal Hash Functions, which are "uniform" and "pairwise-independent":
∀ x, z: Pr_{h←H}[h(x) = z] = 1/|Z| (where h: X → Z)
∀ x ≠ y, w, z: Pr_{h←H}[h(x) = w, h(y) = z] = 1/|Z|^2
⇒ ∀ x ≠ y: Pr_{h←H}[h(x) = h(y)] = 1/|Z|
Negligible collision probability if the range is of super-polynomial size.
e.g. h_{a,b}(x) = ax + b (in a finite field, X = Z):
Pr_{a,b}[ax + b = z] = Pr_{a,b}[b = z - ax] = 1/|Z|
Pr_{a,b}[ax + b = w, ay + b = z] = ? Exactly one (a, b) satisfies the two equations (for x ≠ y), so Pr_{a,b}[ax + b = w, ay + b = z] = 1/|Z|^2.
Exercise: H_M(x) = Mx (M a random matrix) is a 2-UHF for non-zero boolean x.
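A brute-force check, over a tiny prime field chosen only for illustration, of the two probabilities computed above for the affine family h_{a,b}(x) = ax + b:

```python
from itertools import product

p = 7                                   # toy prime field Z_p (illustrative assumption)
x, y, w, z = 3, 5, 2, 6                 # any fixed x != y and target values w, z

hits_single = sum(1 for a, b in product(range(p), repeat=2)
                  if (a * x + b) % p == w)
hits_pair = sum(1 for a, b in product(range(p), repeat=2)
                if (a * x + b) % p == w and (a * y + b) % p == z)

print(hits_single / p**2)               # 1/p   : uniformity
print(hits_pair / p**2)                 # 1/p^2 : exactly one (a, b) satisfies both equations
```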
Randomness Extractor
[Figure: Ext maps a biased input, together with seed randomness, to an almost unbiased output.]
The input has high "min-entropy", i.e., the probability of any particular input string is very low. The seed is uniform and independent of the input. The output is shorter than the input.
Ext(inp, seed) ≈ Uniform (statistical closeness).
A strong extractor: (seed, Ext(inp, seed)) ≈ (seed, Uniform), i.e., for any input distribution, most choices of seed yield a good deterministic extractor.
Randomness Extractor
Leftover Hash Lemma: Any 2-UHF is a strong extractor that can extract almost all of the min-entropy in the input. A very useful result.
We need only a special case here: only for a particular 2-UHF (H_M(x) = Mx), and only for a particular input distribution (x uniform over {0,1}^m).
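Since only this special case is needed, here is an exhaustive toy check (the tiny parameters are assumptions for illustration) that M^T a, for a uniform over {0,1}^m and a random M, is statistically close to uniform over Z_q^{n'} once m is noticeably larger than n' log q:

```python
import numpy as np
from itertools import product
from collections import Counter

rng = np.random.default_rng(2)
q, n_prime, m = 5, 2, 12                          # m = 12 vs n' * log2(q) ~ 4.6

M = rng.integers(0, q, size=(m, n_prime))
counts = Counter()
for a in product((0, 1), repeat=m):               # enumerate all a in {0,1}^m
    counts[tuple((np.array(a) @ M) % q)] += 1     # record M^T a (written here as a^T M)

uniform = 1 / q**n_prime
stat_dist = 0.5 * sum(abs(counts.get(v, 0) / 2**m - uniform)
                      for v in product(range(q), repeat=n_prime))
print("statistical distance from uniform ~", round(stat_dist, 4))
```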
Recall PKE from LWE
Ciphertext = M^T a + m with a ∈ {0,1}^m, where the vector m encodes the message via z^T m = v ≈ μ(q/2); decryption computes z^T(M^T a + m) = e^T a + z^T m and rounds. CPA security: M is pseudorandom (LWE), and for a truly random M, M^T a is very close to uniform (Leftover Hash Lemma, since m >> n' log q).
Gentry-Sahai-Waters
Want to allow homomorphic operations on the ciphertext.
Idea: The ciphertext is a matrix masked by a pseudorandom matrix that can be "annihilated" with the secret key. Addition and multiplication of messages are given by addition and multiplication of ciphertexts.
Recall from LWE: M ∈ Z_q^{m×n} and z ∈ Z_q^n s.t. z^T M^T = e^T has small entries.
First attempt: Public key = M, Secret key = z.
Enc(μ) = M^T R + μI, where μ ∈ {0,1}, R ← {0,1}^{m×n}, and I is the n × n identity.
Security: LWE (and LHL) ⇒ M^T R is pseudorandom.
Dec_z(C): z^T C = e^T R + μ z^T has "error" δ^T = e^T R. Can recover μ since the error has small entries (w.h.p.).
Gentry-Sahai-Waters
First attempt: Enc(μ) = M^T R + μI; Dec_z(C): z^T C = e^T R + μ z^T has error δ^T = e^T R.
C_1 + C_2 = M^T(R_1 + R_2) + (μ_1 + μ_2)I has error δ^T = δ_1^T + δ_2^T. The error adds up with each operation, which is OK if there is an a priori bound on the depth of computation: Levelled Homomorphic Encryption.
C_1 × C_2: Error = ? z^T C_1 C_2 = (δ_1^T + μ_1 z^T)C_2 = δ_1^T C_2 + μ_1(δ_2^T + μ_2 z^T). Error = δ_1^T C_2 + μ_1 δ_2^T.
Problem: The entries in δ_1^T C_2 may not be small, as the entries in C_2 are not small! (Since μ_1 ∈ {0,1}, μ_1 δ_2^T does have small entries.)
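The following sketch implements this first attempt with toy assumed parameters (helper names enc/error are mine, not the lecture's). It prints the error magnitude after a fresh encryption, after one homomorphic addition, and after one ciphertext multiplication; the last one is no longer small, which is exactly the problem noted above.

```python
import numpy as np

rng = np.random.default_rng(3)
q, n, m = 2**20, 16, 64                             # toy parameters (not secure)
N = n + 1                                           # dimension of z (the slide's "n")

A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)
e = rng.integers(-2, 3, size=m)
M = np.concatenate([A, ((A @ s + e) % q).reshape(m, 1)], axis=1)
z = np.concatenate([-s % q, [1]])                   # z^T M^T = e^T (small)

def enc(mu):                                        # first attempt: Enc(mu) = M^T R + mu I
    R = rng.integers(0, 2, size=(m, N))
    return (M.T @ R + mu * np.eye(N, dtype=np.int64)) % q

def error(C, mu):                                   # centered z^T C - mu z^T (= delta^T mod q)
    d = (z @ C - mu * z) % q
    return np.where(d > q // 2, d - q, d)

c0, c1 = enc(0), enc(1)
print("fresh ciphertext error :", np.abs(error(c1, 1)).max())
print("after C0 + C1          :", np.abs(error((c0 + c1) % q, 0 + 1)).max())
print("after C1 x C1          :", np.abs(error((c1 @ c1) % q, 1 * 1)).max())   # blows up
```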
Gentry-Sahai-Waters
Problem: Entries in δ_1^T C_2 may not be small.
Solution idea: Represent the ciphertext as bits! But homomorphic operations will be affected.
Observation: Reconstructing a number from bits is a linear operation.
If α ∈ Z_q^m has bit-representation B(α) ∈ {0,1}^{km} (k = O(log q)), then G B(α) = α, where G ∈ Z_q^{m×km} (all operations in Z_q).
B can be applied to matrices as well, as B: Z_q^{m×n} → Z_q^{km×n}, and we still have G B(α) = α.
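A short sketch of this bit-decomposition machinery (toy modulus and function names are assumptions): B lists the bits of each entry, G holds the powers of two, and G B(α) = α.

```python
import numpy as np

q = 2**10
k = int(np.ceil(np.log2(q)))                        # k = O(log q) bits per Z_q entry

def B(alpha):
    """Bit decomposition Z_q^m -> {0,1}^{k m} (least-significant bits first)."""
    alpha = np.asarray(alpha) % q
    return np.concatenate([(alpha >> j) & 1 for j in range(k)])

def gadget(m):
    """G in Z_q^{m x k m} such that G @ B(alpha) = alpha (mod q)."""
    return np.concatenate([(2**j) * np.eye(m, dtype=np.int64) for j in range(k)], axis=1)

m = 4
alpha = np.random.default_rng(4).integers(0, q, size=m)
assert np.array_equal((gadget(m) @ B(alpha)) % q, alpha)
print("G B(alpha) = alpha verified")
```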
Gentry-Sahai-Waters: The Actual Scheme
Supports messages μ ∈ {0,1} and NAND operations, up to an a priori bounded depth of NANDs.
Public key M ∈ Z_q^{m×n} and private key z s.t. z^T M^T = e^T has small entries.
Enc(μ) = M^T R + μG, where R ← {0,1}^{m×km} (and G ∈ Z_q^{n×km} is the matrix that reverses bit-decomposition).
Dec_z(C): z^T C = δ^T + μ z^T G, where δ^T = e^T R.
NAND(C_1, C_2): G - C_1 · B(C_2). (Note that G by itself decrypts to 1.)
z^T C_1 · B(C_2) = (δ_1^T + μ_1 z^T G) B(C_2) = δ_1^T B(C_2) + μ_1 z^T C_2 = δ^T + μ_1 μ_2 z^T G, where δ^T = δ_1^T B(C_2) + μ_1 δ_2^T has small entries.
Only the "left depth" counts, since ‖δ‖ ≤ k·m·‖δ_1‖ + ‖δ_2‖.
In general, the error gets multiplied by km per level. Allows depth ≈ log_{km} q.
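Putting it together, here is a toy end-to-end sketch of the scheme (assumed small parameters, not secure; the decryption rule, which reads the single coordinate where z^T G equals 2^{k-1} ≈ q/2, is one simple decoding choice that the slides leave implicit). It checks fresh decryption and homomorphic NAND, including a depth-2 evaluation.

```python
import numpy as np

rng = np.random.default_rng(5)
q, n, m = 2**28, 8, 32                              # toy parameters (not secure)
N = n + 1                                           # ciphertexts are N x kN matrices over Z_q
k = int(np.ceil(np.log2(q)))

# Keys: pseudorandom M = [A | b], secret z = (-s, 1) with z^T M^T = e^T small
A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)
e = rng.integers(-1, 2, size=m)
M = np.concatenate([A, ((A @ s + e) % q).reshape(m, 1)], axis=1)
z = np.concatenate([-s % q, [1]])

G = np.concatenate([(2**j) * np.eye(N, dtype=np.int64) for j in range(k)], axis=1)

def B(C):
    """Bit decomposition: {0,1}^{kN x kN} matrix with G @ B(C) = C (mod q)."""
    C = C % q
    return np.concatenate([(C >> j) & 1 for j in range(k)], axis=0)

def enc(mu):
    R = rng.integers(0, 2, size=(m, k * N))
    return (M.T @ R + mu * G) % q                   # Enc(mu) = M^T R + mu G

def dec(C):
    col = (k - 1) * N + (N - 1)                     # column where (z^T G)_col = 2^(k-1) = q/2
    v = int((z @ C[:, col]) % q)                    # = delta_col + mu * (q/2)
    return 1 if q // 4 < v < 3 * q // 4 else 0

def nand(C1, C2):
    return (G - C1 @ B(C2)) % q                     # decrypts to 1 - mu1 * mu2

c0, c1 = enc(0), enc(1)
assert dec(c0) == 0 and dec(c1) == 1
assert dec(nand(c0, c1)) == 1                       # NAND(0, 1) = 1
assert dec(nand(c1, c1)) == 0                       # NAND(1, 1) = 0
assert dec(nand(nand(c1, c1), c1)) == 1             # depth 2: NAND(NAND(1,1), 1) = 1
print("homomorphic NAND evaluation OK")
```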