

  1. A Storage-efficient and Robust Private Information Retrieval Scheme allowing few servers
     D. Augot, Françoise Levy-dit-Vehel, Abdullatif Shikfa
     INRIA, ENSTA, Alcatel-Lucent
     GT-BAC Télécom, December 11, 2014

  2. Outline
     ◮ LDC codes
     ◮ Application to PIR
     ◮ Particular case of Reed-Muller and derivative codes
     ◮ A better reduction

  3. Information Theoretic PIR
     ◮ User wants to retrieve T[j] from a table T held on a remote Server S.
     ◮ Property: j (and the returned T[j]) should remain unknown to the Server.
     ◮ Scenario: a centralized database of health records is kept by ObamaCare; doctors, nurses, etc. make queries about their patient without revealing the patient's identity.
     ◮ "Information theoretic": Pr(j | what the Server sees after the protocol is run) = Pr(j).
     ◮ With one server, we have a "Shannon-like" theorem: the whole database has to be downloaded.
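
The trivial matching protocol for the one-server case: the User downloads the whole table and reads T[j] locally, so the Server learns nothing about j but the communication is the entire database. A minimal sketch (all names are mine, purely illustrative):

    # Trivial information-theoretically private one-server PIR: the query does not
    # depend on j, so privacy is perfect, but the whole table travels to the User,
    # matching the one-server lower bound stated above.

    def query(_j):
        return "send everything"                 # independent of j

    def answer(table, _query):
        return list(table)                       # Server returns the full table

    def reconstruct(full_table, j):
        return full_table[j]                     # User reads T[j] locally

    T = [5, 8, 13, 21, 34]
    assert reconstruct(answer(T, query(3)), 3) == T[3]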

  4. Information theoretically secure PIR
     With several servers: can be achieved with Locally Decodable Codes (Katz-Trevisan 2000).

  5. Coding
     Definition (Code). Given two alphabets ∆ and Σ, a code is given by its encoding map C : ∆^k → Σ^n.
     ◮ The data (or message) D is transformed by C into a (longer) codeword, using a generating matrix in the linear case.
     ◮ The rate is (k log|∆|) / (n log|Σ|).
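
A minimal sketch of the linear case (the generating matrix below is an arbitrary toy example, not from the talk): encoding is a vector-matrix product over F_2, and with ∆ = Σ = F_2 the rate reduces to k/n.

    # Encode a k-bit message with a k x n generating matrix over F_2 (toy example).
    G = [
        [1, 0, 0, 1, 1, 0],
        [0, 1, 0, 1, 0, 1],
        [0, 0, 1, 0, 1, 1],
    ]

    def encode(msg):
        n = len(G[0])
        return [sum(msg[i] * G[i][col] for i in range(len(G))) % 2 for col in range(n)]

    msg = [1, 0, 1]
    codeword = encode(msg)        # [1, 0, 1, 1, 0, 1]
    k, n = len(G), len(G[0])
    rate = k / n                  # = (k log|∆|) / (n log|Σ|) since ∆ = Σ = F_2
    print(codeword, rate)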

  6. LDC (Locally Decodable Code) definition
     A code C : ∆^k → Σ^n is (ℓ, δ)-locally decodable if there exists a randomized decoding algorithm A^y(j) such that:
     1. on input j, A is given oracle access to y ∈ Σ^n;
     2. A makes at most ℓ queries to y: q_1, ..., q_ℓ ∈ [1, n];
     3. A receives the answers a_1, ..., a_ℓ ∈ Σ;
     4. A computes x̃_j = A^y(j, a_1, ..., a_ℓ) ∈ ∆;
     5. when d(C(x), y) < δn, Pr[A^y(j) = x_j] ≥ 2/3.
     Furthermore, the code is smooth if each individual query q_i is uniformly distributed over [1, n].

  7. LCC
     Definition (Locally Correctable Code). A code C ⊂ Σ^n is (ℓ, δ)-locally correctable if there exists a randomized decoding algorithm A such that:
     1. given oracle access to y ∈ Σ^n, A makes at most ℓ queries to y;
     2. on input i, it outputs A^y(i);
     3. when d(c, y) < δn, Pr[A^y(i) = c_i] ≥ 2/3.

  8. Theorem. An F_q-linear LCC code C ⊂ F_q^n can be turned into an LDC code (C, Enc), with Enc : F_q^k → F_q^n.
     Proof.
     ◮ Let I ⊂ [1, n] be an information set of size k = dim C.
     ◮ For c ∈ C, let c_I ∈ F_q^k denote the restriction of c to the coordinates in I.
     ◮ Given a message x ∈ F_q^k, define Enc(x) to be the unique codeword c ∈ C such that c_I = x.
     ◮ Local correctability of C then yields local decodability of (C, Enc).
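
A toy sketch of the information-set encoding used in the proof, over F_2. For readability it finds the unique codeword agreeing with x on I by brute-force search; an actual implementation would invert the k×k submatrix G_I. The matrix and the set I are illustrative choices, not from the talk.

    from itertools import product

    # Toy F_2-linear code given by a generating matrix G (k = 3, n = 6).
    G = [
        [1, 0, 0, 1, 1, 0],
        [0, 1, 0, 1, 0, 1],
        [0, 0, 1, 0, 1, 1],
    ]
    k, n = len(G), len(G[0])
    codewords = [tuple(sum(m[i] * G[i][c] for i in range(k)) % 2 for c in range(n))
                 for m in product([0, 1], repeat=k)]

    I = [0, 3, 5]   # an information set: projection onto I is a bijection on codewords

    def enc(x):
        """Enc(x) = the unique codeword c with c_I = x (brute force, illustration only)."""
        matches = [c for c in codewords if tuple(c[i] for i in I) == tuple(x)]
        assert len(matches) == 1, "I is not an information set"
        return matches[0]

    print(enc((1, 1, 0)))       # -> (1, 0, 0, 1, 1, 0)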

  9. PIR
     Definition (Private Information Retrieval (PIR)). An ℓ-server p-PIR protocol is a triple (Q, A, R) of algorithms as follows:
     1. with a random string of bits s, the User generates an ℓ-tuple of queries (q_1, ..., q_ℓ) = Q(j, s);
     2. for 1 ≤ i ≤ ℓ, the User sends q_i to server S_i;
     3. each S_i answers a_i = A(x, q_i) to the User;
     4. the User recovers x_j = R(a_1, ..., a_ℓ, j, s) with probability p.
     The protocol has the privacy property if Pr(j | q_i) = Pr(j) for each i.

  10. LDCs in PIR
      The data D is encoded into a codeword c, which is stored on each server.
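
Read as a protocol, the reduction looks like the following skeleton (a hedged sketch; `make_queries` and `combine` are hypothetical stand-ins for the smooth local decoder's query generation and reconstruction). Each server receives exactly one decoder query, and smoothness makes each individual query uniform, so no single server learns anything about j; a concrete instantiation with Reed-Muller lines is sketched after the noiseless-decoding slide below.

    import random

    # Generic ℓ-server PIR from a smooth ℓ-query locally decodable code (sketch only).
    # `make_queries(j, rng)` and `combine(j, positions, answers)` are hypothetical
    # stand-ins for the LDC's randomized query generator and reconstruction step.

    def make_server(codeword):
        """Every server stores the same codeword and answers single-position queries."""
        return lambda position: codeword[position]

    def pir_retrieve(servers, make_queries, combine, j, seed=None):
        rng = random.Random(seed)                 # the User's private randomness s
        positions = make_queries(j, rng)          # Q(j, s): one codeword position per server
        answers = [srv(p) for srv, p in zip(servers, positions)]   # a_i = A(x, q_i)
        return combine(j, positions, answers)     # R(a_1, ..., a_ℓ, j, s)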

  11. Reed-Muller codes
      ◮ We use the shorthand notation X = (X_1, ..., X_m), F_q[X] = F_q[X_1, ..., X_m], i = (i_1, ..., i_m) ∈ N^m, X^i = X_1^{i_1} ⋯ X_m^{i_m}, |i| = i_1 + ⋯ + i_m.
      ◮ For i, j ∈ N^m, i ≫ j means i_u ≥ j_u for all u.
      ◮ The i-th Hasse derivative of F = Σ_j f_j X^j ∈ F_q[X], denoted Hasse(F, i), is
        Hasse(F, i)(X) = Σ_{j ≫ i} (j choose i) f_j X^{j−i},  where (j choose i) = (j_1 choose i_1) ⋯ (j_m choose i_m),
        and it satisfies F(X + Z) = Σ_i Hasse(F, i)(X) Z^i.
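
A small sketch checking this definition numerically over the integers (the identity is characteristic-free, so working over Z is enough to see it): a bivariate polynomial is stored as a dictionary of coefficients, and the Taylor-type identity F(X + Z) = Σ_i Hasse(F, i)(X) Z^i is tested at one point. All helper names are mine.

    from math import comb

    # F is stored as {(i1, i2): coefficient}; here F = 3*X1^2*X2 + X1 + 5.
    F = {(2, 1): 3, (1, 0): 1, (0, 0): 5}

    def hasse(F, i):
        """i-th Hasse derivative: sum over j >> i of C(j, i) * f_j * X^(j - i)."""
        i1, i2 = i
        out = {}
        for (j1, j2), f in F.items():
            if j1 >= i1 and j2 >= i2:
                key = (j1 - i1, j2 - i2)
                out[key] = out.get(key, 0) + comb(j1, i1) * comb(j2, i2) * f
        return out

    def evaluate(F, x):
        x1, x2 = x
        return sum(f * x1**j1 * x2**j2 for (j1, j2), f in F.items())

    # Check F(X + Z) = sum_i Hasse(F, i)(X) * Z^i at X = (2, 3), Z = (1, -1).
    X, Z = (2, 3), (1, -1)
    lhs = evaluate(F, (X[0] + Z[0], X[1] + Z[1]))
    rhs = sum(evaluate(hasse(F, (i1, i2)), X) * Z[0]**i1 * Z[1]**i2
              for i1 in range(4) for i2 in range(4))
    assert lhs == rhs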

  12. Restriction to a line
      ◮ Consider a vector V ∈ F_q^m \ {0} and a base point P,
      ◮ and consider the restriction of F to the line D = {P + tV : t ∈ F_q}, which is the univariate polynomial F_{P,V}(T) = F(P + TV) ∈ F_q[T].
      ◮ We have the following relations:
        F_{P,V}(T) = Σ_j Hasse(F, j)(P) V^j T^{|j|},
        coeff(F_{P,V}, i) = Σ_{|j| = i} Hasse(F, j)(P) V^j,
        Hasse(F_{P,V}, i)(α) = Σ_{|j| = i} Hasse(F, j)(P + αV) V^j,  for α ∈ F_q.
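
A sketch checking the second relation for a concrete polynomial over a small prime field: expand F(P + T·V) directly, and compare each coefficient of T^i with Σ_{|j|=i} Hasse(F, j)(P) V^j. The polynomial, point and direction are arbitrary illustrative choices.

    from math import comb

    q = 11                                   # a small prime field F_11
    F = {(2, 1): 3, (1, 0): 1, (0, 0): 5}    # F = 3*X1^2*X2 + X1 + 5
    P, V = (2, 3), (4, 1)                    # base point and direction of the line

    def poly_mul(a, b):                      # univariate multiplication mod q
        out = [0] * (len(a) + len(b) - 1)
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                out[i + j] = (out[i + j] + ai * bj) % q
        return out

    def poly_pow(a, e):
        out = [1]
        for _ in range(e):
            out = poly_mul(out, a)
        return out

    # Restriction F_{P,V}(T) = F(P + T*V), as a list of coefficients in T.
    line_poly = [0]
    for (j1, j2), f in F.items():
        term = poly_mul(poly_pow([P[0], V[0]], j1), poly_pow([P[1], V[1]], j2))
        term = [f * c % q for c in term]
        line_poly = [(a + b) % q for a, b in
                     zip(line_poly + [0] * len(term), term + [0] * len(line_poly))]

    def hasse_at(F, i, x):                   # Hasse(F, i) evaluated at x, mod q
        i1, i2 = i
        return sum(comb(j1, i1) * comb(j2, i2) * f * x[0]**(j1 - i1) * x[1]**(j2 - i2)
                   for (j1, j2), f in F.items() if j1 >= i1 and j2 >= i2) % q

    # Check coeff(F_{P,V}, i) = sum_{|j| = i} Hasse(F, j)(P) * V^j for every i.
    for i, coeff in enumerate(line_poly):
        rhs = sum(hasse_at(F, (j1, i - j1), P) * pow(V[0], j1, q) * pow(V[1], i - j1, q)
                  for j1 in range(i + 1)) % q
        assert coeff == rhs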

  13. Reed-Muller codes
      ◮ We enumerate F_q^m = {P_1, ..., P_n},
      ◮ F_q[X]_d is the set of polynomials of degree ≤ d, with d < q.
      ◮ We encode polynomials using the evaluation map
        ev : F_q[X]_d → F_q^n, F ↦ (F(P_1), ..., F(P_n)).
      ◮ The d-th order Reed-Muller code is RM_d = {ev(F) | F ∈ F_q[X]_d}, with dimension (d + m choose m).
      ◮ A codeword c ∈ RM_d is indexed by points: c = (c_{P_1}, ..., c_{P_n}), where c_i = c_{P_i}.
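
A short sketch building the evaluation map for m = 2 over a small prime field, just to confirm the parameters n = q^m and dimension (d + m choose m); all values are illustrative.

    from itertools import product
    from math import comb

    q, m, d = 5, 2, 2                        # small prime field, bivariate, degree <= 2
    points = list(product(range(q), repeat=m))          # enumeration of F_q^m, n = q^m

    # A polynomial of total degree <= d is a dict {(i1, i2): coeff} with i1 + i2 <= d.
    def ev(F):
        """Evaluation map: F -> (F(P_1), ..., F(P_n)) over F_q."""
        return [sum(c * P[0]**i1 * P[1]**i2 for (i1, i2), c in F.items()) % q
                for P in points]

    monomials = [(i1, i2) for i1 in range(d + 1) for i2 in range(d + 1 - i1)]
    print(len(points), len(monomials), comb(d + m, m))   # 25, 6, 6 = C(4, 2)

    F = {(2, 0): 1, (1, 1): 3, (0, 0): 2}                # an example codeword ev(F) of RM_2
    codeword = ev(F)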

  14. Local decoding of Reed-Muller codes
      ◮ F(X, Y) restricted to a line D(T) is a univariate polynomial,
      ◮ for a given i, the point R_i is uniformly random when the line is random,
      ◮ two R_i's together determine the line ⇒ loss of uncertainty (privacy) on P_j.

  15. Local decoding of Reed-Muller codes
      ◮ Given oracle access to y ≈ c = ev(F).
      ◮ On input P_j, pick a random line D with direction V passing through P_j:
        D = {P_j + T·V | T ∈ F_q} = {R_0 = P_j, R_1, ..., R_{q−1}} ⊂ F_q^m.
      ◮ R_1, ..., R_{q−1} are sent as queries, and the decoding algorithm receives (y_{R_1}, ..., y_{R_{q−1}}) ∈ F_q^{q−1}.
      ◮ In case of no errors, (y_{R_1}, ..., y_{R_{q−1}}) = (c_{R_1}, ..., c_{R_{q−1}}), and
        c_{R_u} = F(P_j + α_u·V) = F_{P_j,V}(α_u), α_u ≠ 0,
        where F_{P_j,V}(T) = F(P_j + T·V) ∈ F_q[T] is the restriction of F to D.
      ◮ F_{P_j,V} is found by Lagrange interpolation, and c_{P_j} = F_{P_j,V}(0).
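
A runnable sketch of this noiseless local decoder for m = 2 over a small prime field (helper names are mine): pick a random direction, query the q − 1 other points of the line, interpolate the restriction by Lagrange interpolation and evaluate it at T = 0.

    import random
    from itertools import product

    q, m, d = 7, 2, 3                        # prime field F_7, bivariate, degree <= 3 < q
    points = list(product(range(q), repeat=m))
    index_of = {P: i for i, P in enumerate(points)}

    def ev(F):                               # F = {(i1, i2): coeff}, total degree <= d
        return [sum(c * P[0]**i1 * P[1]**i2 for (i1, i2), c in F.items()) % q
                for P in points]

    def lagrange_at_zero(xs, ys):
        """Value at T = 0 of the unique poly of degree < len(xs) through (xs, ys), mod q."""
        total = 0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            num, den = 1, 1
            for j, xj in enumerate(xs):
                if j != i:
                    num = num * (-xj) % q
                    den = den * (xi - xj) % q
            total = (total + yi * num * pow(den, q - 2, q)) % q   # den^-1 by Fermat
        return total

    def locally_decode(y, Pj):
        """Recover c_{P_j} from the q-1 other points of a random line through P_j."""
        while True:
            V = (random.randrange(q), random.randrange(q))
            if V != (0, 0):
                break
        ts = list(range(1, q))                                    # nonzero parameters
        queries = [((Pj[0] + t * V[0]) % q, (Pj[1] + t * V[1]) % q) for t in ts]
        answers = [y[index_of[R]] for R in queries]               # the servers' answers
        return lagrange_at_zero(ts, answers)

    F = {(3, 0): 2, (1, 2): 5, (0, 1): 1, (0, 0): 4}
    c = ev(F)
    Pj = (2, 6)
    assert locally_decode(c, Pj) == c[index_of[Pj]]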

  16. Local decoding of Reed-Muller codes, in presence of errors
      ◮ Given oracle access to y ≈ c = ev(F).
      ◮ On input P_j, pick a random line D with direction V passing through P_j:
        D = {P_j + T·V | T ∈ F_q} = {R_0 = P_j, R_1, ..., R_{q−1}} ⊂ F_q^m.
      ◮ R_1, ..., R_{q−1} are sent as queries, and the decoding algorithm receives (y_{R_1}, ..., y_{R_{q−1}}) ∈ F_q^{q−1}.
      ◮ Now (y_{R_1}, ..., y_{R_{q−1}}) ≈ (c_{R_1}, ..., c_{R_{q−1}}) = ev_RS(F(P_j + T·V)) = ev_RS(F_{P_j,V}).
      ◮ One can recover F_{P_j,V} using a Reed-Solomon decoding algorithm, and compute c_{P_j} = F_{P_j,V}(0).
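
For intuition only, a brute-force stand-in for the Reed-Solomon decoding step: among all candidate restrictions of degree ≤ d, keep the one closest in Hamming distance to the received answers. This is exponential in the degree and usable only at toy sizes; a real implementation would use a proper RS decoder such as Berlekamp-Welch. Parameters are illustrative.

    from itertools import product

    q, d = 7, 2                               # tiny Reed-Solomon setting: degree <= 2 over F_7
    alphas = list(range(1, q))                # nonzero evaluation points, as on the line

    def eval_poly(coeffs, x):
        return sum(c * x**i for i, c in enumerate(coeffs)) % q

    def rs_decode_bruteforce(received):
        """Closest-codeword search over all q^(d+1) candidate polynomials (toy sizes only)."""
        best, best_dist = None, len(alphas) + 1
        for coeffs in product(range(q), repeat=d + 1):
            dist = sum(eval_poly(coeffs, a) != r for a, r in zip(alphas, received))
            if dist < best_dist:
                best, best_dist = coeffs, dist
        return best

    f = (4, 0, 3)                             # true restriction F_{P_j,V}(T) = 4 + 3*T^2
    word = [eval_poly(f, a) for a in alphas]
    word[2] = (word[2] + 1) % q               # one corrupted answer
    assert rs_decode_bruteforce(word) == f    # unique closest codeword, so c_{P_j} = f(0) = 4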

  17. Multiplicity codes (Kopparty-Saraf-Yekhanin 2011)
      ◮ Let s ∈ N and σ = (m + s − 1 choose m); we build the evaluation map at a point P:
        ev^s_P : F_q[X] → F_q^σ, F ↦ (Hasse(F, v)(P))_{|v| < s}.
      ◮ Consider F_q[X]_d with d < s(q − 1); the corresponding code is
        Mult^s_d = {(ev^s_{P_i}(F))_{i = 1, ..., n} | F ∈ F_q[X]_d}.
      ◮ It is a code Mult^s_d : ∆^k → Σ^n, with ∆ = F_q and Σ = F_q^σ.
      ◮ The code Mult^s_d is F_q-linear with dimension k = (m + d choose m),
      ◮ rate R = (m + d choose m) / ((m + s − 1 choose m) · q^m),
      ◮ minimum distance q^m − (d/s)·q^{m−1} (generalized Schwartz-Zippel).
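
A quick numeric sketch of these parameters for one illustrative choice of (m, s, q), plugging into the formulas on this slide:

    from math import comb

    m, s, q = 2, 2, 16                        # illustrative parameters (q a prime power)
    sigma = comb(m + s - 1, m)                # alphabet Σ = F_q^σ, here σ = 3
    d = s * (q - 1) - 1                       # largest admissible degree, d < s(q - 1)

    n = q**m                                  # code length
    k = comb(m + d, m)                        # dimension over F_q
    rate = k / (sigma * n)
    min_dist = q**m - (d / s) * q**(m - 1)    # generalized Schwartz-Zippel bound

    print(sigma, n, k, round(rate, 3), min_dist)   # 3, 256, 465, 0.605, 24.0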

  18. Local Decoding
      ◮ Two variables, derivatives up to order 1 (m = 2, s = 2, σ = 3):
        ev_P(F) = (F(P), F^{(1,0)}(P), F^{(0,1)}(P)).
      ◮ σ = 3 random lines are needed; the locality is (q − 1)σ.

  19. Local decoding of multiplicity codes
      ◮ Given y = (y_1, ..., y_n), a noisy version of c = ev^s(F) ∈ Mult^s_d.
      ◮ Input: j ∈ [n].
      ◮ Pick σ distinct nonzero random direction vectors U_1, ..., U_σ.
      ◮ For i = 1 to σ:
        ◮ consider the line {P_j + 0·U_i, P_j + α_1·U_i, ..., P_j + α_{q−1}·U_i} = {R_{i,0}, ..., R_{i,q−1}},
        ◮ send R_{i,1}, ..., R_{i,q−1} as queries,
        ◮ receive the answers y_{R_{i,b}} ∈ F_q^σ, with (y_{R_{i,b}})_v = Hasse(F, v)(R_{i,b}) when there is no error,
        ◮ compute, for each point and for each order 0 ≤ e < s,
          Hasse(F_{P_j,U_i}, e)(α_b) = Σ_{|v| = e} Hasse(F, v)(R_{i,b}) U_i^v,
        ◮ recover F_{P_j,U_i} by Hermite interpolation (no error).
      ◮ Solve, for the indeterminates Hasse(F, v)(P_j), |v| < s, the system
        coeff(F_{P_j,U_i}, t) = Σ_{|v| = t} Hasse(F, v)(P_j) U_i^v,  t = 0, ..., s − 1,  i = 1, ..., σ.
      ◮ Return {Hasse(F, v)(P_j), |v| < s} = ev^s_{P_j}(F).
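
A sketch of the final linear-system step for m = 2, s = 2 (σ = 3) in the error-free case: the degree-1 coefficients of the restricted polynomials (which Hermite interpolation of the answers would provide) are combined into a 2×2 system whose solution is the pair of first-order Hasse derivatives at P_j; first-order Hasse derivatives are just the usual partial derivatives, which gives an easy check. The polynomial, point and directions are illustrative, and the helper names are mine.

    from math import comb

    q = 13
    F = {(3, 0): 2, (2, 1): 7, (1, 1): 5, (0, 2): 1, (0, 0): 9}   # an example F over F_13
    P = (4, 11)
    U = [(1, 0), (0, 1), (1, 1)]          # sigma = 3 directions; the first two are independent

    def restrict_coeff(F, P, V, t):
        """Coefficient of T^t in F(P + T*V) mod q, by direct binomial expansion."""
        total = 0
        for (j1, j2), f in F.items():
            for a in range(min(t, j1) + 1):
                b = t - a
                if b <= j2:
                    total += (f * comb(j1, a) * comb(j2, b)
                              * pow(V[0], a, q) * pow(P[0], j1 - a, q)
                              * pow(V[1], b, q) * pow(P[1], j2 - b, q))
        return total % q

    # Degree-1 coefficients of the three restrictions (what the interpolation step yields).
    c1 = [restrict_coeff(F, P, Ui, 1) for Ui in U]

    # Solve h10*Ui[0] + h01*Ui[1] = c1[i] for the unknowns h10, h01,
    # using the two independent directions U[0], U[1] (2x2 system over F_q).
    det = (U[0][0] * U[1][1] - U[0][1] * U[1][0]) % q
    det_inv = pow(det, q - 2, q)
    h10 = (c1[0] * U[1][1] - c1[1] * U[0][1]) * det_inv % q
    h01 = (U[0][0] * c1[1] - U[1][0] * c1[0]) * det_inv % q

    # Check against the first-order Hasse derivatives computed directly from F.
    def hasse_at(F, i, x):
        i1, i2 = i
        return sum(comb(j1, i1) * comb(j2, i2) * f * x[0]**(j1 - i1) * x[1]**(j2 - i2)
                   for (j1, j2), f in F.items() if j1 >= i1 and j2 >= i2) % q

    assert h10 == hasse_at(F, (1, 0), P) and h01 == hasse_at(F, (0, 1), P)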
