

Communication Complexity of Private Simultaneous Messages, Revisited
Manoj Mishra, Department of Electrical Engineering - Systems, Tel Aviv University
Joint work with Benny Applebaum (TAU), Thomas Holenstein (Google), Ofer Shayevitz (TAU)


  1–9. Revisiting P.S.M. Lowerbound

[Figure: bipartite diagram mapping inputs X = {x_1, …, x_K} and Y = {y_1, …, y_K} through shared randomness values r_1, r_2, r_3 to message sets M_A = {a_1, …, a_J} and M_B = {b_1, …, b_L}.]

Communication: log |M_A| + log |M_B|

Mechanism:
• Lowerbound the number of r's
• Lowerbound the size of each image set
• Upperbound the size of the overlap between two image sets

  10–15. Revisiting P.S.M. Lowerbound

Assumption 1 on f:
• f non-degenerate:
  • x ≠ x′ ⇒ f(x, ·) ≠ f(x′, ·)
  • similarly for y ≠ y′

Consequence:
• r is one-to-one

  16–22. Revisiting P.S.M. Lowerbound

Useful edge (x, y):
• f(x̄, y) = f(x, y)
• x̄: x with its last bit inverted

Assumption 2 on f:
• Half of the edges are useful

Consequence:
• The image set has half of f's edges
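Assumption 2 can be sanity-checked empirically. The sketch below is illustrative (the helper name and the reading that a useful edge flips only x's last bit are assumptions, not from the talk); for a random boolean f, roughly half of all edges are useful:

```python
import random

def useful_edge_fraction(k, seed=0):
    """Fraction of edges (x, y) that are 'useful' for a random boolean
    f on {0,1}^k x {0,1}^k, i.e. f(x_bar, y) == f(x, y), where x_bar
    is x with its last bit inverted."""
    rng = random.Random(seed)
    n = 1 << k
    f = [[rng.randrange(2) for _ in range(n)] for _ in range(n)]
    useful = sum(1 for x in range(n) for y in range(n)
                 if f[x ^ 1][y] == f[x][y])  # x ^ 1 inverts the last bit
    return useful / (n * n)

print(useful_edge_fraction(6))  # close to 1/2 for a random f
```

Each pair {(x, y), (x̄, y)} agrees with probability 1/2 under a random f, which is exactly the assumption.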

  23–30. Revisiting P.S.M. Lowerbound

[Figure: three kinds of overlap between image sets in the message graph: trivial overlaps, non-trivial overlaps, and unaccounted overlaps.]

• Non-trivial overlaps correspond to complementary similar rectangles, with row sets X′ ∘ 0 and X′ ∘ 1 in the truth table of f.
• Assumption 3 on f: size of each such rectangle ≤ 2 · 2^k.
• Implication:
  • Reveal all inputs not required to be private.
  • Potentially higher communication cost.

  31–36. Counterexample to P.S.M. Lowerbound

L(x) := { T_0 · (x[1 : k−1] ∘ 0),  if x[k] = 0
        { T_1 · (x[1 : k−1] ∘ 1),  if x[k] = 1

f(x, y) = ⟨L(x), y⟩,   with T_0, T_1, T_0 + T_1 full rank.

[Figure: Alice holds (x, R_A) and sends M_A; Bob holds (y, R_B) and sends M_B; the referee C outputs ⟨L(x), y⟩.]

• Run a PSM for ⟨·, ·⟩ on the inputs (L(x), y).
• Communication: 2k + 2 bits.
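For reference, here is a minimal sketch of a (2k + 2)-bit PSM for ⟨·, ·⟩ over GF(2). This is a folklore masking construction; the talk does not specify which inner-product PSM it uses. The counterexample runs such a protocol on (L(x), y), and since Alice can compute L(x) locally, the total communication stays 2k + 2 bits:

```python
import random

def psm_inner_product(x, y, rng):
    """PSM for f(x, y) = <x, y> mod 2 with x, y in {0,1}^k.
    Shared randomness: masks r, rp in {0,1}^k and a pad bit s.
    Alice sends (a, alpha): k + 1 bits; Bob sends (b, beta): k + 1 bits."""
    k = len(x)
    dot = lambda u, v: sum(ui & vi for ui, vi in zip(u, v)) & 1
    r  = [rng.randrange(2) for _ in range(k)]  # shared mask for x
    rp = [rng.randrange(2) for _ in range(k)]  # shared mask for y
    s  = rng.randrange(2)                      # shared pad bit

    # Alice (knows x and the shared randomness)
    a = [xi ^ ri for xi, ri in zip(x, r)]
    alpha = dot(a, rp) ^ dot(r, rp) ^ s
    # Bob (knows y and the shared randomness)
    b = [yi ^ rpi for yi, rpi in zip(y, rp)]
    beta = dot(r, b) ^ s

    # Referee sees only (a, alpha, b, beta):
    # <x,y> = <a,b> ^ <a,rp> ^ <r,b> ^ <r,rp> = <a,b> ^ alpha ^ beta
    return dot(a, b) ^ alpha ^ beta
```

Exhaustively checking small k confirms perfect correctness; privacy follows because a and b are uniformly masked and alpha is padded by s.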

  37. New Proof for a Communication Lowerbound

• x ∈ X, y ∈ Y, shared randomness R ∈ {0, 1}*
• f : X × Y → Z
• Alice holds (x, R) and sends M_A; Bob holds (y, R) and sends M_B; the referee C outputs f(x, y).
• Perfect correctness
• Perfect privacy
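As a toy instance of this model (an illustrative example, not from the talk): for f(x, y) = x ⊕ y on single bits, masking both inputs with a shared random bit gives a PSM with perfect correctness and perfect privacy, which can be verified by comparing transcript distributions:

```python
from collections import Counter

# Toy PSM for f(x, y) = x XOR y on single bits: with shared random bit r,
# Alice sends m_A = x ^ r and Bob sends m_B = y ^ r; the referee outputs
# m_A ^ m_B = x ^ y, so correctness is perfect.

def transcript_dist(x, y):
    """Distribution of (m_A, m_B) over the shared randomness r."""
    c = Counter()
    for r in (0, 1):
        c[(x ^ r, y ^ r)] += 1
    return c

# Perfect privacy: inputs with the same output induce the same
# transcript distribution, so the referee learns only f(x, y).
assert transcript_dist(0, 1) == transcript_dist(1, 0)
assert transcript_dist(0, 0) == transcript_dist(1, 1)
```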

  38–43. Key idea of the proof

[Figure: inputs x_0, …, x_J and y_0, …, y_K mapped to messages a_0, …, a_J̃ and b_0, …, b_K̃. A pair (X, Y) ∼ μ on X × Y is encoded with randomness R; a second pair (X′, Y′) with randomness R′ lands on the same messages.]

  44–47. Main Result

Theorem. Let f : X × Y → Z be non-degenerate and let μ be a distribution on X × Y. Then
PSM(f) ≥ log(1/α(μ)) + H∞(μ) − log(1/β(μ)) − 1.

• α(μ) := volume of disjoint similar rectangles := max over (R_1, R_2 : similar, disjoint) of min(μ(R_1), μ(R_2))
• H∞(μ) := min-entropy of μ
• β(μ) := volume of useful edges := Pr[(X, Y) ≠ (X′, Y′) | f(X, Y) = f(X′, Y′)]
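The quantities H∞(μ) and β(μ) are easy to compute by brute force on tiny instances. The sketch below is illustrative, and assumes (X, Y) and (X′, Y′) are i.i.d. draws from μ in the definition of β; it evaluates both for the uniform distribution and a random boolean f:

```python
import math
import random
from fractions import Fraction

def beta_uniform(f, n):
    """beta(mu) for mu uniform on [n] x [n]:
    Pr[(X, Y) != (X', Y') | f(X, Y) = f(X', Y')]."""
    same_val = 0  # ordered pairs of input-pairs with equal f-value
    distinct = 0  # ...that are also distinct input-pairs
    pairs = [(x, y) for x in range(n) for y in range(n)]
    for p in pairs:
        for q in pairs:
            if f[p] == f[q]:
                same_val += 1
                distinct += (p != q)
    return Fraction(distinct, same_val)

n = 8
rng = random.Random(1)
f = {(x, y): rng.randrange(2) for x in range(n) for y in range(n)}
H_inf = math.log2(n * n)  # uniform mu: H_inf(mu) = log2 of the support size
print(H_inf, float(beta_uniform(f, n)))
```

For a random boolean f on an n × n domain, β is close to (but strictly below) 1, since almost all output collisions occur between distinct input pairs.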

  48–50. Special cases

Theorem (Boolean function). For non-degenerate f : X × Y → {0, 1},
PSM(f) ≥ 2(log |X| + log |Y|) − log M − 3,
where M := max over (R_1, R_2 : similar, disjoint) of |R_1|.

Proof. Use μ = the uniform distribution. □

  51–54. Special cases (contd.)

Corollary (Random function). For a random boolean f : {0,1}^k × {0,1}^k → {0,1}, w.h.p.,
PSM(f) ≥ 3k − 2 log k − 1.

Proof. W.h.p., M ≤ k² · 2^k. □

Theorem (Explicit functions). There exists a family {f_k : {0,1}^k × {0,1}^k → {0,1}} for which PSM(f_k) ≥ 3k − O(log k).

Proof. It suffices to sample f_k from a poly(k)-wise independent distribution. □

  55–60. Conditional Disclosure of a Secret (C.D.S.)

• x ∈ X, y ∈ Y; secret s ∈ {0, 1}; predicate h : X × Y → {0, 1}; shared randomness R ∈ {0, 1}*
• Alice holds (x, s, R) and sends M_A; Bob holds (y, R) and sends M_B; the referee C knows (x, y) and learns s iff h(x, y) = 1.
• Perfect correctness
• Perfect privacy
• Useful applications: unconditionally private information retrieval (P.I.R.), priced O.T., secret sharing for graph-based access structures, attribute-based encryption

  61–62. Conditional Disclosure of a Secret (C.D.S.)

• Communication lowerbounds:
  • Ω(log k) for several explicit predicates (Gay et al., CRYPTO, 2015)
  • k − o(k) for some non-explicit predicate (Applebaum et al., CRYPTO, 2017)

  63–65. C.D.S. Lowerbound

Theorem. For a predicate h : X × Y → {0, 1},
CDS(h) ≥ 2 log |h⁻¹(0)| − log M − log |X| − log |Y| − 1,
where |h⁻¹(0)| := the number of 0-inputs of h, and M := the size of the largest 0-monochromatic rectangle of h.
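The quantities in this bound can be computed directly for tiny predicates. A brute-force sketch follows; only the formula is from the slides, and the enumeration (exponential in |X| + |Y|) is illustrative:

```python
import itertools
import math

def cds_lower_bound(h, X, Y):
    """Evaluate CDS(h) >= 2*log|h^-1(0)| - log M - log|X| - log|Y| - 1,
    where M is the size of the largest 0-monochromatic rectangle,
    by exhaustive enumeration (feasible only for tiny X, Y)."""
    zeros = sum(1 for x in X for y in Y if h(x, y) == 0)
    M = 0
    subsets = lambda S: (c for r in range(1, len(S) + 1)
                         for c in itertools.combinations(S, r))
    for A in subsets(X):
        for B in subsets(Y):
            if all(h(x, y) == 0 for x in A for y in B):
                M = max(M, len(A) * len(B))
    lg = math.log2
    bound = 2 * lg(zeros) - lg(M) - lg(len(X)) - lg(len(Y)) - 1
    return bound, zeros, M

# Inner product on 2-bit inputs: too small to give a meaningful bound,
# but it shows the quantities involved.
X = Y = [(a, b) for a in (0, 1) for b in (0, 1)]
ip = lambda x, y: (x[0] & y[0]) ^ (x[1] & y[1])
print(cds_lower_bound(ip, X, Y))
```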

  66–68. C.D.S. Lowerbound: Special Cases

Corollary (CDS for Inner Product). For the predicate h(x, y) = ⟨x, y⟩, x, y ∈ {0,1}^k,
CDS(h) ≥ k − 3 − o(1).

Remarks:
• Tight bound.
• Previous bound: Ω(log k).

Corollary (CDS for Random Predicate). For a random predicate h : {0,1}^k × {0,1}^k → {0,1}, w.h.p.,
CDS(h) ≥ k − 4 − o(1).
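The tightness remark can be made concrete: a folklore linear CDS transmits the secret with k + 1 bits whenever ⟨x, y⟩ = 1, matching the k − 3 − o(1) lower bound up to additive constants. A minimal sketch (this particular masking scheme is an assumption; the talk does not spell out the matching upper bound):

```python
import random

def cds_inner_product(x, y, s, rng):
    """CDS for the predicate h(x, y) = <x, y> mod 2, using k + 1 bits.
    Shared randomness: w in {0,1}^k. Alice sends a = w XOR s*x (k bits);
    Bob sends b = <w, y> (1 bit). The referee, who knows x and y,
    outputs <a, y> XOR b = s * <x, y>: the secret s iff h(x, y) = 1."""
    k = len(x)
    dot = lambda u, v: sum(ui & vi for ui, vi in zip(u, v)) & 1
    w = [rng.randrange(2) for _ in range(k)]
    a = [wi ^ (s & xi) for wi, xi in zip(w, x)]  # Alice's k-bit message
    b = dot(w, y)                                # Bob's 1-bit message
    return dot(a, y) ^ b                         # referee's output
```

When ⟨x, y⟩ = 0 the transcript satisfies b = ⟨a, y⟩ regardless of s, so nothing about the secret leaks.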

  69–70. Summary

We revisited the P.S.M. lowerbound of Feige, Kilian, and Naor (FKN) (STOC, 1994) and proved the following results:
• Counterexample: an f whose P.S.M. communicates only 2k + 2 bits.
