
Information Complexity Density and Simulation of Protocols - PowerPoint PPT Presentation



  1. Information Complexity Density and Simulation of Protocols. Himanshu Tyagi (Indian Institute of Science, Bangalore), with Pramod Viswanath (UIUC), Shaileshh Venkatakrishnan (UIUC), and Shun Watanabe (TUAT).

  2. Private Coin Interactive Protocols. [Figure: two parties observing $X$ and $Y$, respectively.]

  3. Private Coin Interactive Protocols. [Figure: an interactive protocol $\pi$ run between the parties observing $X$ and $Y$.]

  4. Private Coin Interactive Protocols. A protocol $\pi$ is run on inputs $X$ and $Y$; denote by $\Pi = (\Pi_1, \Pi_2, \Pi_3, \ldots)$ the random transcript.

  5. Private Coin Interactive Protocols. A protocol $\pi$ is run on inputs $X$ and $Y$; denote by $\Pi = (\Pi_1, \Pi_2, \Pi_3, \ldots)$ the random transcript, which satisfies the Markov relations $\Pi_1 \,-\, X \,-\, Y$, $\Pi_2 \,-\, (Y, \Pi_1) \,-\, X$, $\Pi_3 \,-\, (X, \Pi_1, \Pi_2) \,-\, Y$, and so on.

  6. Private Coin Interactive Protocols. A protocol $\pi$ is run on inputs $X$ and $Y$; denote by $\Pi = (\Pi_1, \Pi_2, \Pi_3, \ldots)$ the random transcript, which satisfies the Markov relations $\Pi_1 \,-\, X \,-\, Y$, $\Pi_2 \,-\, (Y, \Pi_1) \,-\, X$, $\Pi_3 \,-\, (X, \Pi_1, \Pi_2) \,-\, Y$, and so on. Here $|\pi| =$ depth of the protocol tree.

  7. $\epsilon$-Simulation of a Protocol $\pi$. [Figure: the parties run $\pi_{\mathrm{sim}}$ on inputs $X$ and $Y$ to produce outputs $\Pi_x$ and $\Pi_y$ in place of $\Pi$.] Definition. A protocol $\pi_{\mathrm{sim}}$ constitutes an $\epsilon$-simulation of $\pi$ if it can produce outputs $\Pi_x$ and $\Pi_y$ at $X$ and $Y$, respectively, such that $\| P_{XY\Pi\Pi} - P_{XY\Pi_x\Pi_y} \|_{\mathrm{TV}} \le \epsilon$.

  8. $\epsilon$-Simulation of a Protocol $\pi$. Definition. A protocol $\pi_{\mathrm{sim}}$ constitutes an $\epsilon$-simulation of $\pi$ if it can produce outputs $\Pi_x$ and $\Pi_y$ at $X$ and $Y$, respectively, such that $\| P_{XY\Pi\Pi} - P_{XY\Pi_x\Pi_y} \|_{\mathrm{TV}} \le \epsilon$. We seek to characterize $D_\epsilon(\pi \mid P_{XY}) =$ the minimum length of an $\epsilon$-simulation of $\pi$.
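A minimal sketch, not from the slides, of checking the $\epsilon$-simulation criterion numerically for finite alphabets; the array names and shapes are my own illustrative assumptions.

```python
import numpy as np

def total_variation(p: np.ndarray, q: np.ndarray) -> float:
    """Total variation distance between two pmfs of the same shape."""
    return 0.5 * float(np.abs(p - q).sum())

def is_eps_simulation(p_xy_pi_pi: np.ndarray, p_xy_pix_piy: np.ndarray,
                      eps: float) -> bool:
    """Check || P_{XY Pi Pi} - P_{XY Pi_x Pi_y} ||_TV <= eps for pmfs given as
    arrays over the same finite alphabet of (x, y, transcript, transcript)."""
    return total_variation(p_xy_pi_pi, p_xy_pix_piy) <= eps
```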

  9. $\epsilon$-Compression of a Protocol $\pi$. [Figure: the parties run $\pi_{\mathrm{com}}$ on inputs $X$ and $Y$ to produce outputs $\Pi_x$ and $\Pi_y$.] Definition. A protocol $\pi_{\mathrm{com}}$ constitutes an $\epsilon$-compression of $\pi$ if it can produce outputs $\Pi_x$ and $\Pi_y$ at $X$ and $Y$, respectively, such that $\Pr(\Pi = \Pi_x = \Pi_y) \ge 1 - \epsilon$.

  10. $\epsilon$-Compression of a Protocol $\pi$. Definition. A protocol $\pi_{\mathrm{com}}$ constitutes an $\epsilon$-compression of $\pi$ if it can produce outputs $\Pi_x$ and $\Pi_y$ at $X$ and $Y$, respectively, such that $\Pr(\Pi = \Pi_x = \Pi_y) \ge 1 - \epsilon$. For deterministic protocols, compression $\equiv$ simulation.

  11. Information Complexity of $\pi$: $\mathrm{IC}(\pi) \stackrel{\mathrm{def}}{=} I(\Pi \wedge X \mid Y) + I(\Pi \wedge Y \mid X)$.

  12. Information Complexity of $\pi$: $\mathrm{IC}(\pi) \stackrel{\mathrm{def}}{=} I(\Pi \wedge X \mid Y) + I(\Pi \wedge Y \mid X)$.
      Examples:
      ◮ $\Pi(x, y) = x$: $\mathrm{IC}(\pi) = H(X \mid Y)$
      ◮ $\Pi(x, y) = (x, y)$: $\mathrm{IC}(\pi) = H(X \mid Y) + H(Y \mid X)$

  13. Information Complexity of $\pi$: $\mathrm{IC}(\pi) \stackrel{\mathrm{def}}{=} I(\Pi \wedge X \mid Y) + I(\Pi \wedge Y \mid X)$.
      Examples:
      ◮ $\Pi(x, y) = x$: $\mathrm{IC}(\pi) = H(X \mid Y)$
      ◮ $\Pi(x, y) = (x, y)$: $\mathrm{IC}(\pi) = H(X \mid Y) + H(Y \mid X)$
      Theorem (Amortized Communication Complexity [BR'10]). For the coordinate-wise repetition $\pi^n$ of $\pi$ and i.i.d. $(X^n, Y^n)$,
      $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n}) = \mathrm{IC}(\pi)$.

  14. Information Complexity of $\pi$: $\mathrm{IC}(\pi) \stackrel{\mathrm{def}}{=} I(\Pi \wedge X \mid Y) + I(\Pi \wedge Y \mid X)$.
      Examples:
      ◮ $\Pi(x, y) = x$ [Slepian-Wolf '74]: $\mathrm{IC}(\pi) = H(X \mid Y)$
      ◮ $\Pi(x, y) = (x, y)$ [Csiszár-Narayan '04]: $\mathrm{IC}(\pi) = H(X \mid Y) + H(Y \mid X)$
      Theorem (Amortized Communication Complexity [BR'10]). For the coordinate-wise repetition $\pi^n$ of $\pi$ and i.i.d. $(X^n, Y^n)$,
      $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n}) = \mathrm{IC}(\pi)$.
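A minimal sketch (my own, assuming a finite joint pmf $P_{XY\Pi}$ given as a 3-dimensional array) of computing $\mathrm{IC}(\pi)$ from its definition; the worked example reproduces the first bullet above, where $\mathrm{IC}(\pi) = H(X \mid Y)$.

```python
import numpy as np

def cond_mutual_info(p: np.ndarray) -> float:
    """I(A; B | C) in bits for a joint pmf array p[a, b, c]."""
    p_ac = p.sum(axis=1, keepdims=True)        # P(a, c)
    p_bc = p.sum(axis=0, keepdims=True)        # P(b, c)
    p_c = p.sum(axis=(0, 1), keepdims=True)    # P(c)
    mask = p > 0
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = p * p_c / (p_ac * p_bc)
    return float(np.sum(p[mask] * np.log2(ratio[mask])))

def information_complexity(p_xyt: np.ndarray) -> float:
    """IC(pi) = I(Pi ^ X | Y) + I(Pi ^ Y | X) for p_xyt[x, y, t] = P_{XY Pi}(x, y, t)."""
    i_pi_x_given_y = cond_mutual_info(np.transpose(p_xyt, (2, 0, 1)))  # I(Pi; X | Y)
    i_pi_y_given_x = cond_mutual_info(np.transpose(p_xyt, (2, 1, 0)))  # I(Pi; Y | X)
    return i_pi_x_given_y + i_pi_y_given_x

# Example: Pi(x, y) = x for a doubly symmetric binary source with crossover 0.1,
# so IC(pi) should equal H(X | Y) ~ 0.469 bits.
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
p_xyt = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p_xyt[x, y, x] = p_xy[x, y]      # the transcript is simply x
print(information_complexity(p_xyt))     # ~0.469
```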

  15. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.
      ◮ General distributions? Second-order asymptotics? Single-shot?

  16. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.
      ◮ General distributions? Second-order asymptotics? Single-shot?
      Why do we care?

  17. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.
      ◮ General distributions? Second-order asymptotics? Single-shot?
      Why do we care? 42.

  18. The Tail of Information Complexity Density

  19. Information Complexity Density:
      $\mathrm{ic}(\tau; x, y) \stackrel{\mathrm{def}}{=} \log \frac{P_{\Pi|XY}(\tau \mid x, y)}{P_{\Pi|X}(\tau \mid x)} + \log \frac{P_{\Pi|XY}(\tau \mid x, y)}{P_{\Pi|Y}(\tau \mid y)}$.
      Note that $\mathbb{E}[\mathrm{ic}(\Pi; X, Y)] = \mathrm{IC}(\pi)$.

  20. Information Complexity Density:
      $\mathrm{ic}(\tau; x, y) \stackrel{\mathrm{def}}{=} \log \frac{P_{\Pi|XY}(\tau \mid x, y)}{P_{\Pi|X}(\tau \mid x)} + \log \frac{P_{\Pi|XY}(\tau \mid x, y)}{P_{\Pi|Y}(\tau \mid y)}$.
      Note that $\mathbb{E}[\mathrm{ic}(\Pi; X, Y)] = \mathrm{IC}(\pi)$.
      The $\epsilon$-tails of $\mathrm{ic}(\Pi; X, Y)$ are closely related to $D_\epsilon(\pi \mid P_{XY})$.
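A minimal sketch (my own, reusing the finite-pmf setup and `information_complexity` from the earlier sketch) of evaluating the information complexity density pointwise and checking numerically that its expectation equals $\mathrm{IC}(\pi)$.

```python
import numpy as np

def ic_density(p_xyt: np.ndarray) -> np.ndarray:
    """ic[x, y, t] = log2 P(t|x,y)/P(t|x) + log2 P(t|x,y)/P(t|y), in bits,
    for p_xyt[x, y, t] = P_{XY Pi}(x, y, t); set to 0 off the support."""
    with np.errstate(divide="ignore", invalid="ignore"):
        p_t_given_xy = p_xyt / p_xyt.sum(axis=2, keepdims=True)
        p_t_given_x = p_xyt.sum(axis=1, keepdims=True) / p_xyt.sum(axis=(1, 2), keepdims=True)
        p_t_given_y = p_xyt.sum(axis=0, keepdims=True) / p_xyt.sum(axis=(0, 2), keepdims=True)
        ic = np.log2(p_t_given_xy / p_t_given_x) + np.log2(p_t_given_xy / p_t_given_y)
    return np.where(p_xyt > 0, ic, 0.0)

# Sanity check of E[ic(Pi; X, Y)] = IC(pi), for any p_xyt as above:
# np.isclose(np.sum(p_xyt * ic_density(p_xyt)), information_complexity(p_xyt))
```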

  21. Illustration. Consider the Slepian-Wolf problem ($\Pi(x, y) = x$).
      ◮ $\mathrm{ic}(\tau; x, y) = -\log P_{X|Y}(x \mid y)$

  22. Illustration. Consider the Slepian-Wolf problem ($\Pi(x, y) = x$).
      ◮ $\mathrm{ic}(\tau; x, y) = -\log P_{X|Y}(x \mid y)$
      ◮ If $\Pr(\mathrm{ic}(\Pi; X, Y) \ge \lambda) \le \epsilon$: a random $\lambda$-bit hash of $X$ constitutes an $\epsilon$-compression.
      ◮ If $\Pr(\mathrm{ic}(\Pi; X, Y) \ge \lambda) > \epsilon$: any subset capturing probability $\ge 1 - \epsilon$ has cardinality exceeding $2^{\lambda}$, so roughly $\lambda$ bits are necessary.

  23. Illustration. Consider the Slepian-Wolf problem ($\Pi(x, y) = x$).
      ◮ $\mathrm{ic}(\tau; x, y) = -\log P_{X|Y}(x \mid y)$
      ◮ If $\Pr(\mathrm{ic}(\Pi; X, Y) \ge \lambda) \le \epsilon$: a random $\lambda$-bit hash of $X$ constitutes an $\epsilon$-compression.
      ◮ If $\Pr(\mathrm{ic}(\Pi; X, Y) \ge \lambda) > \epsilon$: any subset capturing probability $\ge 1 - \epsilon$ has cardinality exceeding $2^{\lambda}$, so roughly $\lambda$ bits are necessary.
      [Figure: spectrum of $h(X \mid Y) = -\log P_{X|Y}(X \mid Y)$, split into a region with probability $> \epsilon$ and a region with probability $\le \epsilon$.]
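A minimal sketch, my own and not from the slides, of the random-hashing achievability idea on a toy finite source: $X$ is compressed to a random $\lambda$-bit hash and the decoder, knowing $Y = y$, outputs the most likely candidate consistent with the hash. The maximum-likelihood decoding rule and all names are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hash_error(p_xy: np.ndarray, lam: int, trials: int = 10_000) -> float:
    """Empirical error probability of lambda-bit random-hash compression of X
    with side information Y at the decoder; p_xy[x, y] must sum to 1."""
    nx, ny = p_xy.shape
    hashes = rng.integers(0, 2 ** lam, size=nx)           # shared random hash table
    p_x_given_y = p_xy / p_xy.sum(axis=0, keepdims=True)  # decoder's model P_{X|Y}
    flat = p_xy.ravel()
    errors = 0
    for _ in range(trials):
        idx = rng.choice(nx * ny, p=flat)                 # draw (X, Y) ~ P_{XY}
        x, y = divmod(int(idx), ny)
        candidates = [xp for xp in range(nx) if hashes[xp] == hashes[x]]
        x_hat = max(candidates, key=lambda xp: p_x_given_y[xp, y])
        errors += int(x_hat != x)
    return errors / trials
```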

  24. Main Results

  25. Lower Bound. Theorem. Given $0 \le \epsilon < 1$ and a protocol $\pi$,
      $D_\epsilon(\pi) \gtrsim \sup\{\lambda : \Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) \ge \epsilon\}$.

  26. Lower Bound. Theorem. Given $0 \le \epsilon < 1$ and a protocol $\pi$,
      $D_\epsilon(\pi) \gtrsim \sup\{\lambda : \Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) \ge \epsilon\}$.
      Weaknesses.
      ◮ The fudge parameters are of the order $\log(\text{spectrum width})$.
      ◮ Uses only the joint pmf, not the structure of the protocol.

  27. Upper Bound. Theorem. Given $0 \le \epsilon < 1$ and a bounded-rounds protocol $\pi$,
      $D_\epsilon(\pi) \lesssim \inf\{\lambda : \Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) \le \epsilon\}$.
      [Figure: distribution of $\mathrm{ic}(\Pi; X, Y)$, with the region where $\Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) > \epsilon$ marking the lower bound and the region where $\Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) \le \epsilon$ marking the upper bound.]

  28. Upper Bound. Theorem. Given $0 \le \epsilon < 1$ and a bounded-rounds protocol $\pi$,
      $D_\epsilon(\pi) \lesssim \inf\{\lambda : \Pr(\mathrm{ic}(\Pi; X, Y) > \lambda) \le \epsilon\}$.
      Weaknesses.
      ◮ The fudge parameters depend on the number of rounds.
      ◮ The protocol is based on round-by-round compression.
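A minimal sketch (my own) of reading off the two thresholds appearing in the bounds from the distribution of $\mathrm{ic}(\Pi; X, Y)$, here given by its values and probabilities on a finite support (e.g. as computed with `ic_density` above); restricting the search to support points is a simplifying assumption.

```python
import numpy as np

def ic_thresholds(values: np.ndarray, probs: np.ndarray, eps: float):
    """Return (sup{l: Pr(ic > l) >= eps}, inf{l: Pr(ic > l) <= eps}),
    both evaluated over the finite support `values`."""
    order = np.argsort(values)
    v, p = values[order], probs[order]
    tail = 1.0 - np.cumsum(p)              # tail[i] = Pr(ic > v[i])
    lower = v[tail >= eps].max() if np.any(tail >= eps) else v.min()
    upper = v[tail <= eps].min() if np.any(tail <= eps) else v.max()
    return lower, upper
```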

  29. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.

  30. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
        Answer. No. In fact, $D_\epsilon(\pi^n) = n \, \mathrm{IC}(\pi) + \sqrt{n \, V(\mathrm{ic}(\Pi; X, Y))} \, Q^{-1}(\epsilon) + o(\sqrt{n})$.
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.

  31. Questions
      ◮ Strong converse. Does $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^n \mid P_{X^n Y^n})$ depend on $\epsilon$?
        Answer. No. In fact, $D_\epsilon(\pi^n) = n \, \mathrm{IC}(\pi) + \sqrt{n \, V(\mathrm{ic}(\Pi; X, Y))} \, Q^{-1}(\epsilon) + o(\sqrt{n})$.
      ◮ Mixed protocols. What about a mixed protocol $\pi^{(n)}$ given by $\pi^{(n)} = \pi_h^n$ w.p. $p$ and $\pi^{(n)} = \pi_l^n$ w.p. $1 - p$? Note that $\mathrm{IC}(\pi^{(n)}) = n \, [\, p \, \mathrm{IC}(\pi_h) + (1 - p) \, \mathrm{IC}(\pi_l) \,]$.
        Answer. $\lim_{\epsilon \to 0} \limsup_{n \to \infty} \frac{1}{n} D_\epsilon(\pi^{(n)}) = \mathrm{IC}(\pi_h)$.
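A minimal sketch (my own, assuming scipy is available and reusing `ic_density` from the earlier sketch) of evaluating the Gaussian approximation in the strong-converse answer above, with the inverse $Q$-function computed via `norm.isf`.

```python
import numpy as np
from scipy.stats import norm

def second_order_estimate(p_xyt: np.ndarray, n: int, eps: float) -> float:
    """Approximate D_eps(pi^n) by n*IC(pi) + sqrt(n*V)*Q^{-1}(eps), where V is
    the variance of ic(Pi; X, Y) under the single-letter pmf p_xyt."""
    ic = ic_density(p_xyt)                           # pointwise ic(t; x, y), in bits
    mean = float(np.sum(p_xyt * ic))                 # IC(pi)
    var = float(np.sum(p_xyt * (ic - mean) ** 2))    # V(ic(Pi; X, Y))
    return n * mean + np.sqrt(n * var) * norm.isf(eps)   # norm.isf = Q^{-1}
```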

  32. 42 Function Computation. [BR '10], [MI '10]: $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) = \mathrm{IC}(f)$.

  33. 42 Function Computation. [BR '10], [MI '10]: $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) = \mathrm{IC}(f)$.
      ◮ Strong converse? Our bound yields $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) \ge H(f(X, Y) \mid X) + H(f(X, Y) \mid Y)$.

  34. 42 Function Computation. [BR '10], [MI '10]: $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) = \mathrm{IC}(f)$.
      ◮ Strong converse? Our bound yields $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) \ge H(f(X, Y) \mid X) + H(f(X, Y) \mid Y)$.
      ◮ Direct product or Arimoto converse? [BRWY '13], [BW '14]: $|\pi_n| < \frac{n \, \mathrm{IC}(f)}{\mathrm{poly}(\log n)} \Rightarrow \Pr(F = F_x = F_y) \le e^{-nc}$ for all $n$ large.

  35. 42 Function Computation. [BR '10], [MI '10]: $\lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) = \mathrm{IC}(f)$.
      ◮ Strong converse? Our bound yields $\lim_{n \to \infty} \frac{1}{n} D_\epsilon(f^n) \ge H(f(X, Y) \mid X) + H(f(X, Y) \mid Y)$.
      ◮ Direct product or Arimoto converse? [BRWY '13], [BW '14]: $|\pi_n| < \frac{n \, \mathrm{IC}(f)}{\mathrm{poly}(\log n)} \Rightarrow \Pr(F = F_x = F_y) \le e^{-nc}$ for all $n$ large.
        Our bound yields a threshold of $n \, [H(F \mid X) + H(F \mid Y)]$.

  36. 42 Separation of $D_\epsilon(\pi)$ and $\mathrm{IC}(\pi)$.
      [BBCR '10]: $D_\epsilon(\pi) \le \tilde{O}(\sqrt{|\pi| \, \mathrm{IC}(\pi)})$
      [B '12]: $D_\epsilon(\pi) \le 2^{O(\mathrm{IC}(\pi))}$
