Information Complexity Density and Simulation of Protocols – PowerPoint PPT Presentation




SLIDE 1

Information Complexity Density and Simulation of Protocols

Himanshu Tyagi, Indian Institute of Science, Bangalore
with Pramod Viswanath (UIUC), Shaileshh Venkatakrishnan (UIUC), and Shun Watanabe (TUAT)


SLIDE 6

Private Coin Interactive Protocols

[Figure: two parties, observing X and Y respectively, run an interactive protocol π]

Denote by Π = (Π1, Π2, Π3, ...) the random transcript, with the Markov chains

Π1 — X — Y,   Π2 — (Y, Π1) — X,   Π3 — (X, Π1, Π2) — Y,   · · ·

|π| = depth of the protocol tree
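The message structure above can be sketched in code. This is a toy illustration (the specific bit-update rules are arbitrary assumptions), showing only the constraint that matters: each message depends on the sender's own input, the sender's private coins, and the transcript so far.

```python
import random

def run_protocol(x, y, rounds=3, seed=None):
    """Toy private-coin interactive protocol: the parties alternate, and
    each message depends only on the sender's own input, the sender's
    private randomness, and the transcript so far (the Markov structure
    on the slide). The update rules themselves are arbitrary."""
    rng_x = random.Random(seed)
    rng_y = random.Random(None if seed is None else seed + 1)
    transcript = []
    for i in range(rounds):
        if i % 2 == 0:
            # party observing x speaks; it never looks at y
            bit = (x + sum(transcript) + rng_x.getrandbits(1)) % 2
        else:
            # party observing y speaks; it never looks at x
            bit = (y + sum(transcript) + rng_y.getrandbits(1)) % 2
        transcript.append(bit)
    return transcript
```

With a fixed seed the private coins are reproducible, so the same inputs yield the same transcript.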


SLIDE 8

ǫ-Simulation of a Protocol

[Figure: a simulation protocol πsim produces outputs (Πx, Πy) from (X, Y), mimicking the transcript Π of π]

Definition
A protocol πsim constitutes an ǫ-simulation of π if it produces outputs Πx and Πy at X and Y, respectively, such that

‖PXY ΠΠ − PXY ΠxΠy‖TV ≤ ǫ.

We seek to characterize Dǫ(π|PXY) = min. length of an ǫ-simulation of π.
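The total variation requirement in the definition is easy to check for small alphabets. A minimal sketch (representing distributions as plain dicts is an assumption of this illustration):

```python
def total_variation(p, q):
    """TV distance between two pmfs given as {outcome: prob} dicts."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(w, 0.0) - q.get(w, 0.0)) for w in support)

# A perfect (0-)simulation reproduces the joint law exactly; a sloppy one does not.
ideal = {('x', 't'): 0.5, ('y', 's'): 0.5}
exact = {('x', 't'): 0.5, ('y', 's'): 0.5}
off   = {('x', 't'): 0.4, ('y', 's'): 0.6}
```

Here `total_variation(ideal, exact)` is 0, so `exact` is an ǫ-simulation for every ǫ, while `off` only qualifies for ǫ ≥ 0.1.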


SLIDE 10

ǫ-Compression of a Protocol

[Figure: a compression protocol πcom produces outputs (Πx, Πy) from (X, Y)]

Definition
A protocol πcom constitutes an ǫ-compression of π if it produces outputs Πx and Πy at X and Y, respectively, such that

Pr (Π = Πx = Πy) ≥ 1 − ǫ.

For deterministic protocols, compression ≡ simulation.


SLIDE 14

Information Complexity of π

IC(π) def= I(Π ∧ X | Y) + I(Π ∧ Y | X)

Examples
◮ Π(x, y) = x [Slepian-Wolf ’74]: IC(π) = H(X|Y)
◮ Π(x, y) = (x, y) [Csiszár-Narayan ’04]: IC(π) = H(X|Y) + H(Y|X)

Theorem (Amortized Communication Complexity [BR ’10])
For coordinate-wise repetition π^n of π and i.i.d. (X^n, Y^n),

lim_{ǫ→0} lim_{n→∞} (1/n) Dǫ(π^n | P_{X^n Y^n}) = IC(π).
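The first example (Π(x, y) = x) can be checked numerically. A minimal sketch, assuming a toy doubly symmetric binary source (the pmf below is illustrative, not from the talk): for this protocol I(Π ∧ X|Y) = H(X|Y) and I(Π ∧ Y|X) = 0, so IC(π) = H(X|Y).

```python
from math import log2
from collections import defaultdict

def h_cond(pxy):
    """Conditional entropy H(X|Y) for a joint pmf {(x, y): prob}."""
    py = defaultdict(float)
    for (x, y), p in pxy.items():
        py[y] += p
    return -sum(p * log2(p / py[y]) for (x, y), p in pxy.items() if p > 0)

# X ~ Ber(1/2), Y = X flipped with prob 0.1, so H(X|Y) = h(0.1) ≈ 0.469 bits.
pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
ic_pi = h_cond(pxy)   # IC(π) for the one-shot protocol Π(x, y) = x
```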


SLIDE 17

Questions

◮ Strong converse. Does lim_{n→∞} (1/n) Dǫ(π^n | P_{X^n Y^n}) depend on ǫ?
◮ Mixed protocols. What about a mixed protocol π(n) given by

π(n) = { π_h^n  w.p. p,
         π_l^n  w.p. 1 − p.

Note that IC(π(n)) = n [p IC(π_h) + (1 − p) IC(π_l)].

◮ ... General distributions? Second-order asymptotics? Single-shot?

Why do we care? 42.

SLIDE 18

The Tail of Information Complexity Density


SLIDE 20

Information Complexity Density

ic(τ; x, y) def= log [ P_{Π|XY}(τ|x, y) / P_{Π|X}(τ|x) ] + log [ P_{Π|XY}(τ|x, y) / P_{Π|Y}(τ|y) ]

Note that E[ic(Π; X, Y)] = IC(π).

ǫ-Tails of ic(Π; X, Y) are closely related to Dǫ(π|PXY).
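For a concrete feel, the density and its tail can be computed for the one-shot protocol Π(x, y) = x, where the first log term vanishes and ic(τ; x, y) = −log P_{X|Y}(x|y). The pmf below is an illustrative assumption:

```python
from math import log2
from collections import defaultdict

pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
py = defaultdict(float)
for (x, y), p in pxy.items():
    py[y] += p

def ic_density(x, y):
    """ic(τ; x, y) for Π(x, y) = x reduces to -log P_{X|Y}(x|y)."""
    return -log2(pxy[(x, y)] / py[y])

def ic_tail(lam):
    """Pr(ic(Π; X, Y) >= λ): the ǫ-tail that governs Dǫ(π|PXY)."""
    return sum(p for (x, y), p in pxy.items() if ic_density(x, y) >= lam)

mean_ic = sum(p * ic_density(x, y) for (x, y), p in pxy.items())  # = IC(π)
```

Here `ic_tail(1.0)` picks out exactly the two mismatched pairs, each of mass 0.05, so the tail at λ = 1 is 0.1.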


SLIDE 23

Illustration

Consider the Slepian-Wolf problem (Π(x, y) = x).

◮ ic(τ; x, y) = − log P_{X|Y}(x|y)
◮ If Pr (ic(Π; X, Y) ≥ λ) ≤ ǫ, a random λ-bit hash of X constitutes an ǫ-compression.
◮ If Pr (ic(Π; X, Y) ≥ λ) > ǫ, any subset with prob. ≥ 1 − ǫ has log-cardinality at least λ, so no compression of length less than λ suffices.

[Figure: spectrum of h(X|Y) = − log P_{X|Y}(X|Y), split at λ into an upper tail of prob. > ǫ and a lower part of prob. ≤ ǫ]
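The second bullet can be turned into a working sketch: the sender transmits a random λ-bit hash of X, and the decoder searches the λ-likely set {x : −log P_{X|Y}(x|y) ≤ λ} for a hash match. Everything below (the pmf and the salted-hash construction) is an illustrative assumption, not the talk's construction.

```python
import random
from math import log2

def sw_compress(x, y, pxy, lam, seed=0):
    """Random-hash ǫ-compression sketch for the Slepian-Wolf problem:
    the sender sends a ~λ-bit hash of x; the receiver, knowing y,
    searches the λ-likely set for a candidate matching the hash."""
    nbits = max(1, round(lam))
    salt = random.Random(seed).random()
    h = lambda v: hash((salt, v)) & ((1 << nbits) - 1)  # random nbits-bit hash
    py = sum(p for (xx, yy), p in pxy.items() if yy == y)
    likely = [xx for (xx, yy), p in pxy.items()
              if yy == y and p > 0 and -log2(p / py) <= lam]
    matches = [xx for xx in likely if h(xx) == h(x)]
    return matches[0] if matches else None  # None or wrong value = error

pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
```

With λ = 1, only the high-probability symbol survives the likely-set test, so the decoder recovers X = 0 from Y = 0; decoding fails exactly on the ǫ-tail, matching the bullets above.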

SLIDE 24

Main Results


SLIDE 26

Lower Bound

Theorem
Given 0 ≤ ǫ < 1 and a protocol π,

Dǫ(π) ≳ sup{λ : Pr (ic(Π; X, Y) > λ) ≥ ǫ}.

Weaknesses.
◮ The fudge parameters are of the order log(spectrum width).
◮ Uses only the joint pmf, not the structure of the protocol.


SLIDE 28

Upper Bound

Theorem
Given 0 ≤ ǫ < 1 and a bounded-rounds protocol π,

Dǫ(π) ≲ sup{λ : Pr (ic(Π; X, Y) > λ) ≤ ǫ}.

[Figure: distribution of ic(Π; X, Y), with the lower-bound threshold (tail prob. ≥ ǫ) and upper-bound threshold (tail prob. ≤ ǫ) marked]

Weaknesses.
◮ The fudge parameters depend on the number of rounds.
◮ Protocol based on round-by-round compression.

SLIDE 31

Questions

◮ Strong converse. Does lim_{n→∞} (1/n) Dǫ(π^n | P_{X^n Y^n}) depend on ǫ?

Answer. No. In fact,

Dǫ(π^n) = n IC(π) + √(n V(ic(Π; X, Y))) Q^{-1}(ǫ) + o(√n).

◮ Mixed protocols. What about a mixed protocol π(n) given by

π(n) = { π_h^n  w.p. p,
         π_l^n  w.p. 1 − p.

Note that IC(π(n)) = n [p IC(π_h) + (1 − p) IC(π_l)].

Answer.

lim_{ǫ→0} lim sup_{n→∞} (1/n) Dǫ(π(n)) = IC(π_h).
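The second-order formula is easy to evaluate numerically. A sketch for the toy one-shot protocol Π(x, y) = x (the pmf, n, and ǫ below are illustrative assumptions), with Q^{-1} obtained from the standard normal quantile:

```python
from math import log2, sqrt
from statistics import NormalDist
from collections import defaultdict

pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
py = defaultdict(float)
for (x, y), p in pxy.items():
    py[y] += p

# ic(Π; X, Y) for Π(x, y) = x, its mean IC(π), and its variance V.
ic = {(x, y): -log2(pxy[(x, y)] / py[y]) for (x, y) in pxy}
IC = sum(p * ic[k] for k, p in pxy.items())
V = sum(p * (ic[k] - IC) ** 2 for k, p in pxy.items())

def d_eps(n, eps):
    """Second-order estimate n*IC(π) + sqrt(n*V)*Q^{-1}(ǫ), o(√n) dropped."""
    return n * IC + sqrt(n * V) * NormalDist().inv_cdf(1 - eps)
```

At ǫ = 1/2 the dispersion term vanishes (Q^{-1}(1/2) = 0), recovering the first-order answer n·IC(π); smaller ǫ pushes the cost strictly above it.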


SLIDE 35

Function Computation

[BR ’10], [MI ’10]: lim_{ǫ→0} lim_{n→∞} (1/n) Dǫ(f^n) = IC(f).

◮ Strong converse? Our bound yields

lim_{n→∞} (1/n) Dǫ(f^n) ≥ H(f(X, Y)|X) + H(f(X, Y)|Y).

◮ Direct product or Arimoto converse?

[BRWY ’13], [BW ’14]: |π^n| < n IC(f) poly(log n) ⇒ Pr (F = Fx = Fy) ≤ e^{−n^c} for all n large.

Our bound yields a threshold of n [H(F|X) + H(F|Y)].


SLIDE 38

Separation of Dǫ(π) and IC(π)

[BBCR ’10]: Dǫ(π) ≤ Õ(√(|π| IC(π)))
[B ’12]: Dǫ(π) ≤ 2^{O(IC(π))}

Arbitrary separation possible for vanishing ǫ:

π(x, y) = { a       if x > δ2^n, y > δ2^n
            b       if x > δ2^n, y ≤ δ2^n
            c       if x ≤ δ2^n, y > δ2^n
            (x, y)  if x ≤ δ2^n, y ≤ δ2^n

For (X, Y) random n-bit strings, δ = 1/n, and ǫ = 1/n²,

IC(π) = O(n^{−2}) ≪ Dǫ(π) = Ω(2n).

[GKR ’13]: example with exponential separation even for ǫ fixed!

SLIDE 39

Proof Sketch

SLIDE 40

Simulation Scheme: The Compression Step

[Figure: spectrum of h(Π1|Y) with Hmin(Π1|Y), slabs of width ∆, and guess lists Tj marked]

h_i ≡ { Send an Hmin(Π1|Y)-bit random hash of Π1,  i = 1,
        Send a ∆-bit random hash of Π1,            2 ≤ i ≤ N.

First party sends hash bits h_i(t) successively until it receives an ACK or i = N.
Second party sends an ACK when it finds a t̂ s.t. (t̂, y) ∈ T_i and h_j(t̂) = h_j(t), 1 ≤ j ≤ i.
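The compression step above can be sketched in code. This is a loose illustration, not the talk's exact scheme: the receiver's guess lists T_i and the salted-hash construction are stand-in assumptions, and the real slab sizes come from the spectrum of h(Π1|Y).

```python
import random

def hash_and_ack(t, guess_lists, first_bits, delta_bits, seed=0):
    """Sketch of round-by-round compression of one message Π1 = t.
    The sender reveals random hash slabs of t: first_bits bits in round 1,
    then delta_bits more per round. After each slab the receiver scans its
    i-th guess list T_i (candidates consistent with y) and ACKs the first
    candidate agreeing with every slab revealed so far."""
    rng = random.Random(seed)
    N = len(guess_lists)
    sizes = [first_bits] + [delta_bits] * (N - 1)
    salts = [rng.random() for _ in range(N)]

    def h(j, v):
        # j-th random hash: sizes[j] low-order bits of a salted hash of v
        return hash((salts[j], v)) & ((1 << sizes[j]) - 1)

    revealed = []                        # hash slabs of the true t sent so far
    for i in range(N):
        revealed.append(h(i, t))         # sender transmits slab i
        for cand in guess_lists[i]:      # receiver scans its guess list T_i
            if all(h(j, cand) == revealed[j] for j in range(i + 1)):
                return cand              # ACK: receiver's estimate of Π1
    return None                          # no ACK within N rounds
```

The communication spent is at most first_bits + (N − 1)·delta_bits hash bits plus the ACKs, mirroring the Hmin(Π1|Y) + (N − 1)∆ budget of the slide.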

SLIDE 41

Simulation Scheme: Compression to Simulation

◮ Generate Π1 s.t. public coins can be treated as a hash of Π1.
◮ Since this hash must be independent of (X, Y), we can do this only for Hmin(Π1|XY) = Hmin(Π1|X) bits.
◮ Reduces the number of bits to be communicated from h(Π1|Y) to h(Π1|Y) − h(Π1|X).

SLIDE 42

Lower Bound Proof: Super Sparse Version

◮ Based on reduction to secret key agreement with public discussion.
◮ We can compress since the parties agree on more bits L than the communicated bits R.
◮ S ≡ max. length of a secret key that can be generated:

L − R ≤ S ⇔ L − S ≤ R.

SLIDE 43

In closing ...

The information spectrum method is a promising approach for studying communication complexity.

Open Problems:
◮ Strong converse and Arimoto converse for function computation
◮ Converse for [BBCR ’10]
◮ Practical/universal versions of simulation algorithms
◮ Multiparty version