

  1. Data Streams & Communication Complexity, Lecture 3: Communication Complexity and Lower Bounds. Andrew McGregor, UMass Amherst.

  2. Basic Communication Complexity
     ◮ Three friends Alice, Bob, and Charlie each have some information x, y, z, and Charlie wants to compute some function P(x, y, z).
     [Diagram: Alice (x) sends m_1 to Bob (y), who sends m_2 to Charlie (z), who announces out.]
     ◮ To help Charlie, Alice sends a message m_1 to Bob, and then Bob sends a message m_2 to Charlie.
     ◮ Question: How large must the total length of the messages be for Charlie to evaluate P(x, y, z) correctly?
     ◮ Deterministic: m_1(x), m_2(m_1, y), out(m_2, z) = P(x, y, z). (A small code sketch of this message structure follows below.)
     ◮ Randomized: m_1(x, r), m_2(m_1, y, r), out(m_2, z, r), where r is a public random string. Require Pr[out(m_2, z, r) = P(x, y, z)] ≥ 9/10.
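The following is a minimal Python sketch, not from the slides, of the deterministic one-way message structure m_1(x), m_2(m_1, y), out(m_2, z). The task P(x, y, z) = (x + y + z) mod 2 and all function names are hypothetical, chosen only to make the three-message pattern concrete.

```python
# Hypothetical example: P(x, y, z) = (x + y + z) mod 2 with x, y, z in {0, 1}.
# Only the message structure matters: m_1 sees x, m_2 sees (m_1, y),
# and the output sees (m_2, z).

def alice_msg(x):
    """m_1 depends only on Alice's input x (1 bit here)."""
    return x % 2

def bob_msg(m1, y):
    """m_2 depends on the received m_1 and Bob's input y (1 bit here)."""
    return (m1 + y) % 2

def charlie_out(m2, z):
    """Charlie's output depends on m_2 and his input z."""
    return (m2 + z) % 2

def run_protocol(x, y, z):
    m1 = alice_msg(x)          # Alice -> Bob
    m2 = bob_msg(m1, y)        # Bob -> Charlie
    return charlie_out(m2, z)  # Charlie announces the answer

# Total communication: 2 bits, and the output is always correct.
assert all(run_protocol(x, y, z) == (x + y + z) % 2
           for x in (0, 1) for y in (0, 1) for z in (0, 1))
```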

  3. Stream Algorithms Yield Communication Protocols
     ◮ Let Q be some stream problem. Suppose there's a reduction x → S_1, y → S_2, z → S_3 such that knowing Q(S_1 ∘ S_2 ∘ S_3) solves P(x, y, z).
     [Diagram: Alice (x, stream S_1) sends m_1 to Bob (y, stream S_2), who sends m_2 to Charlie (z, stream S_3), who announces out.]
     ◮ An s-bit stream algorithm A for Q yields a 2s-bit protocol for P: Alice runs A on S_1; sends the memory state to Bob; Bob instantiates A with that state and runs it on S_2; sends the state to Charlie, who finishes running A on S_3 and infers P(x, y, z) from Q(S_1 ∘ S_2 ∘ S_3). (See the sketch below.)
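Below is a small Python sketch of the simulation just described, under assumed toy choices: the stream problem Q is taken to be "sum of the stream items", so the algorithm's entire memory state is one number, and the class and variable names are illustrative rather than anything from the lecture.

```python
# Toy instance, assumed for illustration: Q = sum of the stream, and the
# reduction from (x, y, z) to stream segments is the identity.

class SumAlgorithm:
    """Toy streaming algorithm A whose whole memory state is one counter."""
    def __init__(self, state=0):
        self.state = state            # the memory state that gets communicated

    def process(self, stream):
        for item in stream:
            self.state += item
        return self.state             # equals Q(S_1 ∘ S_2 ∘ S_3) after the last segment

S1, S2, S3 = [1, 2, 3], [4, 5], [6]   # the three players' stream segments

m1 = SumAlgorithm().process(S1)           # Alice runs A on S_1, sends its state as m_1
m2 = SumAlgorithm(state=m1).process(S2)   # Bob resumes A from m_1 on S_2, sends m_2
out = SumAlgorithm(state=m2).process(S3)  # Charlie finishes A on S_3

assert out == sum(S1 + S2 + S3)       # Charlie has computed Q(S_1 ∘ S_2 ∘ S_3)
```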

  4. Communication Lower Bounds Imply Stream Lower Bounds
     ◮ Had there been t players, the s-bit stream algorithm for Q would have led to a (t − 1)s-bit protocol for P.
     ◮ Hence, a lower bound of L on the communication required for P implies that s ≥ L/(t − 1) bits of space are required to solve Q. (A worked instance follows below.)
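A quick worked instance of this implication, as a sanity check, using the two-player case that appears in the median reduction later in the lecture:

```latex
s \;\ge\; \frac{L}{t-1}
\qquad\text{e.g. } t = 2,\ L = \Omega(n)
\;\Longrightarrow\; s \;\ge\; \frac{\Omega(n)}{1} \;=\; \Omega(n).
```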

  5. Outline of Lecture
     Classic Problems and Reductions
     Information Statistics Approach
     Hamming Approximation

  6. Outline
     Classic Problems and Reductions
     Information Statistics Approach
     Hamming Approximation

  7. Indexing
     ◮ Consider a binary string x ∈ {0, 1}^n and j ∈ [n], e.g., x = (0 1 0 1 1 0) and j = 3, and define Index(x, j) = x_j.
     ◮ Suppose Alice knows x and Bob knows j.
     ◮ How many bits need to be sent by Alice for Bob to determine Index(x, j) with probability 9/10? Ω(n). (See the sketch below.)
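A minimal Python sketch of the Index problem on the slide's example, together with the trivial protocol in which Alice simply sends all n bits of x, matching the Ω(n) lower bound up to constants; the helper names are mine, not from the slides.

```python
# Illustrative only: Index and the trivial n-bit one-way protocol.

def index(x, j):
    """Index(x, j) = x_j, with j 1-based as in the slides."""
    return x[j - 1]

x = [0, 1, 0, 1, 1, 0]        # Alice's input (the slide's example string)
j = 3                         # Bob's input

message = x                   # trivial protocol: Alice sends all n bits
assert index(message, j) == 0 # Bob reads off x_3 = 0
```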

  8. Application: Median Finding
     ◮ Thm: Any algorithm that returns the exact median of a length-(2n − 1) stream requires Ω(n) memory.
     ◮ Reduction from Index: On input x ∈ {0, 1}^n, Alice generates S_1 = {2i + x_i : i ∈ [n]}. On input j ∈ [n], Bob generates S_2 = {n − j copies of 0 and j − 1 copies of 2n + 2}. E.g.,
         x = (0 1 0 1 1 0) → {2, 5, 6, 9, 11, 12}
         j = 3 → {0, 0, 0, 14, 14}
     ◮ Then median(S_1 ∪ S_2) = 2j + x_j, and this determines Index(x, j).
     ◮ An s-space algorithm implies an s-bit protocol, so s = Ω(n) by the communication complexity of Indexing. (See the sketch below.)
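Here is a short Python sketch of the Index-to-median reduction on the slide's example (x = 010110, n = 6, j = 3); the function names are illustrative. It checks that the exact median equals 2j + x_j, whose parity reveals Index(x, j).

```python
# Illustrative sketch of the reduction from the slide; names are mine.

def alice_stream(x):
    # S_1 = {2i + x_i : i in [n]}, with i 1-based
    return [2 * i + xi for i, xi in enumerate(x, start=1)]

def bob_stream(j, n):
    # S_2 = {n - j copies of 0 and j - 1 copies of 2n + 2}
    return [0] * (n - j) + [2 * n + 2] * (j - 1)

x, j = [0, 1, 0, 1, 1, 0], 3
n = len(x)

S = alice_stream(x) + bob_stream(j, n)   # combined stream of length 2n - 1 = 11
median = sorted(S)[(len(S) - 1) // 2]    # exact median of the stream

assert median == 2 * j + x[j - 1]        # median = 2j + x_j = 6
assert median % 2 == x[j - 1]            # ... so the median reveals Index(x, j)
```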

  9. Multi-Party Set-Disjointness
     ◮ Consider a t × n matrix where each column has weight 0, 1, or t, e.g.,
         C = [ 0 0 0 1 0 0
               1 0 0 1 1 0
               0 1 0 1 0 0
               0 0 0 1 0 0 ]
       and let Disj_t(C) = 1 if there is an all-1s column and 0 otherwise.
     ◮ Consider t players where P_i knows the i-th row of C.
     ◮ How many bits need to be communicated between the players to determine Disj_t(C)? Ω(n/t). (See the sketch below.)
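To make the definition concrete, a tiny Python sketch of Disj_t evaluated on the slide's 4 × 6 example matrix; this only spells out the definition (it is not a communication protocol), and the function name is mine.

```python
# Illustrative only: Disj_t(C) = 1 iff some column of C is all ones.

def disj(C):
    t, n = len(C), len(C[0])
    return int(any(all(C[i][j] == 1 for i in range(t)) for j in range(n)))

C = [[0, 0, 0, 1, 0, 0],
     [1, 0, 0, 1, 1, 0],
     [0, 1, 0, 1, 0, 0],
     [0, 0, 0, 1, 0, 0]]

assert disj(C) == 1           # column 4 is all ones, so Disj_t(C) = 1
```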
