Lower Bounds for Number-in-Hand Multiparty Communication Complexity
Jeff M. Phillips (Univ. of Utah); Elad Verbin, Qin Zhang (CTIC/MADALGO, Aarhus Univ.)
SODA 2012, Kyoto, Jan. 17, 2012


  1. Lower Bounds for Number-in-Hand Multiparty Communication Complexity. Jeff M. Phillips (Univ. of Utah); Elad Verbin, Qin Zhang (CTIC/MADALGO, Aarhus Univ.). SODA 2012, Kyoto, Jan. 17, 2012.

  2. The multiparty communication model. x_1 = 010011, x_2 = 111011, x_3 = 111111, …, x_k = 100011.

  3. The multiparty communication model. x_1 = 010011, x_2 = 111011, x_3 = 111111, …, x_k = 100011. We want to compute f(x_1, x_2, …, x_k); f can be bitwise XOR, OR, AND, MAJ, …

  4. The multiparty communication model. Blackboard: one player speaks, everyone else hears. Message passing: if player 1 talks to player 2, the others cannot hear (today's focus). x_1 = 010011, x_2 = 111011, x_3 = 111111, …, x_k = 100011. We want to compute f(x_1, x_2, …, x_k); f can be bitwise XOR, OR, AND, MAJ, …

  5. Related work. So natural that it must have been studied?

  6. Related work. So natural that it must have been studied? The blackboard model: quite a few works. The message-passing model: almost nothing.

  7. Related work. So natural that it must have been studied? The blackboard model: quite a few works. The message-passing model: almost nothing. Back to the "ancient" times: "Lower bounds on the multiparty communication complexity" by Duris and Rolim '98, which gives some deterministic lower bounds.

  8. Related work. So natural that it must have been studied? The blackboard model: quite a few works. The message-passing model: almost nothing. Back to the "ancient" times: "Lower bounds on the multiparty communication complexity" by Duris and Rolim '98, which gives some deterministic lower bounds. Gál and Gopalan on "longest increasing subsequence", '07, and Guha and Huang on "random order streams", '09. Both use a "private message" model, but it is different from ours.

  9. Our results. 1. Ω(nk) for k-bitwise-XOR/OR/AND/MAJ. 2. Ω(n log k) for k-bitwise-AND/OR in the blackboard model. 3. Ω̃(nk) for k-connectivity. All tight, and all hold for randomized algorithms.

  10. Our results. 1. Ω(nk) for k-bitwise-XOR/OR/AND/MAJ. 2. Ω(n log k) for k-bitwise-AND/OR in the blackboard model. 3. Ω̃(nk) for k-connectivity. All tight, and all hold for randomized algorithms. Artificial? Well, some interesting problems can be reduced to these (later).

  11. Warm up: k-bitwise-XOR. [Figure: a k×n bit matrix; site S_i holds the row (A_{i,1}, A_{i,2}, …, A_{i,n}); the goal is the bitwise XOR of the k rows.]

  12. 2-XOR ⇒ k-XOR. [Figure: players x_1, x_2, x_3, x_4, x_5.]

  13. 2-XOR ⇒ k-XOR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k.
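The 2C/k bound on the slide is the usual averaging argument; spelled out (a sketch of the standard reasoning, not taken verbatim from the talk):

```latex
% Every transmitted bit appears in the totals of exactly two players
% (its sender and its receiver), hence
\sum_{i=1}^{k} \mathrm{CC}(x_i : \text{others}) \le 2C .
% Choosing the player $i$ uniformly at random then gives
\mathbb{E}_i\!\left[\mathrm{CC}(x_i : \text{others})\right] \le \frac{2C}{k}.
```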

  14. 2-XOR ⇒ k-XOR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. Alice and Bob want to solve 2-XOR (the inputs are drawn uniformly at random from {0,1}^n) ⇒ run a protocol for k-XOR as follows:

  15. 2-XOR ⇒ k-XOR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. Alice and Bob want to solve 2-XOR (the inputs are drawn uniformly at random from {0,1}^n) ⇒ run a protocol for k-XOR as follows: Alice plays a random player with her input. Bob plays another random player with his input; he also plays the other k − 2 players with random inputs from {0,1}^n.

  16. 2-XOR ⇒ k-XOR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. Alice and Bob want to solve 2-XOR (the inputs are drawn uniformly at random from {0,1}^n) ⇒ run a protocol for k-XOR as follows: Alice plays a random player with her input. Bob plays another random player with his input; he also plays the other k − 2 players with random inputs from {0,1}^n. Note: the inputs of all k players are symmetric.

  17. 2-XOR ⇒ k-XOR. E[CC(2-XOR)] ≤ (2/k) · CC(k-XOR); since 2-XOR requires Ω(n), k-XOR requires Ω(nk). Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. Alice and Bob want to solve 2-XOR (the inputs are drawn uniformly at random from {0,1}^n) ⇒ run a protocol for k-XOR as follows: Alice plays a random player with her input. Bob plays another random player with his input; he also plays the other k − 2 players with random inputs from {0,1}^n. Note: the inputs of all k players are symmetric.
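The embedding above can be sketched in code. `k_xor_output` below is a hypothetical stand-in for any k-XOR protocol (we only model its output, not its messages); the point is that Bob knows every input except Alice's, so from the k-XOR answer he recovers x ⊕ y:

```python
import random

def k_xor_output(inputs):
    """Stand-in for a black-box k-XOR protocol: returns the bitwise
    XOR of all k inputs (only the protocol's output is modeled)."""
    out = 0
    for v in inputs:
        out ^= v
    return out

def solve_2_xor(x, y, n, k):
    """Alice holds x, Bob holds y (n-bit integers). They embed themselves
    as two random players; Bob also plays the remaining k - 2 players
    with fresh uniform inputs, so all k inputs look identically
    distributed to the protocol."""
    a, b = random.sample(range(k), 2)          # Alice's and Bob's seats
    inputs = [random.getrandbits(n) for _ in range(k)]
    inputs[a], inputs[b] = x, y
    f = k_xor_output(inputs)                   # run the k-XOR protocol
    # Bob knows every input except Alice's and cancels them from f,
    # leaving x; XORing with his own y gives the 2-XOR answer.
    known = 0
    for i in range(k):
        if i != a:
            known ^= inputs[i]
    return (f ^ known) ^ y
```

The seat assignment and helper names are illustrative; the argument on the slides only needs that the two embedded players are distributed like the others, so a protocol of total cost C charges them expected cost 2C/k.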

  18. k-bitwise-OR. [Figure: a k×n bit matrix; site S_i holds the row (A_{i,1}, A_{i,2}, …, A_{i,n}); the goal is the bitwise OR of the k rows.]

  19. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k.

  20. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k. Hard for k = 2, but not for general k.

  21. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k. Hard for k = 2, but not for general k. Second attempt: randomly partition the n coordinates into two equal-sized sets. Important set: each entry is 1 w.p. 1/k. Balancing set: all entries are 1.

  22. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k. Hard for k = 2, but not for general k. Second attempt: randomly partition the n coordinates into two equal-sized sets. Important set: each entry is 1 w.p. 1/k. Balancing set: all entries are 1. Seems hard but, wait: Slepian-Wolf coding!

  23. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k. Hard for k = 2, but not for general k. Second attempt: randomly partition the n coordinates into two equal-sized sets. Important set: each entry is 1 w.p. 1/k. Balancing set: all entries are 1. Seems hard but, wait: Slepian-Wolf coding! Third attempt: same as the second, except in the balancing set each entry is 1 w.p. 1/2.

  24. Intuition: reduce from 2-DISJ. As always, first try to find the hard distribution for k-OR! First attempt: each coordinate is 1 w.p. 1/k. Hard for k = 2, but not for general k. Second attempt: randomly partition the n coordinates into two equal-sized sets. Important set: each entry is 1 w.p. 1/k. Balancing set: all entries are 1. Seems hard but, wait: Slepian-Wolf coding! Third attempt: same as the second, except in the balancing set each entry is 1 w.p. 1/2. It works! Now Alice takes one vector; Bob takes the other k − 1 vectors, ORs them together, and then takes the complement. This looks like 2-DISJ.
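A small sampler makes the third-attempt distribution concrete. The half/half split and the per-coordinate probabilities follow the slide; the function name and the use of Python are illustrative:

```python
import random

def sample_hard_input(n, k, important=None):
    """Sample one player's n-bit vector from the 'third attempt'
    distribution: half the coordinates form the important set (each bit
    is 1 w.p. 1/k), the other half the balancing set (each bit is 1
    w.p. 1/2). In the construction the same random partition is shared
    by all k players, so it can be passed in via `important`."""
    if important is None:
        coords = list(range(n))
        random.shuffle(coords)
        important = set(coords[: n // 2])
    bits = [1 if random.random() < (1 / k if i in important else 0.5) else 0
            for i in range(n)]
    return bits, important
```

To draw all k inputs, sample the partition once and reuse it: `bits1, part = sample_hard_input(n, k)` then `bits2, _ = sample_hard_input(n, k, important=part)`, and so on.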

  25. 2-DISJ ⇒ k-OR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. 2-DISJ: Alice has x ⊆ [n] and Bob has y ⊆ [n]. W.p. 1/4, x and y are random subsets of [n] of size n/4 with |x ∩ y| = 1; w.p. 3/4, x and y are random subsets of [n] of size n/4 with x ∩ y = ∅.

  26. 2-DISJ ⇒ k-OR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. 2-DISJ: Alice has x ⊆ [n] and Bob has y ⊆ [n]. W.p. 1/4, x and y are random subsets of [n] of size n/4 with |x ∩ y| = 1; w.p. 3/4, x and y are random subsets of [n] of size n/4 with x ∩ y = ∅. Alice plays a random player with her input x. Bob plays the other k − 1 players with his input y.

  27. 2-DISJ ⇒ k-OR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. 2-DISJ: Alice has x ⊆ [n] and Bob has y ⊆ [n]. W.p. 1/4, x and y are random subsets of [n] of size n/4 with |x ∩ y| = 1; w.p. 3/4, x and y are random subsets of [n] of size n/4 with x ∩ y = ∅. Alice plays a random player with her input x. Bob plays the other k − 1 players with his input y. Again: the inputs of all k players are symmetric.

  28. 2-DISJ ⇒ k-OR. E[CC(2-DISJ)] ≤ (2/k) · CC(k-OR); Razborov [90] gives Ω(n) for 2-DISJ, hence Ω(nk) for k-OR. Pick a random player, say x_4. If the total CC is C, then the expected CC(x_4 : others) is at most 2C/k. 2-DISJ: Alice has x ⊆ [n] and Bob has y ⊆ [n]. W.p. 1/4, x and y are random subsets of [n] of size n/4 with |x ∩ y| = 1; w.p. 3/4, x and y are random subsets of [n] of size n/4 with x ∩ y = ∅. Alice plays a random player with her input x. Bob plays the other k − 1 players with his input y. Again: the inputs of all k players are symmetric.
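The core identity behind the reduction can be checked mechanically: if Alice's player holds the indicator vector of x and Bob's k − 1 players together OR to the complement of y, then the k-OR output is 1 everywhere outside y and agrees with x inside y, so counting zeros distinguishes |x ∩ y| = 1 from x ∩ y = ∅. A toy check; how Bob's vectors are arranged here (one player gets the whole complement) ignores the distributional details of the actual hard instance and is purely illustrative:

```python
def k_or_output(vectors):
    """Bitwise OR of all players' vectors, each given as a set of
    coordinates where that player's bit is 1."""
    out = set()
    for v in vectors:
        out |= v
    return out

def reduce_disj(x, y, n, k):
    """Alice's player holds x; Bob's k - 1 players are arranged so that
    the OR of their vectors equals the complement of y (here: one player
    holds the complement, the rest hold empty sets)."""
    comp_y = set(range(n)) - y
    vectors = [x] + [comp_y] + [set()] * (k - 2)
    out = k_or_output(vectors)
    # The zeros of the output are exactly the coordinates of y not in x:
    # |y| zeros when disjoint, |y| - 1 zeros when |x ∩ y| = 1.
    zeros = n - len(out)
    return "intersecting" if zeros == len(y) - 1 else "disjoint"
```

Under the promise |x ∩ y| ≤ 1, the zero count fully determines the 2-DISJ answer, which is why a k-OR protocol solves the two-party problem.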

  29. Summary of other results. 1. Ω(nk) for MAJ. 2. Ω(n log k) for AND and OR in the blackboard model. 3. Ω̃(nk) for k-connectivity (one of the main technical contributions). 4. Some direct sum results. 5. Some applications, e.g., the heavy hitters problem and ε-kernels in the site-server model (next page).

  30. Motivation. [Figure: the distributed streaming model: a coordinator C connected to sites S_1, S_2, S_3, …, S_k, with inputs arriving over time.]
