Gap-Hamming-Distance: The Journey to an Optimal Lower Bound
Amit Chakrabarti, Dartmouth College
Main result joint with Oded Regev, Tel Aviv University
Sublinear Algorithms Workshop at Bertinoro, May 2011
The Gap-Hamming-Distance Problem

Input: Alice gets x ∈ {0,1}^n, Bob gets y ∈ {0,1}^n
Output:
• ghd(x, y) = 1 if Δ(x, y) > n/2 + √n
• ghd(x, y) = 0 if Δ(x, y) < n/2 − √n
Want: randomized, constant-error protocol
Cost: worst-case number of bits communicated

Example:
x = 0 1 0 0 1 0 1 1 0 0 0 1
y = 0 0 0 0 0 0 1 1 1 0 0 1
n = 12; Δ(x, y) = 3 ∈ [6 − √12, 6 + √12] (inside the gap, so neither output is required)
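The definitions above can be checked with a minimal Python sketch (not from the talk; the function names are illustrative):

```python
def hamming(x, y):
    """Hamming distance Delta(x, y) between equal-length 0/1 lists."""
    return sum(a != b for a, b in zip(x, y))

def ghd(x, y):
    """Gap-Hamming-Distance under the promise |Delta(x,y) - n/2| > sqrt(n).

    Returns 1 if Delta(x, y) > n/2 + sqrt(n),
            0 if Delta(x, y) < n/2 - sqrt(n),
            None inside the gap (promise violated, no output required).
    """
    n = len(x)
    d = hamming(x, y)
    if d > n / 2 + n ** 0.5:
        return 1
    if d < n / 2 - n ** 0.5:
        return 0
    return None

# The slide's example: n = 12 and Delta(x, y) = 3 lies in [6 - sqrt(12), 6 + sqrt(12)]
x = [0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
y = [0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1]
print(hamming(x, y))  # 3 -- inside the gap, so ghd(x, y) is None
```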
Implications

Data stream lower bounds
• Distinct elements
• Frequency moments
• Norms
• Entropy
• General form of bound: p · s = Ω(1/ε²), for p passes and s bits of space

Distributed functional monitoring lower bounds
Connections to differential privacy
The Reductions

E.g., Distinct Elements (other problems: similar)

Alice: x ↦ σ = ⟨(1, x₁), (2, x₂), …, (n, xₙ)⟩
Bob: y ↦ τ = ⟨(1, y₁), (2, y₂), …, (n, yₙ)⟩

Notice: F₀(σ ∘ τ) = n + Δ(x, y), which is > 3n/2 + √n or < 3n/2 − √n. Set ε = 1/√n.
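The F₀ identity behind this reduction is easy to sanity-check in Python (an illustrative sketch; names are not from the talk):

```python
def to_stream(v):
    """Map a bit-vector v to the stream <(1, v_1), ..., (n, v_n)>."""
    return [(i + 1, b) for i, b in enumerate(v)]

def f0(stream):
    """F_0 = number of distinct elements of a stream."""
    return len(set(stream))

x = [0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
y = [0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1]
n = len(x)
delta = sum(a != b for a, b in zip(x, y))

# Coordinate i contributes one distinct pair when x_i = y_i and two when
# they differ, so F_0 of the concatenated stream is exactly n + Delta(x, y).
print(f0(to_stream(x) + to_stream(y)))  # 15 = 12 + 3
```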
Ancient History
One-Pass Bounds

Indyk, Woodruff [FOCS 2003]
• Considered one-pass lower bound for dist-elem
• Recognized relevance of ghd, difficulty of lower-bounding
• Defined “related” problem Π_{ℓ₂}, showed R→(Π_{ℓ₂}) = Ω(n)
• Concluded Ω(ε⁻²) bound for dist-elem_{m,ε} with m = Ω̃(1/ε⁹)

Woodruff [SODA 2004]
• Worked with ghd itself, showed R→(ghd) = Ω(n)
• Very intricate combinatorial proof, with hairy probability estimations
• Conjectured R(ghd) = Ω(n), implying multi-pass lower bounds
The VC-Dimension Technique

• Consider the communication matrix of ghd as a set system
• The system has VC-dimension Ω(n)
[figure: 0/1 communication matrix with a shattered set of columns forming an instance of INDEX]
• Thus, R→(ghd) = Ω(n)
The Middle Ages
A Nice Simplification

Jayram, Kumar, Sivakumar [circa 2005]
• Simpler proof of R→(ghd) = Ω(n)
• Much simpler: direct reduction from index
• Geometric intuition:
  Alice: x ∈ {0,1}^n ↦ x̃ ∈ {−1/√n, 1/√n}^n ⊆ ℝ^n
  Bob: j ∈ [n] ↦ e_j = (0, …, 0, 1, 0, …, 0) ∈ ℝ^n
• Observe: ⟨x̃, e_j⟩ ≉ 0, and x_j is determined by sgn⟨x̃, e_j⟩
• We’ve reduced index to “gap-inner-product”, or gip
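The geometric reduction above can be verified in a few lines of Python (a sketch with illustrative names; indices are 0-based here, unlike the slide):

```python
import math

def alice_map(x):
    """x in {0,1}^n  ->  x~ in {-1/sqrt(n), +1/sqrt(n)}^n."""
    n = len(x)
    return [(1 if b else -1) / math.sqrt(n) for b in x]

def bob_map(j, n):
    """Index j  ->  standard basis vector e_j."""
    return [1.0 if i == j else 0.0 for i in range(n)]

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

# <x~, e_j> equals +-1/sqrt(n), never 0, and its sign reveals the bit x_j:
x = [0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
n = len(x)
ok = all((inner(alice_map(x), bob_map(j, n)) > 0) == (x[j] == 1)
         for j in range(n))
print(ok)  # True
```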
Inner Product ↔ Hamming Distance

• Obviously, ghd → gip:
  ⟨x̃, ỹ⟩ = 1 − 2Δ(x, y)/n
  Δ(x, y) ≶ n/2 ± √n ⇒ ⟨x̃, ỹ⟩ ≷ ∓2/√n
• Also, gip → ghd by “discretization transform”: pick random Gaussians r₁, …, r_N, with N = 10n
  Alice: x̄ ∈ ℝ^n ↦ x = (sgn⟨x̄, r₁⟩, …, sgn⟨x̄, r_N⟩) ∈ {±1}^N
  Bob: ȳ ∈ ℝ^n ↦ y = (sgn⟨ȳ, r₁⟩, …, sgn⟨ȳ, r_N⟩) ∈ {±1}^N
  ⟨x̄, ȳ⟩ ≷ ∓1/√n ⇒ whp Δ(x, y) ≶ N/2 ± O(√N)
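The discretization transform can be sketched in Python: signs of Gaussian projections turn real vectors into ±1 strings whose Hamming distance tracks the angle between the originals (an illustrative sketch, not the talk's code; the concentration check uses a generous tolerance):

```python
import random

def sign_sketch(v, rs):
    """Discretize v in R^n to {+1, -1}^N via signs of Gaussian projections."""
    return [1 if sum(a * b for a, b in zip(v, r)) >= 0 else -1
            for r in rs]

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

random.seed(0)
n = 50
N = 10 * n  # as on the slide
rs = [[random.gauss(0, 1) for _ in range(n)] for _ in range(N)]

# Orthogonal inputs (inner product 0) land near Hamming distance N/2:
e1 = [1.0] + [0.0] * (n - 1)
e2 = [0.0, 1.0] + [0.0] * (n - 2)
d = hamming(sign_sketch(e1, rs), sign_sketch(e2, rs))
print(abs(d - N / 2) < 4 * N ** 0.5)  # concentration around N/2
```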
The Renaissance Era
Round Elimination

Brody, Chakrabarti [CCC 2009]
• Can we at least rule out a two-pass improvement for dist-elem?
• A cheap first message makes little progress? Then rinse, repeat
• Tends to decimate problem [Miltersen–Nisan–Safra–Wigderson ’98] [Sen ’03]
[figure: a k-round protocol on a padded input yields a (k−1)-round protocol on a smaller input]
Another VC-Dimension Argument: Subcube Lifting

First message constant on a large set: 2^{0.99n} points
S: inner coords, the real input (rest: outer coords, padding)
[figure: inputs sharing the first message, with the inner coordinates S highlighted]
Alice, Bob lift their (n/3)-dim inputs from inner coords to full n-dim space
First message now redundant, so eliminate! [Brody–C. ’09]
Better Round Elimination

Brody, Chakrabarti, Regev, Vidick, de Wolf [RANDOM 2010]
• Previous argument reduced dimension too rapidly
• Gives R^k(ghd) = n/2^{O(k²)}
• Can improve to R^k(ghd) = n/O(k²)
Round Elimination V2.0: Geometric Perturbation

First message constant over large set A ⊆ {0,1}^n
[figure: x and its nearest neighbour z in A, with y and the error region ERR]
Alice: replace x with z = NearestNeighbour(x, A)
Modern History
Main Theorem

Chakrabarti, Regev [STOC 2011]
And now, we show: R(ghd) = Ω(n)
The Rectangle Property

Input universe U = {0,1}^n × {0,1}^n
A deterministic protocol P communicating ≤ c bits partitions U into ≤ 2^c rectangles A_i × B_i, where A_i, B_i ⊆ {0,1}^n (rows indexed by Alice’s inputs, columns by Bob’s)
If P computes f : U → {0,1}, then f⁻¹(1) = R₁ ∪ R₂ ∪ ⋯ ∪ R_t, a union of t ≤ 2^c of these rectangles
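The rectangle property can be verified exhaustively for a toy deterministic protocol (an illustrative sketch, not from the talk; the 2-bit parity protocol is made up for the check):

```python
from collections import defaultdict
from itertools import product

def transcript(x, y):
    """Toy 2-bit protocol: Alice sends the parity of x; Bob replies
    whether it matches the parity of y."""
    a = sum(x) % 2
    b = 1 if a == sum(y) % 2 else 0
    return (a, b)

n = 3
cube = list(product([0, 1], repeat=n))
cells = defaultdict(set)
for x in cube:
    for y in cube:
        cells[transcript(x, y)].add((x, y))

# Inputs sharing a transcript form a combinatorial rectangle A x B,
# and a c-bit protocol has at most 2^c transcripts (here c = 2).
for cell in cells.values():
    A = {x for x, _ in cell}
    B = {y for _, y in cell}
    assert cell == {(x, y) for x in A for y in B}
print(len(cells))  # 4
```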