On the Log Rank Conjecture
Thomas Rothvoss, UW Seattle
Current Topics Seminar
Communication complexity

Setting:
◮ Function f : X × Y → {0, 1}
◮ Players Alice and Bob agree a priori on a deterministic communication protocol
◮ Alice receives x ∈ X, Bob receives y ∈ Y
◮ They exchange messages to compute f(x, y); at the end, both know f(x, y)

(Figure: Alice and Bob exchanging single bits back and forth.)

CC(f) = min over protocols of max_{x ∈ X, y ∈ Y} { bits to compute f(x, y) }
Communication complexity (2)

Example:
◮ Input for Alice: x ∈ {0, 1}^n
◮ Input for Bob: y ∈ {0, 1}^n
◮ f(x, y) = x_1 + ... + x_n + y_1 + ... + y_n mod 2

A 1-bit protocol:
(1) Alice sends x_1 + ... + x_n mod 2 to Bob.
(2) Bob then knows the answer.

(Figure: the communication matrix, with rows and columns grouped by even/odd parity of x and y, consists of four monochromatic blocks; it has rank 2.)
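The 1-bit protocol above can be sketched in a few lines of code (a toy illustration; the helper names `alice_message` and `bob_output` are mine, not from the talk):

```python
# 1-bit protocol for parity f(x, y) = (x_1+...+x_n + y_1+...+y_n) mod 2:
# Alice transmits a single bit, her own parity; Bob adds his parity.

def alice_message(x):
    return sum(x) % 2            # the one bit Alice sends

def bob_output(a, y):
    return (a + sum(y)) % 2      # Bob combines it with his own parity

x, y = [1, 0, 1, 1], [0, 1, 1, 0]
f = (sum(x) + sum(y)) % 2
assert bob_output(alice_message(x), y) == f
```

Regardless of n, only one bit crosses the channel, matching the claim that this f has communication complexity 1.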
Communication complexity (2)

Example:
◮ Function EQ : {0, 1}^n × {0, 1}^n → {0, 1} with

  EQ(x, y) = 1 if x = y, and 0 otherwise

◮ Complexity theory 101: CC(EQ) = n

(Figure: the communication matrix of EQ is the 2^n × 2^n identity matrix.)
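The communication matrix of EQ is easy to build explicitly (a small sketch; `eq_matrix` is an illustrative helper, not from the slides). It is the 2^n × 2^n identity matrix, so rank(EQ) = 2^n, and the lower bound CC(EQ) ≥ log rank(EQ) = n already matches the slide's claim:

```python
from itertools import product

def eq_matrix(n):
    """Communication matrix of EQ: entry (x, y) is 1 iff x == y."""
    inputs = list(product([0, 1], repeat=n))
    return [[1 if x == y else 0 for y in inputs] for x in inputs]

M = eq_matrix(3)
# The matrix is exactly the 8 x 8 identity:
assert all(M[i][j] == (1 if i == j else 0)
           for i in range(8) for j in range(8))
```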
Communication complexity (3)

(Figure: a 0/1 matrix recursively partitioned by the protocol's decision tree into monochromatic rectangles; v marks one leaf.)

Observations:
◮ For a leaf v of the tree, R_v := {(x, y) : protocol ends in v} is a monochromatic rectangle
◮ A protocol exchanging k bits ⇒ rank(f) ≤ 2^k
◮ CC(f) ≥ log rank(f)
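The rank bound can be seen concretely: a k-bit protocol has at most 2^k leaves, each 1-leaf contributes a rank-1 rectangle, and f is the sum of these, so rank(f) ≤ 2^k. A tiny illustration with made-up rectangles and a hand-rolled exact rank routine (assumptions: the example matrix and the helper `rank` are mine, not from the talk):

```python
from fractions import Fraction

def rank(M):
    """Rank over Q via Gaussian elimination, exact using Fractions."""
    M = [[Fraction(v) for v in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c]:
                fac = M[i][c] / M[r][c]
                M[i] = [a - fac * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# f whose 1-entries are covered by two 1-rectangles:
# rows {0,1} x cols {0,1}, and row {2} x cols {2,3}.
f = [[1, 1, 0, 0],
     [1, 1, 0, 0],
     [0, 0, 1, 1]]
assert rank(f) <= 2   # at most the number of 1-rectangles
```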
Relation to rank

◮ CC(f) ≥ log rank(f)
◮ CC(f) ≤ rank(f)

Conjecture (Lovász & Saks ’88): CC(f) ≤ (log rank(f))^{O(1)}.

Theorem (Lovett ’14): CC(f) ≤ Õ(√rank(f)).

◮ Here: a much shorter and more direct proof by me.
The technical main result

◮ It suffices to show:

Lemma: Any 0/1 matrix A has an almost monochromatic submatrix R of size

  |R| ≥ 2^{−Õ(√rank(A))} · |A|

◮ Here almost monochromatic means #zeroes / #ones ≤ 1 / (8 · rank(A))

(Figure: a 0/1 matrix A with an almost monochromatic submatrix highlighted.)
Thoughts about rank

Let A be a 0/1 matrix of rank r. By definition, A_ij = ⟨u_i, v_j⟩ with u_i, v_j ∈ R^r.

(Figure: vectors u_i attached to the rows of A, vectors v_j to the columns.)

◮ For any regular matrix T:

  u'_i := T u_i and v'_j := (T^{−1})^T v_j ⇒ ⟨u'_i, v'_j⟩ = ⟨u_i, v_j⟩

Lemma: The vectors can be chosen so that ‖u_i‖_2, ‖v_j‖_2 ≤ r^{1/4} for all i, j.
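The invariance under regular T is a one-line computation, ⟨T u, (T^{−1})^T v⟩ = u^T T^T (T^{−1})^T v = u^T (T^{−1} T)^T v = ⟨u, v⟩, and easy to check numerically. A minimal sketch with a concrete 2×2 example (the tiny matrix helpers are mine):

```python
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transpose(M):
    return [list(col) for col in zip(*M)]

def inv2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

T = [[2.0, 1.0], [1.0, 1.0]]        # any regular (invertible) matrix
u, v = [3.0, -1.0], [2.0, 5.0]
u2 = matvec(T, u)                    # u' = T u
v2 = matvec(transpose(inv2(T)), v)   # v' = (T^{-1})^T v
assert abs(dot(u2, v2) - dot(u, v)) < 1e-9   # inner product preserved
```

This freedom is exactly what lets one rescale the factorization, via John's theorem below, so that all vectors have length at most r^{1/4}.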
John’s theorem

John’s Theorem: For any symmetric convex body K ⊆ R^n, there is an ellipsoid E so that E ⊆ K ⊆ √n · E.

(Figure: a body K sandwiched between an inner ellipsoid E and the outer ellipsoid √n · E, both centered at 0.)

◮ T : R^n → R^n linear map ⇒ T(ball) is an ellipsoid
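The last bullet can be checked numerically: for invertible T, the image of the unit ball is the ellipsoid {y : y^T (T T^T)^{−1} y ≤ 1}. A small sketch in 2D (T here is an arbitrary example map of my choosing):

```python
import math

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

T = [[3.0, 1.0], [0.0, 2.0]]
TTt = [[10.0, 2.0], [2.0, 4.0]]      # T T^T, computed by hand for this T
det = TTt[0][0] * TTt[1][1] - TTt[0][1] * TTt[1][0]
Q = [[TTt[1][1] / det, -TTt[0][1] / det],     # Q = (T T^T)^{-1}
     [-TTt[1][0] / det, TTt[0][0] / det]]

# Every boundary point of the unit circle maps to a point with
# y^T Q y = 1, i.e. onto the boundary of the ellipsoid T(ball):
for k in range(8):
    t = 2 * math.pi * k / 8
    y = matvec(T, [math.cos(t), math.sin(t)])
    val = sum(y[i] * Q[i][j] * y[j] for i in range(2) for j in range(2))
    assert abs(val - 1.0) < 1e-9
```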
John’s Theorem (2)

Proof:
◮ Suppose the maximum volume ellipsoid in K is a unit ball.
◮ Suppose for contradiction that some point x ∈ K has ‖x‖_2 > √n.
◮ Stretch the ball along x; shrink it in the orthogonal directions.
◮ Then vol(E) > vol(ball), contradicting the maximality of the ball.

(Figure: the body K containing the unit ball, the far point x, and the stretched ellipsoid E.)
Rescaling vectors

◮ Given r-dimensional vectors with ⟨u_i, v_j⟩ ∈ {0, 1}
◮ Choose K := conv{±u_i : i row index}

(Figure: the symmetric convex hull K of the points ±u_i.)