Randomness in Computing
LECTURE 21
Last time:
• Probabilistic method
• Sample and Modify
• The Second Moment Method
Today:
• Probabilistic method
• The Second Moment Method
• Conditional expectation inequality
• Lovász Local Lemma
4/9/2020  Sofya Raskhodnikova; Randomness in Computing
Threshold behavior in random graphs
Let G ~ G(n, p). For many properties P, there exists a function f(n) such that:
1. when p ≪ f(n), the probability that G has P → 0 as n → ∞;
2. when p ≫ f(n), the probability that G has P → 1 as n → ∞.
(This holds for all nontrivial monotone properties.)
The 2nd moment method
Theorem. If X is a random variable with E[X] > 0, then
  Pr[X = 0] ≤ Var[X] / E[X]².
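As a quick sanity check, the bound can be verified exactly on a binomial random variable (an illustrative example, not from the lecture): for X ~ Binomial(m, q), Pr[X = 0] = (1−q)^m, E[X] = mq, and Var[X] = mq(1−q).

```python
# Sanity check of the second moment method bound
# Pr[X = 0] <= Var[X] / E[X]^2 for X ~ Binomial(m, q).
# The parameters (m = 10, q = 0.3) are chosen for illustration only.

def second_moment_bound_check(m: int, q: float) -> tuple[float, float]:
    """Return (exact Pr[X = 0], the bound Var[X] / E[X]^2)."""
    p_zero = (1 - q) ** m          # Pr[X = 0] = (1 - q)^m
    mean = m * q                   # E[X]
    var = m * q * (1 - q)          # Var[X]
    return p_zero, var / mean ** 2

exact, bound = second_moment_bound_check(10, 0.3)
print(exact, bound)   # exact ≈ 0.0282 is indeed below the bound ≈ 0.2333
```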
Example: having a 4-clique
Theorem. Let G ~ G(n, p) and p* = Pr[G has a K_4].
1. If p = o(n^(-2/3)), i.e. p ≪ n^(-2/3), then p* → 0 as n → ∞.
2. If p = ω(n^(-2/3)), i.e. p ≫ n^(-2/3), then p* → 1 as n → ∞.
Proof: Let X = number of 4-cliques in G. For every subset C of 4 nodes, let X_C be the indicator for C being a K_4.
  E[X_C] = p^6,  so  E[X] = Σ_C E[X_C] = C(n,4)·p^6.
1. If p = o(n^(-2/3)): by Markov's inequality,
  p* = Pr[X ≥ 1] ≤ E[X]/1 ≤ (n^4/4!)·p^6 = (n^4/4!)·o(n^(-(2/3)·6)) = (n^4/4!)·o(n^(-4)) = o(1).
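Part 1 can be seen empirically. The sketch below (parameters chosen for illustration) samples G(n, p) with p well below n^(-2/3) and estimates the probability of containing a K_4; by the Markov argument above, the estimate should be near 0.

```python
import itertools
import random

def has_k4(n: int, p: float, rng: random.Random) -> bool:
    """Sample G(n, p) and report whether it contains a 4-clique."""
    edge = {frozenset(e): rng.random() < p
            for e in itertools.combinations(range(n), 2)}
    return any(all(edge[frozenset(e)] for e in itertools.combinations(c, 2))
               for c in itertools.combinations(range(n), 4))

rng = random.Random(0)
n, trials = 25, 100
p = n ** -0.9                      # well below the n^(-2/3) threshold
estimate = sum(has_k4(n, p, rng) for _ in range(trials)) / trials
print(estimate)                    # near 0: E[X] = C(n,4) p^6 is tiny here
```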
Review question
What is the expected number of copies of this graph (figure omitted: a fixed graph on 4 vertices) in G ~ G(n, p)?
A. C(n,4)·p^6
B. 4·C(n,4)·p^6
C. 4·C(n,4)·p^5·(1−p)
D. 6·C(n,4)·p^5·(1−p)
E. None of the above
Example: having a 4-clique, part 2
Theorem. Let G ~ G(n, p) and p* = Pr[G has a K_4]. If p = ω(n^(-2/3)), i.e. p ≫ n^(-2/3), then p* → 1 as n → ∞.
Proof: Expected number of 4-cliques: E[X] = C(n,4)·p^6.
2. If p = ω(n^(-2/3)), then E[X] → ∞ as n → ∞.
Goal: show Var[X] ≪ E[X]².
Recall: Cov[Y, Z] = E[(Y − μ_Y)(Z − μ_Z)] = E[YZ] − μ_Y·μ_Z ≤ E[YZ].
Var[X] = Var[Σ_C X_C] = Σ_C Var[X_C] + Σ_{C≠D} Cov[X_C, X_D].
Var[X_C] = E[X_C²] − (E[X_C])² = E[X_C] − (E[X_C])² = p^6 − p^12 ≤ p^6, so
  Σ_C Var[X_C] ≤ C(n,4)·p^6 = O(n^4·p^6).
Bounding the covariance
Cov[X_C, X_D] ≤ E[X_C · X_D]
Case 1: |C ∩ D| is 0 or 1. Then C and D share no potential edges, so X_C and X_D are independent and Cov[X_C, X_D] = 0.
Case 2: |C ∩ D| = 2. Then C and D share one potential edge, so E[X_C · X_D] = p^11. There are O(n^6) such pairs, contributing O(n^6·p^11) in total.
Case 3: |C ∩ D| = 3. Then C and D share three potential edges, so E[X_C · X_D] = p^9. There are O(n^5) such pairs, contributing O(n^5·p^9) in total.
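A small simulation (illustrative parameters, not part of the lecture) makes the overlap-2 case concrete: for two 4-sets sharing exactly two vertices, the two cliques span 11 distinct potential edges, so E[X_C · X_D] = p^11, and Cov[X_C, X_D] = p^11 − p^12 is positive but bounded by E[X_C · X_D].

```python
import itertools
import random

# Two 4-sets sharing exactly two vertices (2 and 3); their cliques
# involve 11 distinct potential edges (one edge, {2,3}, is shared).
C, D = (0, 1, 2, 3), (2, 3, 4, 5)
edges = {frozenset(e) for s in (C, D) for e in itertools.combinations(s, 2)}

def estimate_joint(p: float, trials: int, seed: int = 0) -> float:
    """Monte Carlo estimate of E[X_C * X_D] over the 11 relevant edges."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        present = {e for e in edges if rng.random() < p}
        x_c = all(frozenset(e) in present for e in itertools.combinations(C, 2))
        x_d = all(frozenset(e) in present for e in itertools.combinations(D, 2))
        hits += x_c and x_d
    return hits / trials

p = 0.5
e_joint = estimate_joint(p, trials=200_000)
cov = e_joint - p ** 12            # Cov[X_C, X_D] = E[X_C X_D] - p^6 * p^6
print(len(edges), e_joint, p ** 11)   # 11 edges; estimate near p^11 ≈ 4.88e-4
```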
Putting it all together
Theorem. Let G ~ G(n, p) and p* = Pr[G has a K_4]. If p = ω(n^(-2/3)), i.e. p ≫ n^(-2/3), then p* → 1 as n → ∞.
• Var[X] = Var[Σ_C X_C] = Σ_C Var[X_C] + Σ_{C≠D} Cov[X_C, X_D] = O(n^4·p^6 + n^6·p^11 + n^5·p^9).
• Pr[X = 0] ≤ Var[X]/E[X]² = O(1/(n^4·p^6) + 1/(n^2·p) + 1/(n^3·p^3)), since E[X]² = Θ(n^8·p^12).
Each term tends to 0 when p = ω(n^(-2/3)): n^4·p^6 = ω(1), n^2·p = ω(n^(4/3)), and n^3·p^3 = ω(n). Hence p* = 1 − Pr[X = 0] → 1.
Conditional Expectation Inequality
Theorem. Let X = Σ_{i∈[n]} X_i, where each X_i is an indicator R.V. Then
  Pr[X > 0] ≥ Σ_{i∈[n]} Pr[X_i = 1] / E[X | X_i = 1].
• Note that the indicators X_i need not be independent.
Proof: Let Y = 1/X if X > 0, and Y = 0 otherwise. Then XY = 1 if X > 0, and XY = 0 otherwise. So
  Pr[X > 0] = E[XY].
Conditional Expectation Inequality (proof, continued)
Recall: Y = 1/X if X > 0, and Y = 0 otherwise, so Pr[X > 0] = E[XY].
E[XY] = Σ_{i∈[n]} E[X_i·Y]    (X = Σ X_i; linearity of expectation)
  = Σ_{i∈[n]} ( E[X_i·Y | X_i = 1]·Pr[X_i = 1] + E[X_i·Y | X_i = 0]·Pr[X_i = 0] )    (law of total expectation)
  = Σ_{i∈[n]} E[Y | X_i = 1]·Pr[X_i = 1]    (the second term is 0; in the first, X_i = 1)
  = Σ_{i∈[n]} E[1/X | X_i = 1]·Pr[X_i = 1]
  ≥ Σ_{i∈[n]} Pr[X_i = 1] / E[X | X_i = 1],
where the last step is Jensen's inequality for the convex function f(x) = 1/x: E[1/X | X_i = 1] ≥ 1/E[X | X_i = 1].
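The inequality can be checked exactly on a tiny dependent example (chosen here for illustration): two fair coin flips, with X_1 the indicator that the first flip is heads and X_2 the indicator that both flips are heads.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, each outcome has probability 1/4.
outcomes = list(product("HT", repeat=2))
prob = Fraction(1, 4)

def x1(w): return int(w[0] == "H")          # first flip is heads
def x2(w): return int(w == ("H", "H"))      # both flips are heads

# Exact left-hand side: Pr[X > 0] where X = X_1 + X_2.
lhs = sum(prob for w in outcomes if x1(w) + x2(w) > 0)

# Exact right-hand side: sum over i of Pr[X_i = 1] / E[X | X_i = 1].
rhs = Fraction(0)
for xi in (x1, x2):
    cond = [w for w in outcomes if xi(w) == 1]
    p_i = prob * len(cond)
    cond_exp = Fraction(sum(x1(w) + x2(w) for w in cond), len(cond))
    rhs += p_i / cond_exp

print(lhs, rhs)   # 1/2 and 11/24: Pr[X > 0] >= the sum, as claimed
```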
K_4 theorem, part 2: alternative proof
Theorem. Let G ~ G(n, p) and p* = Pr[G has a K_4]. 2. If p = ω(n^(-2/3)), i.e. p ≫ n^(-2/3), then p* → 1 as n → ∞.
Proof: Recall: X_C = the indicator for C being a K_4.
By the Conditional Expectation Inequality and symmetry (the terms for all C are equal),
  Pr[X > 0] ≥ Σ_C Pr[X_C = 1] / E[X | X_C = 1] = C(n,4)·p^6 / E[X | X_C = 1].
Since X = Σ_{C'} X_{C'} and each X_{C'} is a 0-1 R.V., by linearity of expectation
  E[X | X_C = 1] = Σ_{C'} E[X_{C'} | X_C = 1] = Σ_{C'} Pr[X_{C'} = 1 | X_C = 1].
Splitting by the overlap (C' = C, then |C' ∩ C| = 0, 1, 2, 3):
  E[X | X_C = 1] = 1 + C(n−4,4)·p^6 + 4·C(n−4,3)·p^6 + 6·C(n−4,2)·p^5 + 4·C(n−4,1)·p^3.
Since p = ω(n^(-2/3)), the term C(n−4,4)·p^6 = (1 − o(1))·C(n,4)·p^6 dominates the other terms, so the ratio tends to 1 and p* → 1.
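The conditional expectation formula can be spot-checked by simulation (illustrative parameters): fix a 4-set C, force its 6 edges to be present, sample the remaining edges independently with probability p, and average the number of 4-cliques.

```python
import itertools
import random
from math import comb

def avg_cliques_given_c(n: int, p: float, trials: int, seed: int = 0) -> float:
    """Estimate E[X | X_C = 1] by conditioning on C = {0,1,2,3} being a clique."""
    rng = random.Random(seed)
    C = frozenset(range(4))
    total = 0
    for _ in range(trials):
        # Edges inside C are forced present; all other edges appear with prob p.
        edge = {frozenset(e): True if set(e) <= C else rng.random() < p
                for e in itertools.combinations(range(n), 2)}
        total += sum(all(edge[frozenset(e)] for e in itertools.combinations(s, 2))
                     for s in itertools.combinations(range(n), 4))
    return total / trials

n, p = 8, 0.5
formula = (1 + comb(n - 4, 4) * p ** 6 + 4 * comb(n - 4, 3) * p ** 6
             + 6 * comb(n - 4, 2) * p ** 5 + 4 * comb(n - 4, 1) * p ** 3)
estimate = avg_cliques_given_c(n, p, trials=10_000)
print(formula, estimate)   # formula = 4.390625; the estimate should be close
```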
Avoiding bad events
• Let B_1 and B_2 be (bad) events over a common probability space.
Q. If Pr[B_1] < 1 and Pr[B_2] < 1, does it imply Pr[¬B_1 ∩ ¬B_2] > 0? (Is it possible to avoid both events?)
A. Not necessarily. E.g., for a single coin flip, let B_1 = {heads}, B_2 = {tails}. Then Pr[B_1] = Pr[B_2] = 1/2, but Pr[¬B_1 ∩ ¬B_2] = 0.
Q. What if B_1 and B_2 are independent?
A. Yes: Pr[¬B_1 ∩ ¬B_2] = Pr[¬B_1]·Pr[¬B_2] > 0.
Q. What if Pr[B_1] < 1/2 and Pr[B_2] < 1/2 (but B_1, B_2 are dependent)?
A. Yes. By the union bound, Pr[B_1 ∪ B_2] ≤ Pr[B_1] + Pr[B_2] < 1. So Pr[¬B_1 ∩ ¬B_2] > 0.
Lovász Local Lemma (LLL)
LLL states that as long as
1. bad events B_1, …, B_n have small probability, and
2. they are not "too dependent",
there is a non-zero probability of avoiding all of them.
• A dependency graph for events B_1, …, B_n is a graph with vertex set [n] and edge set E, s.t. for all i ∈ [n], event B_i is mutually independent of the events {B_j : (i, j) ∉ E}.
Lovász Local Lemma. Let B_1, …, B_n be events over a common sample space s.t.
1. the maximum degree of a dependency graph of B_1, …, B_n is at most d, and
2. for all i ∈ [n], Pr[B_i] ≤ 1/(e(d+1)).
Then Pr[⋀_{i∈[n]} ¬B_i] > 0.
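As a worked instance of condition 2, consider the standard k-SAT application (parameters here chosen for illustration): if every clause of a k-CNF formula shares variables with at most d other clauses, then the bad event "this clause is unsatisfied by a uniformly random assignment" has probability 2^(-k), and the LLL applies whenever 2^(-k) ≤ 1/(e(d+1)).

```python
import math

def lll_condition(p_bad: float, d: int) -> bool:
    """Symmetric Lovász Local Lemma condition: Pr[B_i] <= 1/(e(d+1))."""
    return p_bad <= 1 / (math.e * (d + 1))

# k-SAT: a clause with k literals is unsatisfied by a uniformly random
# assignment with probability 2^-k, and its event depends only on the
# clauses sharing a variable with it (at most d of them).
k, d = 4, 4
applies = lll_condition(2 ** -k, d)
print(applies)   # True: 1/16 <= 1/(e*5) ≈ 0.0736
```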
Example: points on a circle
11n points are placed on a circle and colored with n different colors, so that each color is applied to exactly 11 points.
Prove: There exists a set of n points, all colored differently, such that no two points in the set are adjacent.
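The LLL guarantees only that a valid selection exists; for small n one can actually find one by random sampling (a sketch with illustrative parameters, not the lecture's proof): place the 11n colored points on a circle, pick one representative per color uniformly at random, and retry until no two chosen points are adjacent.

```python
import random

def find_rainbow_independent_set(n: int, seed: int = 0):
    """Randomly search for one point per color with no two chosen points adjacent.

    Points 0..11n-1 sit on a circle; positions i and i+1 (mod 11n) are adjacent.
    """
    rng = random.Random(seed)
    m = 11 * n
    colors = [i % n for i in range(m)]
    rng.shuffle(colors)                       # arbitrary coloring, 11 points each
    by_color = [[] for _ in range(n)]
    for pos, c in enumerate(colors):
        by_color[c].append(pos)
    for _ in range(100_000):                  # retry until the choice works
        chosen = sorted(rng.choice(pts) for pts in by_color)
        # Adjacent chosen points must be consecutive in sorted circular order.
        ok = all((b - a) % m != 1 for a, b in zip(chosen, chosen[1:] + chosen[:1]))
        if ok:
            return chosen
    return None

chosen = find_rainbow_independent_set(6)
print(chosen)   # 6 positions, one per color, pairwise non-adjacent on the circle
```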