

  1. Randomness in Computing, LECTURE 19
  Last time
  • Finding Hamiltonian cycles in random graphs
  Today
  • The probabilistic method
  Sofya Raskhodnikova; Randomness in Computing; 4/7/2020

  2. The probabilistic method
  To prove that an object with required properties exists:
  1. Define a distribution on objects.
  2. Sample an object.
  3. Prove that a sampled object has the required properties with positive probability.
  • Sometimes the proof of existence can be converted into an efficient randomized construction.
  • Sometimes it can be converted into a deterministic construction (derandomization).

  3. Method 1: The counting argument
  • K_n = complete graph on n vertices (n-clique)
  Theorem. If C(n, k) ⋅ 2^(1 − C(k, 2)) < 1, then it is possible to color the edges of K_n with two colors so that no K_k is monochromatic.
  Proof: Define a random experiment: color each edge of K_n independently and uniformly blue or red.
  • Fix an ordering of the C(n, k) different k-cliques, and let M_i be the event that clique i is monochromatic, for i = 1, …, C(n, k).
  • Pr[M_i] = 2 ⋅ 2^(−C(k, 2)) = 2^(1 − C(k, 2))
  • Union bound: Pr[⋃_i M_i] ≤ Σ_i Pr[M_i] = C(n, k) ⋅ 2^(1 − C(k, 2)) < 1
  • So the probability of a coloring with no monochromatic k-clique is > 0. ∎
  Image by Richtom80 at English Wikipedia, CC BY-SA 3.0
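The quantities in this argument can be checked numerically for small parameters; a minimal sketch (the function names are mine):

```python
import itertools
import math
import random

def union_bound(n, k):
    # C(n, k) * 2^(1 - C(k, 2)); if this is < 1, a good coloring must exist
    return math.comb(n, k) * 2 ** (1 - math.comb(k, 2))

def random_coloring(n, rng):
    # color each edge of K_n independently and uniformly with 0 (blue) or 1 (red)
    return {e: rng.randrange(2) for e in itertools.combinations(range(n), 2)}

def monochromatic_cliques(n, k, coloring):
    # count the k-cliques whose C(k, 2) edges all received the same color
    count = 0
    for clique in itertools.combinations(range(n), k):
        colors = {coloring[e] for e in itertools.combinations(clique, 2)}
        if len(colors) == 1:
            count += 1
    return count
```

For example, union_bound(5, 4) = 5/32 < 1, so some 2-coloring of K_5 has no monochromatic K_4.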

  4. Converting an existence proof into an efficient randomized construction
  • Can we efficiently sample a coloring? Yes.
  • How many samples do we need to generate a coloring with no monochromatic k-clique?
  – Probability of success: q = 1 − C(n, k) ⋅ 2^(1 − C(k, 2))
  – # of samples ∼ Geom(q), with expectation 1/q
  – Want: 1/q polynomial in the problem size
  – If q = 1 − o(1), we get a Monte Carlo construction algorithm that errs w.p. o(1).
  • To get a Las Vegas algorithm (always-correct answers), we need a poly-time procedure for checking that no k-clique is monochromatic.
  – If k is constant, we can check all C(n, k) cliques.
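A sketch of the resulting Las Vegas construction (resample until the check passes); the helper names are mine, and the check is the brute-force one, which is poly-time only for constant k:

```python
import itertools
import random

def has_mono_clique(n, k, coloring):
    # brute-force check over all C(n, k) cliques; poly-time for constant k
    return any(
        len({coloring[e] for e in itertools.combinations(clique, 2)}) == 1
        for clique in itertools.combinations(range(n), k)
    )

def find_coloring(n, k, rng):
    # Las Vegas: resample until success; the expected number of samples is 1/q
    while True:
        coloring = {e: rng.randrange(2) for e in itertools.combinations(range(n), 2)}
        if not has_mono_clique(n, k, coloring):
            return coloring
```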

  5. Method 2: The expectation argument
  • It can't be that everybody is better (or worse) than the average.
  Claim. Let X be a random variable with 𝔼[X] = μ. Then Pr[X ≥ μ] > 0 and Pr[X ≤ μ] > 0.
  Proof: Suppose for contradiction that Pr[X ≥ μ] = 0. Then every x with Pr[X = x] > 0 satisfies x < μ, so
  μ = 𝔼[X] = Σ_x x ⋅ Pr[X = x] < Σ_x μ ⋅ Pr[X = x] = μ ⋅ Σ_x Pr[X = x] = μ,
  a contradiction. (The case Pr[X ≤ μ] = 0 is symmetric.)
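On a concrete finite distribution (a fair die, my choice of example), the claim can be verified directly with exact arithmetic:

```python
from fractions import Fraction

# fair six-sided die: value -> probability
dist = {v: Fraction(1, 6) for v in range(1, 7)}
mu = sum(v * p for v, p in dist.items())  # E[X] = 7/2

# some outcome is >= the mean and some outcome is <= the mean
p_at_least = sum(p for v, p in dist.items() if v >= mu)
p_at_most = sum(p for v, p in dist.items() if v <= mu)
```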

  6. Example: Finding a large cut
  Recall:
  • A cut in a graph G = (V, E) is a partition of V into two nonempty sets.
  • The size of the cut is the number of edges that cross it.
  • Finding a maximum cut is NP-hard.
  Theorem. Let G be an undirected graph with m edges. Then G has a cut of size ≥ m/2.

  7. Example: Existence of a large cut
  Theorem. Let G be an undirected graph with m edges. Then G has a cut of size ≥ m/2.
  Proof: Construct sets A and B of vertices by assigning each vertex to A or B uniformly and independently at random.
  • For each edge e, let X_e = 1 if e connects A to B, and X_e = 0 otherwise. Then 𝔼[X_e] = 1/2.
  • Let X = # of edges crossing the cut. Then 𝔼[X] = 𝔼[Σ_{e∈E} X_e] = Σ_{e∈E} 𝔼[X_e] = m ⋅ 1/2 = m/2.
  • By the expectation argument, there exists a cut (A, B) of size ≥ m/2. ∎
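The expectation computation can be confirmed exhaustively on a small graph: averaging the cut size over all 2^n assignments gives exactly m/2. A sketch (the triangle graph is my example):

```python
from itertools import product

def average_cut_size(n, edges):
    # exact expectation of the cut size under a uniform random assignment:
    # average over all 2^n ways to put each vertex in A (0) or B (1)
    total = 0
    for side in product((0, 1), repeat=n):
        total += sum(1 for u, v in edges if side[u] != side[v])
    return total / 2 ** n

triangle = [(0, 1), (1, 2), (0, 2)]  # m = 3 edges, so the average is 3/2
```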

  8. Example: Finding a large cut
  • It is easy to choose a random cut.
  • Probability of success: q = Pr[X ≥ m/2]
  • Using the upper bound X ≤ m and the fact that X is an integer (so X < m/2 implies X ≤ m/2 − 1/2):
  m/2 = 𝔼[X] = Σ_{i<m/2} i ⋅ Pr[X = i] + Σ_{i≥m/2} i ⋅ Pr[X = i] ≤ (m/2 − 1/2)(1 − q) + m ⋅ q,
  which gives q ≥ 1/(m + 1).
  • Expected # of samples to find a large cut: at most m + 1.
  • Can test whether a cut has ≥ m/2 edges by counting the edges crossing it (poly time), so we get a Las Vegas algorithm.
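The resulting Las Vegas algorithm is a short loop; a sketch (function name is mine):

```python
import random

def large_cut(n, edges, rng):
    # Las Vegas: resample random cuts until one of size >= m/2 appears;
    # by the bound q >= 1/(m + 1), the expected number of samples is at most m + 1
    m = len(edges)
    while True:
        side = [rng.randrange(2) for _ in range(n)]
        size = sum(1 for u, v in edges if side[u] != side[v])
        if 2 * size >= m:
            return side, size
```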

  9. Derandomization: conditional expectations
  Finding a large cut. Idea: place each vertex deterministically, ensuring that 𝔼[X | placement so far] ≥ 𝔼[X] ≥ m/2.
  • R.V. Z_i is A or B, indicating which set vertex i is placed in, ∀i ∈ [n].
  Base case: 𝔼[X | Z_1 = A] = 𝔼[X | Z_1 = B] = 𝔼[X], by symmetry (it doesn't matter where the first vertex goes).
  Inductive step: Let z_1, …, z_k be the placements so far (each is A or B), and suppose 𝔼[X | Z_1 = z_1, …, Z_k = z_k] ≥ 𝔼[X]. By the law of total expectation,
  𝔼[X | Z_1 = z_1, …, Z_k = z_k] = (1/2) 𝔼[X | Z_1 = z_1, …, Z_k = z_k, Z_{k+1} = A] + (1/2) 𝔼[X | Z_1 = z_1, …, Z_k = z_k, Z_{k+1} = B].
  Pick z_{k+1} to maximize the conditional expectation. Then
  𝔼[X | Z_1 = z_1, …, Z_{k+1} = z_{k+1}] ≥ 𝔼[X | Z_1 = z_1, …, Z_k = z_k] ≥ 𝔼[X].

  10. Finding a large cut: derandomization
  [Figure: some vertices already placed in A and B; vertex k + 1 is still undecided.]
  • Place vertex k + 1 on the side with fewer of its already-placed neighbors, breaking ties arbitrarily.
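This greedy rule can be sketched as follows (function name is mine); the conditional-expectation argument guarantees the cut it produces has size at least m/2:

```python
def greedy_cut(n, edges):
    # derandomization by conditional expectations: place each vertex on the
    # side with fewer of its already-placed neighbors (ties broken toward A)
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = [None] * n
    for v in range(n):
        in_a = sum(1 for u in adj[v] if side[u] == 0)
        in_b = sum(1 for u in adj[v] if side[u] == 1)
        side[v] = 0 if in_a <= in_b else 1
    cut = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut
```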

  11. Example 2: Maximum satisfiability (MAX-SAT)
  Logical formulas
  • Boolean variables: variables that take the values T/F (or 1/0)
  • Boolean operations: ∨, ∧, and ¬
  • Boolean formula: an expression built from Boolean variables and operations
  SAT (deciding whether a given formula has a satisfying assignment) is NP-complete.
  • Literal: a Boolean variable or its negation, x_i or x̄_i
  • Clause: an OR of literals, e.g. C_1 = x_1 ∨ x_2 ∨ x_3
  • Conjunctive normal form (CNF): an AND of clauses, e.g. C_1 ∧ C_2 ∧ C_3 ∧ C_4
  Ex: for a 4-clause CNF over x_1, x_2, x_3, the assignment x_1 = 1, x_2 = 1, x_3 = 0 satisfies the formula.
  MAX-SAT: Given a CNF formula, find an assignment satisfying as many clauses as possible.
  • Assume no clause contains both x and x̄ (otherwise it is always satisfied).
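CNF formulas can be represented compactly, e.g. in the DIMACS convention (signed integers; the representation and the sample formula below are my choices), and the number of satisfied clauses counted directly:

```python
def satisfied_count(clauses, assignment):
    # clauses: list of clauses, each a list of nonzero ints,
    #          where +i stands for x_i and -i for its negation
    # assignment: dict mapping variable index i to True/False
    return sum(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )
```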

  12. Example 2: MAX-SAT
  Theorem. Given m clauses, let k_j = # of literals in clause j, for j ∈ [m], and let k = min_{j∈[m]} k_j. There is a truth assignment that satisfies at least Σ_{j∈[m]} (1 − 2^(−k_j)) ≥ m(1 − 2^(−k)) clauses.
  Proof: Assign the values 0 and 1 uniformly and independently to each variable.
  • X_j = indicator R.V. for clause j being satisfied
  • X = # of satisfied clauses = Σ_{j∈[m]} X_j
  • Pr[X_j = 1] = 1 − 2^(−k_j)
  • 𝔼[X] = Σ_{j∈[m]} 𝔼[X_j] = Σ_{j∈[m]} (1 − 2^(−k_j)) ≥ m(1 − 2^(−k))
  • By the expectation argument, there exists an assignment satisfying that many clauses. ∎
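The expectation Σ_j (1 − 2^(−k_j)) can be confirmed by brute force on a small formula (the formula below is a made-up example; clauses use signed-integer literals):

```python
from itertools import product

def exact_expected_satisfied(clauses, nvars):
    # brute-force E[X]: average the number of satisfied clauses
    # over all 2^nvars equally likely assignments
    total = 0
    for bits in product((False, True), repeat=nvars):
        assignment = {i + 1: bits[i] for i in range(nvars)}
        total += sum(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in clauses
        )
    return total / 2 ** nvars
```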

  13. Example 3: Large sum-free subset
  • Given a set A of positive integers, a sum-free subset S ⊆ A contains no three elements a, b, c ∈ S satisfying a + b = c.
  • Goal: find as large an S as possible.
  • Examples: A = {2, 3, 4, 5, 6, 8, 10}; A = {1, 2, 3, 4, 5, 6, 8, 9, 10, 18}
  Theorem. Every set A of n positive integers contains a sum-free subset of size greater than n/3.
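A checker and a brute-force baseline make the definition concrete; a sketch, assuming the common convention that a and b may coincide (so no x with 2x also in S):

```python
from itertools import combinations

def is_sum_free(s):
    # no a, b in s (possibly equal) with a + b also in s
    s = set(s)
    return not any(a + b in s for a in s for b in s)

def largest_sum_free(a):
    # brute force over subsets, largest first (exponential; small sets only)
    elems = sorted(a)
    for r in range(len(elems), 0, -1):
        for sub in combinations(elems, r):
            if is_sum_free(sub):
                return set(sub)
    return set()
```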

  14. Finding a large sum-free subset
  A randomized algorithm
  1. Let p > max element of A be a prime of the form p = 3k + 2. // The other choice, 3k + 1, would also work.
  2. Select a number x uniformly at random from {1, …, p − 1}.
  3. Map each element a ∈ A to ax mod p.
  4. S ← all elements of A that got mapped to {k + 1, …, 2k + 1}.
  5. Return S.
  Need to prove:
  • S is sum-free.
  • The expected number of elements of A mapped to {k + 1, …, 2k + 1} is > n/3.
  Sofya Raskhodnikova; Randomness in Computing; based on slides by Surender Baswana; 4/7/2020
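The five steps above can be sketched directly (the trial-division primality test and function names are mine):

```python
import random

def sum_free_subset(a, rng):
    # step 1: smallest prime p = 3k + 2 exceeding max(a)
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    p = max(a) + 1
    while not (is_prime(p) and p % 3 == 2):
        p += 1
    k = (p - 2) // 3
    # step 2: x uniform in {1, ..., p - 1}
    x = rng.randrange(1, p)
    # steps 3-4: keep the elements mapped into the middle third {k+1, ..., 2k+1}
    middle = range(k + 1, 2 * k + 2)
    return {e for e in a if (e * x) % p in middle}
```

The output is sum-free for every choice of x; only the size of S depends on the random draw.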

  15. Showing that S is sum-free
  • Let a and b be any two elements of S.
  • Say a is mapped to α and b to β, where α, β ∈ {k + 1, …, 2k + 1}.
  • Then α = ax mod p and β = bx mod p.
  • We need to show that a + b, if present in A, is not mapped to {k + 1, …, 2k + 1}.
  • a + b is mapped to (α + β) mod p.
  Argue that:
  • α + β must be greater than 2k + 1, since each summand is at least k + 1.
  • If α + β > p, then (α + β) mod p is at most k, since α + β ≤ 2(2k + 1) = p + k.
