The Probabilistic Method
Topics on Randomized Computation
Spring Semester
Co.Re.Lab., N.T.U.A.
Overview
In the first part we will see simple methods (basically through examples):
1. The counting method
2. The first moment method
3. The deletion method
4. The second moment method
5. Derandomization with conditional probabilities
The second part is THE part(y):
1. The general Lovász Local Lemma
2. Other (usual and helpful) forms of the LLL
3. A constructive proof of the LLL
Counting Expanders
We will rely on: Pr(Q(x)) > 0 ⇒ ∃x Q(x)
Definition: An (n,d,a,c) OR-concentrator is a bipartite multigraph G(L,R,E) with |L| = |R| = n such that:
• Each vertex in L has degree at most d
• For any S ⊆ L with |S| ≤ a·n: |N(S)| ≥ c·|S|
Theorem: There is an integer n₀ such that for all n > n₀ there is an (n,18,1/3,2) OR-concentrator.
We will choose a random graph from a suitable probability space and show that it has positive probability of being an (n,18,1/3,2) OR-concentrator.
Counting Expanders
Proof: Our random bipartite graph will have
• Vertex set V = L ∪ R
• Each v ∈ L chooses a neighbor (in R) uniformly at random, d times (multiple edges become one edge).
Let E_s be the event that some subset of L with s vertices has fewer than cs neighbors.
We will bound Pr[E_s] and then sum up over all values s ≤ an to get a bound on the probability of failure.
Fix an S ⊆ L of size s and a T ⊆ R of size cs.
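A minimal Python sketch of the construction above (not from the slides; the function names and toy parameters are my own): each left vertex draws d right neighbors uniformly at random, and the expansion property is checked by brute force, which is only feasible for very small n.

```python
import random
from itertools import combinations

def random_bipartite(n, d, rng=random):
    """Each left vertex picks d right neighbors uniformly at random;
    repeated picks collapse into a single edge (a set of neighbors)."""
    return [set(rng.randrange(n) for _ in range(d)) for _ in range(n)]

def is_concentrator(neighbors, a, c):
    """Brute-force check of the (n, d, a, c) expansion property.
    Exponential in n, so only meant for tiny toy instances."""
    n = len(neighbors)
    for size in range(1, int(a * n) + 1):
        for S in combinations(range(n), size):
            reached = set().union(*(neighbors[v] for v in S))
            if len(reached) < c * size:
                return False
    return True

if __name__ == "__main__":
    # illustrative toy parameters, much smaller than the (n, 18, 1/3, 2) case
    n, d, a, c = 12, 6, 1 / 3, 2
    G = random_bipartite(n, d)
    print("expands:", is_concentrator(G, a, c))
```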
Counting Expanders
• There are C(n,s) ways of choosing S
• There are C(n,cs) ways of choosing T
• The probability that T contains all neighbors of S is (cs/n)^{ds}
Thus
  Pr[E_s] ≤ C(n,s) · C(n,cs) · (cs/n)^{ds} ≤ (ne/s)^{s} · (ne/cs)^{cs} · (cs/n)^{ds} = [ e^{c+1} · c^{d-c} · (s/n)^{d-c-1} ]^{s}
Simplifying for a = 1/3, c = 2, d = 18 and using s ≤ an we get
  Pr[E_s] ≤ [ e^{3} · 2^{16} · (1/3)^{15} ]^{s} ≤ (1/10)^{s}
Summing up we get
  Pr[failure] ≤ Σ_{s ≤ an} Pr[E_s] ≤ Σ_{s ≥ 1} (1/10)^{s} < 1
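As a sanity check on this calculation, the sketch below (my own, not part of the slides) evaluates the union bound Σ_{s ≤ an} C(n,s)·C(n,cs)·(cs/n)^{ds} numerically for d = 18, c = 2, a = 1/3, working in log-space to avoid underflow; for moderate n the sum is already far below 1.

```python
import math

def log_binom(n, k):
    """log of C(n, k) via lgamma, avoiding huge intermediate integers."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def failure_bound(n, d=18, a=1 / 3, c=2):
    """Union bound  sum_{s <= a*n} C(n,s) * C(n,cs) * (cs/n)^(ds)  from the proof."""
    total = 0.0
    for s in range(1, int(a * n) + 1):
        log_term = (log_binom(n, s) + log_binom(n, c * s)
                    + d * s * math.log(c * s / n))
        total += math.exp(log_term)
    return total

if __name__ == "__main__":
    for n in (100, 1000, 10000):
        print(n, failure_bound(n))
```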
The First Moment Method
1. At first we design a “thought experiment” in which a random process plays a role.
2. We analyze the random experiment and draw a conclusion using the first moment principle:
   E[X] ≤ t ⇒ Pr(X ≤ t) > 0
Example 1
Theorem: For any undirected graph G(V,E) with n vertices and m edges there is a partition of the vertex set into two sets A, B such that
  |{ {u,v} ∈ E : u ∈ A ∧ v ∈ B }| ≥ m/2
Proof:
• Assign each vertex independently and equiprobably to either A or B.
• Let X_{u,v} = 1 when {u,v} has endpoints in different sets and X_{u,v} = 0 otherwise:
  Pr[X_{u,v} = 1] = 1/2 ⇒ E[X_{u,v}] = 1/2
• By linearity of expectation:
  E[|separated edges|] = Σ_{{u,v} ∈ E} E[X_{u,v}] = m/2
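The experiment is easy to simulate. The following sketch (illustrative only, using a made-up toy graph) flips a fair coin per vertex and counts crossing edges; averaged over many trials the count concentrates around m/2.

```python
import random

def random_cut(n, edges, rng=random):
    """Assign each vertex to A or B by a fair coin and count crossing edges."""
    side = [rng.random() < 0.5 for _ in range(n)]
    return sum(1 for u, v in edges if side[u] != side[v])

if __name__ == "__main__":
    # small example graph; the theorem guarantees some partition cuts >= m/2 edges,
    # and the random partition achieves m/2 in expectation
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    m, trials = len(edges), 10000
    avg = sum(random_cut(4, edges) for _ in range(trials)) / trials
    print(f"average cut {avg:.2f}, m/2 = {m / 2}")
```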
Example 2
Theorem: For any set of m clauses there is a truth assignment that satisfies at least m/2 clauses. (A clause is a disjunction of literals, e.g. (x₁ ∨ ¬x₂ ∨ x₃ ∨ … ∨ x_k).)
Proof:
• Independently and equiprobably set each variable TRUE or FALSE.
• For each clause let Z_i = 1 if the i-th clause is satisfied and Z_i = 0 otherwise.
• If the i-th clause has k literals: Pr(Z_i = 1) = 1 - 2^{-k}
• For every clause: E[Z_i] ≥ 1/2
• The expected number of satisfied clauses is
  E[Σ_{i=1}^{m} Z_i] = Σ_{i=1}^{m} E[Z_i] ≥ m/2
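A small sketch of the same experiment in code (the clause encoding and helper names are my own choice): literals are signed integers, and a clause is satisfied if any of its literals evaluates to true under a uniformly random assignment.

```python
import random

def satisfied(clauses, assignment):
    """A clause is a list of literals; literal +i means x_i, -i means NOT x_i."""
    return sum(
        1 for clause in clauses
        if any(assignment[abs(lit)] == (lit > 0) for lit in clause)
    )

def random_assignment(num_vars, rng=random):
    # variables are indexed 1..num_vars; index 0 is unused
    return [None] + [rng.random() < 0.5 for _ in range(num_vars)]

if __name__ == "__main__":
    clauses = [[1, -2], [2, 3], [-1, -3], [1, 2, 3]]
    best = max(satisfied(clauses, random_assignment(3)) for _ in range(100))
    print(f"best of 100 random assignments: {best} of {len(clauses)} clauses")
```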
Example 3
Theorem: Any instance of k-Sat with fewer than 2^{k} clauses is satisfiable.
Proof:
• Independently and equiprobably set each variable TRUE or FALSE.
• For each clause let Z_i = 0 if the i-th clause is satisfied and Z_i = 1 otherwise:
  Pr(Z_i = 1) = 2^{-k}
• For every clause: E[Z_i] = 2^{-k}
• The expected number of unsatisfied clauses is
  E[Σ_{i=1}^{m} Z_i] = Σ_{i=1}^{m} E[Z_i] = m · 2^{-k} < 1
• Since the number of unsatisfied clauses is a non-negative integer with expectation below 1, some assignment leaves every clause satisfied.
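Because a uniform assignment satisfies all clauses with probability at least 1 - m·2^{-k} > 0, simply retrying random assignments finds one quickly. The sketch below assumes the same signed-literal encoding as above; the instance shown is a made-up 3-Sat formula with 7 < 2³ clauses.

```python
import random

def satisfies_all(clauses, assignment):
    return all(any(assignment[abs(l)] == (l > 0) for l in clause) for clause in clauses)

def find_assignment(clauses, num_vars, max_tries=10000, rng=random):
    """Draw uniform assignments; each try succeeds with probability
    at least 1 - m * 2^(-k) > 0, so this terminates quickly."""
    for _ in range(max_tries):
        assignment = [None] + [rng.random() < 0.5 for _ in range(num_vars)]
        if satisfies_all(clauses, assignment):
            return assignment
    return None

if __name__ == "__main__":
    # 3-Sat instance with 7 (< 2^3) clauses over variables 1..4
    clauses = [[1, 2, 3], [-1, 2, 4], [1, -2, -3], [-1, -2, 4],
               [2, 3, -4], [-2, -3, -4], [1, 3, 4]]
    print(find_assignment(clauses, 4))
```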
The Deletion Method
(the “sample and modify” method)
We want to prove that a combinatorial object F exists.
1. At first we show that there exists an F' very “close” to F.
2. Then we change F' into F and show that the probability of existence remains positive.
Turán's Theorem*
Theorem: Let G(V,E) be a graph. If |V| = n and |E| = nk/2 then α(G) ≥ n/(2k).
Proof: Using probabilistic arguments we will prove the existence of a subset that has many more vertices than edges.
Deleting one endpoint of each edge inside that subset we get an independent set.
Let S be a subset of V containing each vertex independently with probability p (to be fixed later). We have E[|S|] = np.
Let G' be the subgraph induced by S.
For every e ∈ E define Y_e = 1 if e ∈ E(G') and Y_e = 0 otherwise. Then:
  E[Y_e] = p²
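A sketch of the sample-and-modify step on a toy graph (names and parameters are my own): sample vertices with probability p = 1/k, then delete one endpoint of every edge that survives, leaving an independent set whose expected size is at least np - (nk/2)p² = n/(2k).

```python
import random

def sample_and_modify(n, edges, p, rng=random):
    """Sample each vertex with probability p, then delete one endpoint of
    every surviving edge; what remains is an independent set."""
    S = {v for v in range(n) if rng.random() < p}
    for u, v in edges:
        if u in S and v in S:
            S.discard(u)  # removing either endpoint kills the edge
    return S

if __name__ == "__main__":
    # cycle on 8 vertices: n = 8, m = 8, average degree k = 2
    n = 8
    edges = [(i, (i + 1) % n) for i in range(n)]
    k = 2 * len(edges) / n
    sizes = [len(sample_and_modify(n, edges, 1 / k)) for _ in range(10000)]
    print(sum(sizes) / len(sizes), ">= bound", n / (2 * k))
```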