
CS70: Lecture 27. Coupon Collector's Problem. Time to Collect Coupons.



  1. CS70: Lecture 27. Coupon Collector's Problem. Time to collect coupons. Outline: 1. Time to collect coupons. 2. Review: independence of events. 3. Independent RVs. 4. Mutually independent RVs.

Coupons; Independent Random Variables. Experiment: get coupons at random from n possible coupons until you have collected all n. Outcomes: { 123145 ..., 56765 ... }. Random variable: X = length of the outcome, i.e., the number of coupons you get before you have all n. Before: Pr[X ≥ n ln(2n)] ≤ 1/2. Today: E[X]?

Time to Collect Coupons. X = time to get all n coupons. X_1 = time to get the first coupon. Note: X_1 = 1, so E[X_1] = 1. X_2 = time to get the second coupon after getting the first. Pr["get second coupon" | "got first coupon"] = (n − 1)/n. E[X_2]? Geometric! So E[X_2] = 1/p = n/(n − 1).

In general, Pr["get i-th coupon" | "got i − 1 coupons"] = (n − (i − 1))/n = (n − i + 1)/n, so X_i is geometric and E[X_i] = 1/p = n/(n − i + 1), for i = 1, 2, ..., n. By linearity,
E[X] = E[X_1] + ··· + E[X_n] = n/n + n/(n − 1) + n/(n − 2) + ··· + n/1 = n(1 + 1/2 + ··· + 1/n) =: nH(n) ≈ n(ln n + γ).

Review: Harmonic sum. H(n) = 1 + 1/2 + ··· + 1/n ≈ ∫_1^n (1/x) dx = ln(n). A better approximation is H(n) ≈ ln(n) + γ, where γ ≈ 0.58 is the Euler-Mascheroni constant.

Harmonic sum: Paradox. Consider a stack of cards (no glue!). If each card has length 2, the stack can extend H(n) to the right of the table. As n increases, you can go as far as you want!
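The nH(n) formula is easy to check numerically. Below is a minimal Python simulation (my own sketch, not part of the slides); the function name collect_all and the choices n = 50 and 10,000 trials are just illustrative.

```python
import math
import random

def collect_all(n):
    """Draw uniform coupons from {0, ..., n-1} until all n have been seen;
    return the number of draws (the random variable X)."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

n, trials = 50, 10_000
avg = sum(collect_all(n) for _ in range(trials)) / trials
H = sum(1 / i for i in range(1, n + 1))  # harmonic sum H(n)
print(f"empirical E[X]   ~ {avg:.1f}")
print(f"n * H(n)         = {n * H:.1f}")
print(f"n * (ln n + 0.58) ~ {n * (math.log(n) + 0.58):.1f}")
```

For n = 50 all three numbers come out near 225, matching nH(n) ≈ n(ln n + γ).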

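A short sketch of why the stacking paradox works, assuming the standard card-stacking argument the slides allude to: with cards of length 2, the k-th card from the top can stick out 1/k beyond the card below it, so the total overhang after n cards is H(n), which grows without bound, but only logarithmically.

```python
def overhang(n):
    """Maximum overhang of n cards of length 2: the k-th card from the top
    extends 1/k beyond the card below it, for a total of H(n)."""
    return sum(1 / k for k in range(1, n + 1))

# H(n) ~ ln(n) + 0.58, so any target overhang is reachable, just very slowly.
for n in (1, 10, 100, 10_000):
    print(f"{n:>6} cards: overhang ~ {overhang(n):.2f}")
```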
  2. Stacking. The cards have length 2. Induction shows that the center of gravity after n cards is H(n) away from the right-most edge.

Review: Independence of Events. Definition:
◮ Events A, B are independent if Pr[A ∩ B] = Pr[A]Pr[B].
◮ Events A, B, C are mutually independent if A, B are independent, A, C are independent, B, C are independent, and Pr[A ∩ B ∩ C] = Pr[A]Pr[B]Pr[C].
◮ Events {A_n, n ≥ 0} are mutually independent if ....
◮ Example: X, Y ∈ {0, 1} two fair coin flips ⇒ X, Y, X ⊕ Y are pairwise independent but not mutually independent.
◮ Example: X, Y, Z ∈ {0, 1} three fair coin flips are mutually independent.

Independent Random Variables. Definition: The random variables X and Y are independent if and only if Pr[Y = b | X = a] = Pr[Y = b], for all a and b. Fact: X, Y are independent if and only if Pr[X = a, Y = b] = Pr[X = a]Pr[Y = b], for all a and b. Obvious.

Independence: Examples.
Example 1: Roll two dice. X, Y = number of pips on the two dice. X, Y are independent. Indeed: Pr[X = a, Y = b] = 1/36 and Pr[X = a] = Pr[Y = b] = 1/6.
Example 2: Roll two dice. X = total number of pips, Y = number of pips on die 1 minus number on die 2. X and Y are not independent. Indeed: Pr[X = 12, Y = 1] = 0 ≠ Pr[X = 12]Pr[Y = 1] > 0.
Example 3: Flip a fair coin five times. X = number of Hs in the first three flips, Y = number of Hs in the last two flips. X and Y are independent. Indeed: Pr[X = a, Y = b] = C(3, a)C(2, b)2^(−5) = [C(3, a)2^(−3)] × [C(2, b)2^(−2)] = Pr[X = a]Pr[Y = b].

A useful observation about independence. Theorem: X and Y are independent if and only if Pr[X ∈ A, Y ∈ B] = Pr[X ∈ A]Pr[Y ∈ B] for all A, B ⊂ ℜ. Proof: If (⇐): choose A = {a} and B = {b}. This shows that Pr[X = a, Y = b] = Pr[X = a]Pr[Y = b]. Only if (⇒):
Pr[X ∈ A, Y ∈ B] = ∑_{a ∈ A} ∑_{b ∈ B} Pr[X = a, Y = b] = ∑_{a ∈ A} ∑_{b ∈ B} Pr[X = a]Pr[Y = b]
= [∑_{a ∈ A} Pr[X = a]][∑_{b ∈ B} Pr[Y = b]] = Pr[X ∈ A]Pr[Y ∈ B].

Functions of Independent Random Variables. Theorem: Functions of independent RVs are independent. Let X, Y be independent RVs. Then f(X) and g(Y) are independent, for all f(·), g(·). Proof: Recall the definition of the inverse image: h(z) ∈ C ⇔ z ∈ h^(−1)(C) := {z | h(z) ∈ C}. (1) Now,
Pr[f(X) ∈ A, g(Y) ∈ B] = Pr[X ∈ f^(−1)(A), Y ∈ g^(−1)(B)], by (1),
= Pr[X ∈ f^(−1)(A)]Pr[Y ∈ g^(−1)(B)], since X, Y are independent,
= Pr[f(X) ∈ A]Pr[g(Y) ∈ B], by (1).
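Example 3 is small enough to verify exhaustively. The following Python snippet (my own check, not from the lecture) enumerates all 32 outcomes and confirms the product rule for every pair (a, b):

```python
from itertools import product
from collections import Counter

# Enumerate all 2^5 equally likely outcomes of five fair coin flips (1 = heads).
flips = list(product((0, 1), repeat=5))
N = len(flips)

joint = Counter((sum(w[:3]), sum(w[3:])) for w in flips)  # (X, Y) counts
px = Counter(sum(w[:3]) for w in flips)                   # X counts
py = Counter(sum(w[3:]) for w in flips)                   # Y counts

# Pr[X = a, Y = b] = Pr[X = a] * Pr[Y = b] for every (a, b).
assert all(joint[a, b] / N == (px[a] / N) * (py[b] / N)
           for a in px for b in py)
print("X and Y are independent (verified over all 32 outcomes)")
```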

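The two-coin example is also easy to check by brute force; this sketch (mine, not the slides') confirms that X, Y, Z = X ⊕ Y are pairwise but not mutually independent:

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips; Z = X xor Y. Each of the four outcomes has probability 1/4.
outcomes = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]
p = Fraction(1, 4)

def pr(pred):
    return sum(p for o in outcomes if pred(o))

# Pairwise independence: every pair of coordinates satisfies the product rule.
for i, j in ((0, 1), (0, 2), (1, 2)):
    assert pr(lambda o: o[i] == 1 and o[j] == 1) == pr(lambda o: o[i] == 1) * pr(lambda o: o[j] == 1)

# Not mutually independent: Pr[X = 1, Y = 1, Z = 1] = 0, not (1/2)^3.
assert pr(lambda o: o == (1, 1, 1)) == 0
print("pairwise independent, but not mutually independent")
```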
  3. Mean of Product of Independent RVs. Theorem: Let X, Y be independent RVs. Then E[XY] = E[X]E[Y]. Proof: Recall that E[g(X, Y)] = ∑_{x,y} g(x, y)Pr[X = x, Y = y]. Hence,
E[XY] = ∑_{x,y} xy Pr[X = x, Y = y] = ∑_{x,y} xy Pr[X = x]Pr[Y = y], by independence,
= ∑_x [x Pr[X = x] (∑_y y Pr[Y = y])] = ∑_x [x Pr[X = x] E[Y]] = E[X]E[Y].

Examples.
(1) Assume that X, Y, Z are (pairwise) independent, with E[X] = E[Y] = E[Z] = 0 and E[X²] = E[Y²] = E[Z²] = 1. Then
E[(X + 2Y + 3Z)²] = E[X² + 4Y² + 9Z² + 4XY + 12YZ + 6XZ] = 1 + 4 + 9 + 4×0 + 12×0 + 6×0 = 14.
(2) Let X, Y be independent and U[1, 2, ..., n]. Then
E[(X − Y)²] = E[X² + Y² − 2XY] = 2E[X²] − 2E[X]² = (1 + 3n + 2n²)/3 − (n + 1)²/2.

Mutually Independent Random Variables. Definition: X, Y, Z are mutually independent if Pr[X = x, Y = y, Z = z] = Pr[X = x]Pr[Y = y]Pr[Z = z], for all x, y, z. Theorem: The events A, B, C, ... are pairwise (resp. mutually) independent iff the random variables 1_A, 1_B, 1_C, ... are pairwise (resp. mutually) independent. Proof: Pr[1_A = 1, 1_B = 1, 1_C = 1] = Pr[A ∩ B ∩ C], ...

Functions of Pairwise Independent RVs. If X, Y, Z are pairwise independent, but not mutually independent, it may be that f(X) and g(Y, Z) are not independent.
Example 1: Flip two fair coins. X = 1{coin 1 is H}, Y = 1{coin 2 is H}, Z = X ⊕ Y. Then X, Y, Z are pairwise independent. Let g(Y, Z) = Y ⊕ Z. Then g(Y, Z) = X is not independent of X.
Example 2: Let A, B, C be pairwise but not mutually independent in a way that A and B ∩ C are not independent. Let X = 1_A, Y = 1_B, Z = 1_C. Choose f(X) = X, g(Y, Z) = YZ.

A Little Lemma. Let X_1, X_2, ..., X_11 be mutually independent random variables. Define Y_1 = (X_1, ..., X_4), Y_2 = (X_5, ..., X_8), Y_3 = (X_9, ..., X_11). Then
Pr[Y_1 ∈ B_1, Y_2 ∈ B_2, Y_3 ∈ B_3] = Pr[Y_1 ∈ B_1]Pr[Y_2 ∈ B_2]Pr[Y_3 ∈ B_3].
Proof:
Pr[Y_1 ∈ B_1, Y_2 ∈ B_2, Y_3 ∈ B_3]
= ∑_{y_1 ∈ B_1, y_2 ∈ B_2, y_3 ∈ B_3} Pr[Y_1 = y_1, Y_2 = y_2, Y_3 = y_3]
= ∑_{y_1 ∈ B_1, y_2 ∈ B_2, y_3 ∈ B_3} Pr[Y_1 = y_1]Pr[Y_2 = y_2]Pr[Y_3 = y_3]
= {∑_{y_1 ∈ B_1} Pr[Y_1 = y_1]}{∑_{y_2 ∈ B_2} Pr[Y_2 = y_2]}{∑_{y_3 ∈ B_3} Pr[Y_3 = y_3]}
= Pr[Y_1 ∈ B_1]Pr[Y_2 ∈ B_2]Pr[Y_3 ∈ B_3].

Functions of Mutually Independent RVs. Theorem: Functions of disjoint collections of mutually independent random variables are mutually independent. Example: Let {X_n, n ≥ 1} be mutually independent. Then
Y_1 := X_1X_2(X_3 + X_4)², Y_2 := max{X_5, X_6} − min{X_7, X_8}, Y_3 := X_9 cos(X_10 + X_11)
are mutually independent. Proof: Let B_1 := {(x_1, x_2, x_3, x_4) | x_1x_2(x_3 + x_4)² ∈ A_1}, and similarly for B_2, B_3. Then
Pr[Y_1 ∈ A_1, Y_2 ∈ A_2, Y_3 ∈ A_3]
= Pr[(X_1, ..., X_4) ∈ B_1, (X_5, ..., X_8) ∈ B_2, (X_9, ..., X_11) ∈ B_3]
= Pr[(X_1, ..., X_4) ∈ B_1]Pr[(X_5, ..., X_8) ∈ B_2]Pr[(X_9, ..., X_11) ∈ B_3], by the little lemma,
= Pr[Y_1 ∈ A_1]Pr[Y_2 ∈ A_2]Pr[Y_3 ∈ A_3].
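Example (2) can be confirmed with exact rational arithmetic; the check below is my own illustration, enumerating all n² equally likely pairs:

```python
from fractions import Fraction
from itertools import product

def check(n):
    # X, Y independent and uniform on {1, ..., n}: enumerate all n^2 pairs.
    pairs = list(product(range(1, n + 1), repeat=2))
    lhs = sum(Fraction((x - y) ** 2) for x, y in pairs) / len(pairs)
    rhs = Fraction(1 + 3 * n + 2 * n * n, 3) - Fraction((n + 1) ** 2, 2)
    assert lhs == rhs
    return lhs

for n in (2, 5, 10):
    print(n, check(n))  # e.g. n = 2 gives E[(X - Y)^2] = 1/2
```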

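To see the grouping theorem in action, here is an exhaustive check with my own simpler, discrete choice of functions on disjoint groups of independent fair bits, standing in for the slides' Y_1, Y_2, Y_3 (which involve continuous operations):

```python
from itertools import product
from fractions import Fraction

# Six mutually independent fair bits, so each outcome has probability 1/64.
bits = list(product((0, 1), repeat=6))
p = Fraction(1, 64)

def Y(w):
    # Functions of disjoint groups: Y1 of (X1, X2), Y2 of (X3, X4), Y3 of (X5, X6).
    return (w[0] ^ w[1], max(w[2], w[3]), w[4] * w[5])

def pr(pred):
    return sum(p for w in bits if pred(w))

# Mutual independence: the joint distribution factorizes for every value triple.
for y1, y2, y3 in product((0, 1), repeat=3):
    joint = pr(lambda w: Y(w) == (y1, y2, y3))
    prod = (pr(lambda w: Y(w)[0] == y1) * pr(lambda w: Y(w)[1] == y2)
            * pr(lambda w: Y(w)[2] == y3))
    assert joint == prod
print("Y1, Y2, Y3 are mutually independent")
```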
  4. Operations on Mutually Independent Events. Theorem: Operations on disjoint collections of mutually independent events produce mutually independent events. For instance, if A, B, C, D, E are mutually independent, then A Δ B, C \ D, Ē are mutually independent. Proof:
1_{A Δ B} = f(1_A, 1_B) where f(0, 0) = 0, f(1, 0) = 1, f(0, 1) = 1, f(1, 1) = 0.
1_{C \ D} = g(1_C, 1_D) where g(0, 0) = 0, g(1, 0) = 1, g(0, 1) = 0, g(1, 1) = 0.
1_Ē = h(1_E) where h(0) = 1 and h(1) = 0.
Hence, 1_{A Δ B}, 1_{C \ D}, 1_Ē are functions of mutually independent RVs. Thus, those RVs are mutually independent. Consequently, the events of which they are indicators are mutually independent.

Product of Mutually Independent RVs. Theorem: Let X_1, ..., X_n be mutually independent RVs. Then E[X_1X_2···X_n] = E[X_1]E[X_2]···E[X_n]. Proof (by induction): Assume that the result is true for n. (It is true for n = 2.) Then, with Y = X_1···X_n, one has
E[X_1···X_nX_{n+1}] = E[YX_{n+1}] = E[Y]E[X_{n+1}], because Y and X_{n+1} are independent,
= E[X_1]···E[X_n]E[X_{n+1}].

Summary. Coupons; Independent Random Variables.
◮ Expected time to collect n coupons is nH(n) ≈ n(ln n + γ).
◮ X, Y independent ⇔ Pr[X ∈ A, Y ∈ B] = Pr[X ∈ A]Pr[Y ∈ B].
◮ Then f(X), g(Y) are independent and E[XY] = E[X]E[Y].
◮ Mutual independence ....
◮ Functions of mutually independent RVs are mutually independent.
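The A Δ B, C \ D, Ē example can also be checked by brute force over all 32 outcomes. In this sketch (my own, not from the slides) the event probabilities 1/2, 1/3, ... are arbitrary choices:

```python
from itertools import product
from fractions import Fraction

# Five mutually independent events A, ..., E with assorted probabilities.
probs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4), Fraction(1, 5), Fraction(1, 6)]

def weight(w):
    # Probability of one value of the indicator vector (1_A, ..., 1_E).
    r = Fraction(1)
    for bit, q in zip(w, probs):
        r *= q if bit else 1 - q
    return r

outcomes = list(product((0, 1), repeat=5))

def derived(w):
    # Indicators of A triangle B, C minus D, and the complement of E.
    a, b, c, d, e = w
    return (a ^ b, c & (1 - d), 1 - e)

def pr(pred):
    return sum(weight(w) for w in outcomes if pred(w))

# The joint distribution of the three derived indicators factorizes exactly.
for y in product((0, 1), repeat=3):
    joint = pr(lambda w: derived(w) == y)
    prod = (pr(lambda w: derived(w)[0] == y[0]) * pr(lambda w: derived(w)[1] == y[1])
            * pr(lambda w: derived(w)[2] == y[2]))
    assert joint == prod
print("A \u0394 B, C \\ D, and the complement of E are mutually independent")
```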
