18.175 Lecture 9: Borel-Cantelli and strong law
Scott Sheffield, MIT
Outline
▶ Laws of large numbers: Borel-Cantelli applications
▶ Strong law of large numbers
Borel-Cantelli lemmas
▶ First Borel-Cantelli lemma: If $\sum_{n=1}^{\infty} P(A_n) < \infty$, then $P(A_n \text{ i.o.}) = 0$.
▶ Second Borel-Cantelli lemma: If the $A_n$ are independent, then $\sum_{n=1}^{\infty} P(A_n) = \infty$ implies $P(A_n \text{ i.o.}) = 1$.
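▶ A standard illustration of the dichotomy: if $P(A_n) = n^{-2}$, then $\sum_{n=1}^{\infty} P(A_n) = \pi^2/6 < \infty$, so a.s. only finitely many $A_n$ occur. If instead the $A_n$ are independent with $P(A_n) = n^{-1}$, then $\sum_{n=1}^{\infty} P(A_n) = \infty$, so a.s. infinitely many $A_n$ occur, even though $P(A_n) \to 0$.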
Convergence in probability and subsequential a.s. convergence
▶ Theorem: $X_n \to X$ in probability if and only if every subsequence of the $X_n$ has a further subsequence converging a.s. to $X$.
▶ Main idea of proof: Consider the event $E_n$ that $X_n$ and $X$ differ by more than $\epsilon$. Do the $E_n$ occur i.o.? Use Borel-Cantelli.
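▶ To make the Borel-Cantelli step concrete in one direction (a standard argument sketch): given convergence in probability and any subsequence, one can pick indices $n_1 < n_2 < \ldots$ along it with
$$P\bigl(|X_{n_k} - X| > 2^{-k}\bigr) \le 2^{-k}, \qquad \text{so} \qquad \sum_{k=1}^{\infty} P\bigl(|X_{n_k} - X| > 2^{-k}\bigr) \le 1 < \infty,$$
and the first Borel-Cantelli lemma says that a.s. only finitely many of these events occur, which forces $X_{n_k} \to X$ a.s.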
Pairwise independence example
▶ Theorem: Suppose $A_1, A_2, \ldots$ are pairwise independent and $\sum P(A_n) = \infty$, and write $S_n = \sum_{i=1}^{n} 1_{A_i}$. Then the ratio $S_n / E S_n$ tends a.s. to 1.
▶ Main idea of proof: First, pairwise independence implies that variances add. Conclude (by checking term by term) that $\mathrm{Var}\, S_n \le E S_n$. Then Chebyshev implies
$$P(|S_n - E S_n| > \delta E S_n) \le \mathrm{Var}(S_n)/(\delta E S_n)^2 \to 0,$$
which gives us convergence in probability.
▶ Second, take a smart subsequence. Let $n_k = \inf\{n : E S_n \ge k^2\}$. Use Borel-Cantelli to get a.s. convergence along this subsequence. Check that convergence along this subsequence deterministically implies the non-subsequential convergence.
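▶ The "variances add" step, spelled out: pairwise independence makes the indicator covariances vanish, so
$$\mathrm{Var}(S_n) = \sum_{i=1}^{n} \mathrm{Var}(1_{A_i}) = \sum_{i=1}^{n} \bigl(P(A_i) - P(A_i)^2\bigr) \le \sum_{i=1}^{n} P(A_i) = E S_n.$$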
Outline
▶ Laws of large numbers: Borel-Cantelli applications
▶ Strong law of large numbers
General strong law of large numbers
▶ Theorem (strong law): If $X_1, X_2, \ldots$ are i.i.d. real-valued random variables with expectation $m$ and $A_n := n^{-1} \sum_{i=1}^{n} X_i$ are the empirical means, then $\lim_{n \to \infty} A_n = m$ almost surely.
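▶ As a quick numerical illustration (a minimal sketch; the distribution, seed, and sample sizes below are arbitrary choices, not part of the lecture), one can watch the running empirical means of a single i.i.d. sample path settle near the true mean:

    import numpy as np

    # Sketch of the strong law: running empirical means A_n of one i.i.d.
    # sample path drift toward the true expectation m.
    rng = np.random.default_rng(seed=0)
    m = 2.0                                          # true mean (exponential with mean 2)
    x = rng.exponential(scale=m, size=100_000)       # one sample path X_1, ..., X_N
    a = np.cumsum(x) / np.arange(1, x.size + 1)      # A_n = n^{-1} * (X_1 + ... + X_n)
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, a[n - 1])                           # A_n should approach m = 2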
Proof of strong law assuming $E[X^4] < \infty$
▶ Assume $K := E[X^4] < \infty$. Not necessary, but it simplifies the proof.
▶ Note: $\mathrm{Var}[X^2] = E[X^4] - E[X^2]^2 \ge 0$, so $E[X^2]^2 \le K$.
▶ The strong law holds for i.i.d. copies of $X$ if and only if it holds for i.i.d. copies of $X - \mu$ where $\mu$ is a constant. So we may as well assume $E[X] = 0$.
▶ Key to the proof is to bound the fourth moments of $A_n$.
▶ $E[A_n^4] = n^{-4} E[S_n^4] = n^{-4} E[(X_1 + X_2 + \ldots + X_n)^4]$.
▶ Expand $(X_1 + \ldots + X_n)^4$. Five kinds of terms: $X_i X_j X_k X_l$ and $X_i X_j X_k^2$ and $X_i X_j^3$ and $X_i^2 X_j^2$ and $X_i^4$.
▶ Terms of the first three kinds all have expectation zero. There are $\binom{n}{2}$ pairs giving terms of the fourth kind and $n$ terms of the last kind, each with expectation at most $K$. So $E[A_n^4] \le n^{-4}\bigl(6\binom{n}{2} + n\bigr)K$.
▶ Thus $E\bigl[\sum_{n=1}^{\infty} A_n^4\bigr] = \sum_{n=1}^{\infty} E[A_n^4] < \infty$. So $\sum_{n=1}^{\infty} A_n^4 < \infty$ (and hence $A_n \to 0$) with probability 1.
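▶ Spelling out why that last sum is finite:
$$E[A_n^4] \;\le\; n^{-4}\Bigl(6\binom{n}{2} + n\Bigr)K \;=\; \frac{3n(n-1) + n}{n^4}\,K \;\le\; \frac{3K}{n^2},$$
and $\sum_{n=1}^{\infty} 3K n^{-2} < \infty$, so (by Tonelli) $E\bigl[\sum_{n} A_n^4\bigr] < \infty$.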
General proof of strong law
▶ Suppose the $X_k$ are i.i.d. with finite mean $\mu$. Let $Y_k = X_k 1_{|X_k| \le k}$ and write $T_n = Y_1 + \ldots + Y_n$. Claim: $X_k = Y_k$ all but finitely often a.s., so it suffices to show $T_n/n \to \mu$. (Borel-Cantelli; the expectation of a positive r.v. is the area between its cdf and the line $y = 1$.)
▶ Claim: $\sum_{k=1}^{\infty} \mathrm{Var}(Y_k)/k^2 \le 4 E|X_1| < \infty$. How to prove it?
▶ Observe: $\mathrm{Var}(Y_k) \le E(Y_k^2) = \int_0^{\infty} 2y\, P(|Y_k| > y)\, dy \le \int_0^{k} 2y\, P(|X_1| > y)\, dy$. Use Fubini (interchange sum and integral; everything is positive):
$$\sum_{k=1}^{\infty} \frac{E(Y_k^2)}{k^2} \;\le\; \sum_{k=1}^{\infty} k^{-2} \int_0^{\infty} 1_{(y < k)}\, 2y\, P(|X_1| > y)\, dy \;=\; \int_0^{\infty} \sum_{k=1}^{\infty} k^{-2} 1_{(y < k)}\, 2y\, P(|X_1| > y)\, dy.$$
▶ Since $E|X_1| = \int_0^{\infty} P(|X_1| > y)\, dy$, complete the proof of the claim by showing that if $y \ge 0$ then $2y \sum_{k > y} k^{-2} \le 4$.
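▶ One way to verify that final inequality: if $0 \le y < 1$ then
$$2y \sum_{k > y} k^{-2} \le 2 \sum_{k=1}^{\infty} k^{-2} = \frac{\pi^2}{3} < 4,$$
while if $y \ge 1$ and $m = \lfloor y \rfloor + 1 > y$, then
$$\sum_{k > y} k^{-2} \le \frac{1}{m^2} + \int_m^{\infty} \frac{dx}{x^2} = \frac{1}{m^2} + \frac{1}{m} \le \frac{2}{m}, \qquad \text{so} \qquad 2y \sum_{k > y} k^{-2} \le \frac{4y}{m} < 4.$$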
General proof of strong law
▶ Claim: $\sum_{k=1}^{\infty} \mathrm{Var}(Y_k)/k^2 \le 4 E|X_1| < \infty$. How to use it?
▶ Consider the subsequence $k(n) = [\alpha^n]$ for arbitrary $\alpha > 1$. Using Chebyshev, if $\epsilon > 0$ then
$$\sum_{n=1}^{\infty} P\bigl(|T_{k(n)} - E T_{k(n)}| > \epsilon\, k(n)\bigr) \;\le\; \epsilon^{-2} \sum_{n=1}^{\infty} \frac{\mathrm{Var}(T_{k(n)})}{k(n)^2} \;=\; \epsilon^{-2} \sum_{n=1}^{\infty} k(n)^{-2} \sum_{m=1}^{k(n)} \mathrm{Var}(Y_m) \;=\; \epsilon^{-2} \sum_{m=1}^{\infty} \mathrm{Var}(Y_m) \sum_{n : k(n) \ge m} k(n)^{-2}.$$
▶ Sum the series: $\sum_{n : \alpha^n \ge m} [\alpha^n]^{-2} \le 4 \sum_{n : \alpha^n \ge m} \alpha^{-2n} \le 4(1 - \alpha^{-2})^{-1} m^{-2}$.
▶ Combine the computations (observe that the RHS below is finite):
$$\sum_{n=1}^{\infty} P\bigl(|T_{k(n)} - E T_{k(n)}| > \epsilon\, k(n)\bigr) \;\le\; 4(1 - \alpha^{-2})^{-1} \epsilon^{-2} \sum_{m=1}^{\infty} E(Y_m^2)\, m^{-2}.$$
▶ Since $\epsilon$ is arbitrary, Borel-Cantelli gives $(T_{k(n)} - E T_{k(n)})/k(n) \to 0$ a.s.
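▶ A sketch of how the argument is usually completed from here (the standard route, assuming one has first reduced to the case $X_k \ge 0$): dominated convergence gives $E Y_k \to \mu$, so $E T_{k(n)}/k(n) \to \mu$ and hence $T_{k(n)}/k(n) \to \mu$ a.s. along the subsequence. For $k(n) \le m \le k(n+1)$, monotonicity of $T$ gives
$$\frac{T_{k(n)}}{k(n+1)} \le \frac{T_m}{m} \le \frac{T_{k(n+1)}}{k(n)},$$
so a.s. $\mu/\alpha \le \liminf_m T_m/m \le \limsup_m T_m/m \le \alpha \mu$; letting $\alpha \downarrow 1$ along a countable sequence gives $T_m/m \to \mu$ a.s.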
MIT OpenCourseWare
http://ocw.mit.edu

18.175 Theory of Probability
Spring 2014

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.