Estimation III: Method of Moments and Maximum Likelihood
Stat 3202 @ OSU, Dalpiaz
A Standard Setup

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\lambda)$. That is,

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0$$

How should we estimate $\lambda$?
Population and Sample Moments

The $k$th population moment of a RV (about the origin) is

$$\mu_k' = E\left[Y^k\right]$$

The $k$th sample moment is

$$m_k' = \overline{Y^k} = \frac{1}{n}\sum_{i=1}^{n} Y_i^k$$
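A quick concrete reading of these definitions: for $k = 1$ the sample moment is just the sample mean, $m_1' = \frac{1}{n}\sum_{i=1}^{n} Y_i = \bar{Y}$, and for $k = 2$ it is $m_2' = \frac{1}{n}\sum_{i=1}^{n} Y_i^2$; the corresponding population moments are $\mu_1' = E[Y]$ and $\mu_2' = E[Y^2]$.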
The Method of Moments (MoM)

The Method of Moments (MoM) consists of equating sample moments and population moments. If a population has $t$ parameters, the MoM consists of solving the system of equations

$$m_k' = \mu_k', \quad k = 1, 2, \ldots, t$$

for the $t$ parameters.
Example: Poisson

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\lambda)$. That is,

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0$$

Find a method of moments estimator of $\lambda$, call it $\tilde{\lambda}$.
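One way to work this example: a Poisson($\lambda$) random variable has $\mu_1' = E[X] = \lambda$, so equating the first population and sample moments gives $\lambda = \frac{1}{n}\sum_{i=1}^{n} X_i$, and therefore $\tilde{\lambda} = \bar{X}$.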
Example: Normal, Two Unknowns

Let $X_1, X_2, \ldots, X_n$ be iid $N(\theta, \sigma^2)$.

Use the method of moments to estimate the parameter vector $(\theta, \sigma^2)$.
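A sketch of the solution: the first two population moments are $\mu_1' = E[X] = \theta$ and $\mu_2' = E[X^2] = \sigma^2 + \theta^2$. Equating them to $m_1' = \bar{X}$ and $m_2' = \frac{1}{n}\sum_{i=1}^{n} X_i^2$ and solving gives $\tilde{\theta} = \bar{X}$ and $\tilde{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$.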
Example: Normal, Mean Known

Let $X_1, X_2, \ldots, X_n$ be iid $N(1, \sigma^2)$.

Find a method of moments estimator of $\sigma^2$, call it $\tilde{\sigma}^2$.
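One approach, using the second moment (since the first-moment equation no longer involves $\sigma^2$): here $E[X^2] = \sigma^2 + 1^2$, so setting $\frac{1}{n}\sum_{i=1}^{n} X_i^2 = \sigma^2 + 1$ gives $\tilde{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - 1$.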
Calculus???
A Game Show / An Idea
Is a Coin Fair?

Let $Y \sim \text{binom}(n = 100, p)$. Suppose we observe a single observation $y = 60$.
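To make the idea concrete, here is a minimal numeric sketch in Python (the step size of the grid of candidate values for $p$ is an arbitrary choice): it evaluates the binomial likelihood of the observed count and reports which $p$ makes that count most probable.

# Evaluate the binomial likelihood L(p) = C(100, 60) p^60 (1 - p)^40
# over a grid of candidate p values and see which p makes y = 60 most probable.
from math import comb

n, y = 100, 60

def likelihood(p):
    return comb(n, y) * p**y * (1 - p)**(n - y)

grid = [i / 100 for i in range(1, 100)]        # p = 0.01, 0.02, ..., 0.99
best = max(grid, key=likelihood)
print(best)                                     # 0.6
print(likelihood(0.5), likelihood(0.6))         # a fair coin vs p = 0.6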
Log Rules

• $x^m x^n = x^{m+n}$
• $(x^m)^n = x^{mn}$
• $\log(ab) = \log(a) + \log(b)$
• $\log(a/b) = \log(a) - \log(b)$
• $\log(a^b) = b\log(a)$
• $\prod_{i=1}^{n} x_i = x_1 \cdot x_2 \cdots x_n$
• $\prod_{i=1}^{n} x_i^a = \left(\prod_{i=1}^{n} x_i\right)^a$
• $\log\left(\prod_{i=1}^{n} x_i\right) = \sum_{i=1}^{n} \log(x_i)$
Example: Poisson

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\lambda)$. That is,

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0$$

Find the maximum likelihood estimator of $\lambda$, call it $\hat{\lambda}$.
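A sketch of the derivation:

$$L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}, \qquad \log L(\lambda) = \left(\sum_{i=1}^{n} x_i\right)\log\lambda - n\lambda - \sum_{i=1}^{n}\log(x_i!)$$

Setting $\frac{d}{d\lambda}\log L(\lambda) = \frac{\sum x_i}{\lambda} - n = 0$ gives $\hat{\lambda} = \bar{X}$; the second derivative, $-\frac{\sum x_i}{\lambda^2}$, is negative whenever $\sum x_i > 0$, so this is a maximum.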
Example: Poisson

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\lambda)$. That is,

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0$$

Calculate the maximum likelihood estimate of $\lambda$ when $x_1 = 1, x_2 = 2, x_3 = 4, x_4 = 2$.
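Since $\hat{\lambda} = \bar{x}$, the estimate is $\hat{\lambda} = \frac{1 + 2 + 4 + 2}{4} = \frac{9}{4} = 2.25$. A minimal Python check of this value (the grid of candidate values is an arbitrary choice):

# The Poisson log-likelihood for the data 1, 2, 4, 2 is maximized at the sample mean.
from math import log, factorial

data = [1, 2, 4, 2]

def log_lik(lam):
    # log L(lambda) = sum_i [ x_i log(lambda) - lambda - log(x_i!) ]
    return sum(x * log(lam) - lam - log(factorial(x)) for x in data)

grid = [i / 100 for i in range(1, 1001)]   # lambda = 0.01, 0.02, ..., 10.00
print(max(grid, key=log_lik))              # 2.25, the sample mean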
Maximum Likelihood Estimation (MLE)

Given a random sample $X_1, X_2, \ldots, X_n$ from a population with parameter $\theta$ and density or mass $f(x \mid \theta)$, we have:

The Likelihood, $L(\theta)$:

$$L(\theta) = f(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f(x_i \mid \theta)$$

The Maximum Likelihood Estimator, $\hat{\theta}$:

$$\hat{\theta} = \underset{\theta}{\operatorname{argmax}}\ L(\theta) = \underset{\theta}{\operatorname{argmax}}\ \log L(\theta)$$

(The two maximizers coincide because $\log$ is strictly increasing.)
Invariance Principle

If $\hat{\theta}$ is the MLE of $\theta$ and the function $h(\theta)$ is continuous, then $h(\hat{\theta})$ is the MLE of $h(\theta)$.

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\lambda)$. That is,

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0$$

• Example: Find the maximum likelihood estimator of $P[X = 4]$, call it $\hat{P}[X = 4]$. Calculate an estimate using this estimator when $x_1 = 1, x_2 = 2, x_3 = 4, x_4 = 2$.
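A sketch using the invariance principle: since $h(\lambda) = P[X = 4] = \frac{\lambda^4 e^{-\lambda}}{4!}$ is continuous in $\lambda$ and $\hat{\lambda} = \bar{X}$, the MLE is $\hat{P}[X = 4] = \frac{\bar{X}^4 e^{-\bar{X}}}{4!}$. With the given data, $\bar{x} = 9/4 = 2.25$, so the estimate is $\frac{2.25^4 e^{-2.25}}{24} \approx 0.113$.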
Some Brief History
Who Is This?

Who Is This?
Another Example

Let $X_1, X_2, \ldots, X_n$ be iid from a population with pdf

$$f(x \mid \theta) = \frac{1}{\theta}\, x^{(1 - \theta)/\theta}, \quad 0 < x < 1, \quad 0 < \theta < \infty$$

Find the maximum likelihood estimator of $\theta$, call it $\hat{\theta}$.
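A sketch of the derivation: $\log L(\theta) = -n\log\theta + \frac{1 - \theta}{\theta}\sum_{i=1}^{n}\log x_i$. Differentiating, $\frac{d}{d\theta}\log L(\theta) = -\frac{n}{\theta} - \frac{1}{\theta^2}\sum\log x_i$, and setting this to zero gives $\hat{\theta} = -\frac{1}{n}\sum_{i=1}^{n}\log X_i$, which is positive because $\log x_i < 0$ for $0 < x_i < 1$.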
A Different Example

Let $X_1, X_2, \ldots, X_n$ be iid from a population with pdf

$$f(x \mid \theta) = \frac{\theta}{x^2}, \quad 0 < \theta \le x < \infty$$

Find the maximum likelihood estimator of $\theta$, call it $\hat{\theta}$.
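A sketch: for $\theta \le \min_i x_i$ the likelihood is $L(\theta) = \theta^n \prod_{i=1}^{n} x_i^{-2}$, and it is zero otherwise. As a function of $\theta$ this is increasing, so there is no interior critical point for calculus to find; the likelihood is maximized at the largest value of $\theta$ the support constraint allows, giving $\hat{\theta} = \min(X_1, \ldots, X_n) = X_{(1)}$.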
Example: Gamma

Let $X_1, X_2, \ldots, X_n \overset{\text{iid}}{\sim} \text{gamma}(\alpha, \beta)$ with $\alpha$ known.

Find the maximum likelihood estimator of $\beta$, call it $\hat{\beta}$.
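A sketch, assuming the scale parameterization $f(x \mid \alpha, \beta) = \frac{1}{\Gamma(\alpha)\beta^\alpha} x^{\alpha - 1} e^{-x/\beta}$ for $x > 0$ (if a rate parameterization is intended instead, the answer is the reciprocal): $\log L(\beta) = -n\log\Gamma(\alpha) - n\alpha\log\beta + (\alpha - 1)\sum\log x_i - \frac{1}{\beta}\sum x_i$. Setting $\frac{d}{d\beta}\log L(\beta) = -\frac{n\alpha}{\beta} + \frac{\sum x_i}{\beta^2} = 0$ gives $\hat{\beta} = \frac{\bar{X}}{\alpha}$.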
More Practice

Let $Y_1, Y_2, \ldots, Y_n$ be a random sample from a distribution with pdf

$$f(y \mid \alpha) = \frac{2}{\alpha} \cdot y \cdot \exp\left(\frac{-y^2}{\alpha}\right), \quad y > 0, \quad \alpha > 0.$$

Find the maximum likelihood estimator of $\alpha$.
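A sketch of the derivation: $\log L(\alpha) = n\log 2 - n\log\alpha + \sum\log y_i - \frac{1}{\alpha}\sum y_i^2$. Setting $\frac{d}{d\alpha}\log L(\alpha) = -\frac{n}{\alpha} + \frac{1}{\alpha^2}\sum y_i^2 = 0$ gives $\hat{\alpha} = \frac{1}{n}\sum_{i=1}^{n} Y_i^2$.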
More Practice

Suppose that a random variable $X$ follows a discrete distribution, which is determined by a parameter $\theta$ that can take only two values, $\theta = 1$ or $\theta = 2$. The parameter $\theta$ is unknown. If $\theta = 1$, then $X$ follows a Poisson distribution with parameter $\lambda = 2$. If $\theta = 2$, then $X$ follows a Geometric distribution with parameter $p = 0.25$. Now suppose we observe $X = 3$. Based on this data, what is the maximum likelihood estimate of $\theta$?
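A minimal Python sketch comparing the two candidate likelihoods at the observed value (this assumes the geometric distribution counts the trial of the first success, with support 1, 2, 3, ...; under a number-of-failures convention the geometric value changes slightly but the conclusion does not):

from math import exp, factorial

x = 3
lik_theta1 = exp(-2) * 2**x / factorial(x)  # Poisson(lambda = 2) pmf at x = 3
lik_theta2 = (1 - 0.25)**(x - 1) * 0.25     # Geometric(p = 0.25) pmf at x = 3

print(lik_theta1, lik_theta2)               # roughly 0.180 vs 0.141
print(1 if lik_theta1 > lik_theta2 else 2)  # the likelihood is larger under theta = 1

Either way the Poisson(2) model assigns higher probability to $X = 3$, so the maximum likelihood estimate is $\hat{\theta} = 1$.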
More More More

Let $Y_1, Y_2, \ldots, Y_n \overset{\text{iid}}{\sim} f(y \mid \theta)$, where

$$f(y \mid \theta) = \begin{cases} \dfrac{2\theta^2}{y^3} & \theta \le y < \infty \\ 0 & \text{otherwise} \end{cases}$$

Find the maximum likelihood estimator of $\theta$.
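A sketch: for $\theta \le \min_i y_i$ the likelihood is $L(\theta) = 2^n \theta^{2n} \prod_{i=1}^{n} y_i^{-3}$, which is increasing in $\theta$, so, as in the earlier example with a support constraint, the maximizer is $\hat{\theta} = \min(Y_1, \ldots, Y_n) = Y_{(1)}$.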
Next Time

• More examples?
• Why does this work?
• Why do we need both MLE and MoM?
• How do we use these methods in practice?