Detection and Estimation Theory
Lecture 8
Mojtaba Soltanalian - UIC
msol@uic.edu
http://msol.people.uic.edu
Based on ECE 531 Slides - 2011 (Prof. Natasha Devroye)
Finding MVUE - what we discussed
Finding MVUE - the new roadmap
Sufficient Statistics
Sufficient Statistics: Neyman-Fisher Factorization Theorem
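(Recap, not transcribed from the slide: the standard definition and statement.) A statistic T(x) is sufficient for θ if the conditional distribution of the data given T(x) = t does not depend on θ. The Neyman-Fisher factorization theorem gives a practical test: T(x) is sufficient for θ if and only if the pdf can be factored as

\[
p(\mathbf{x};\theta) \;=\; g\big(T(\mathbf{x}),\theta\big)\, h(\mathbf{x}),
\]

where g depends on the data only through T(x) and h(x) does not depend on θ.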
Sufficient Statistics and MVUE
Sufficient Statistics -- Completeness Example
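(The slide's example is not reproduced here; the following is the standard DC-level-in-WGN illustration, which may differ from the one used in the lecture.) A sufficient statistic T is complete if the only function v satisfying E_θ[v(T)] = 0 for all θ is v(T) = 0 (almost surely). For the DC level A in white Gaussian noise,

\[
x[n] = A + w[n], \quad w[n] \sim \mathcal{N}(0,\sigma^2), \qquad
T(\mathbf{x}) = \sum_{n=0}^{N-1} x[n] \sim \mathcal{N}(NA,\, N\sigma^2),
\]

\[
\mathbb{E}_A[v(T)] = \int_{-\infty}^{\infty} v(t)\,
\frac{1}{\sqrt{2\pi N\sigma^2}}\, e^{-\frac{(t-NA)^2}{2N\sigma^2}}\, dt = 0
\quad \forall A \;\;\Rightarrow\;\; v \equiv 0,
\]

so T is complete as well as sufficient for A (the implication holds because the integral is a convolution of v with a Gaussian kernel, which can vanish for every A only if v is identically zero).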
Sufficient Statistics -- MVUE Construction via Completeness
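(Recap of the standard construction.) If T is a complete sufficient statistic for θ, the MVUE can be obtained in either of two equivalent ways: (1) find a function g of T alone that is unbiased, i.e. E[g(T)] = θ for all θ, and take g(T) as the estimator; or (2) take any unbiased estimator and evaluate its conditional expectation given T. Completeness guarantees that at most one unbiased function of T exists, so both routes produce the same estimator. In the DC-level example above, g(T) = T/N (the sample mean) is unbiased and is therefore the MVUE of A.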
Rao-Blackwell-Lehmann-Scheffe (RBLS) Theorem
Rao-Blackwell-Lehmann-Scheffe (RBLS) Theorem
Remarks:
- Given any unbiased estimator that is not a function of a sufficient statistic, there exists a better estimator as far as variance is concerned, namely its conditional expectation given the sufficient statistic.
- “The conditional expectation averages out (or removes) non-informative components in the original estimator. We can view this as a filter that eliminates unnecessary components of the data.”
Rao-Blackwell-Lehmann-Scheffe (RBLS) Theorem
Proof (for the decrease in variance):
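(Sketch of the standard argument; the proof on the slide is not reproduced here.) Let θ̂ be any unbiased estimator and T a sufficient statistic, and define θ̌ = E[θ̂ | T]. Sufficiency guarantees that this conditional expectation does not depend on θ, so θ̌ is a valid estimator. Unbiasedness follows from iterated expectation, and the variance reduction from the law of total variance:

\[
\mathbb{E}[\check\theta] = \mathbb{E}\big[\mathbb{E}[\hat\theta \mid T]\big]
= \mathbb{E}[\hat\theta] = \theta,
\]

\[
\operatorname{var}(\hat\theta)
= \mathbb{E}\big[\operatorname{var}(\hat\theta \mid T)\big]
+ \operatorname{var}\big(\mathbb{E}[\hat\theta \mid T]\big)
\;\ge\; \operatorname{var}(\check\theta),
\]

with equality if and only if θ̂ is already (almost surely) a function of T.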
Rao-Blackwell-Lehmann-Scheffe (RBLS) Theorem
RBLS Theorem and the MVUE
The Rao-Blackwell theorem paves the way for decreasing the variance of an unbiased estimator. The question that remains: when do we know that we have obtained the MVUE?
Answer: when T is a complete sufficient statistic. In fact, the Lehmann-Scheffe theorem states that if T is complete, there is at most one unbiased estimator that is a function of T; this estimator is therefore the unique MVUE (UMVUE).
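A small numerical illustration of the Rao-Blackwell step (not from the slides; the Bernoulli setup, sample sizes, and variable names below are my own choices for this sketch): starting from the crude unbiased estimator that uses only the first sample, x[0], conditioning on the complete sufficient statistic T = sum_n x[n] gives E[x[0] | T] = T/N, the sample mean, whose variance is N times smaller.

import numpy as np

# Rao-Blackwellization demo (illustrative sketch, not from the lecture slides):
# estimate theta from N i.i.d. Bernoulli(theta) samples.
rng = np.random.default_rng(0)
theta, N, trials = 0.3, 20, 100_000

x = rng.binomial(1, theta, size=(trials, N))

# Crude unbiased estimator: use only the first sample.
theta_crude = x[:, 0].astype(float)

# Rao-Blackwellized estimator: E[x[0] | T] with T = sum_n x[n].
# By symmetry, E[x[0] | T = t] = t / N, i.e. the sample mean.
theta_rb = x.sum(axis=1) / N

print("crude: mean %.4f  var %.5f" % (theta_crude.mean(), theta_crude.var()))
print("RB   : mean %.4f  var %.5f" % (theta_rb.mean(), theta_rb.var()))
# Both estimators are (essentially) unbiased; the Rao-Blackwellized one has
# variance ~ theta*(1-theta)/N, a factor of N smaller than the crude one's.

Since T = sum_n x[n] is also complete for the Bernoulli parameter, the sample mean obtained here is the unique MVUE, consistent with the Lehmann-Scheffe statement above.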
RBLS Theorem and the MVUE
Let’s go back a little bit!
Vector Versions
Further Examples (see example 5.8)
Further Examples (see example 5.10)