

  1. Outline • Review • Practice Problems!

  2. Review Time! • Random Variables • Joint Distributions • Joint RV Statistics • Conditional Distribution • General Inference • Practice Problems!

  3. Probability Distributions

  4. Expectation & Variance • Discrete definition: E[X] = Σ_{x : p(x) > 0} x · p(x) • Continuous definition: Wait for it…

  5. Expectation & Variance • Discrete definition: E[X] = Σ_{x : p(x) > 0} x · p(x) • Continuous definition: Wait for it… • Properties of Expectation: E[X + Y] = E[X] + E[Y], E[aX + b] = aE[X] + b, E[g(X)] = Σ_x g(x) p(x) • Properties of Variance: Var(X) = E[(X − μ)²] = E[X²] − E[X]², Var(aX + b) = a² Var(X)
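A quick numeric check of these properties on a small made-up PMF (the support and probabilities below are arbitrary, not from the slides):

```python
# A small made-up PMF; the support and probabilities are illustrative only.
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]

E_X = sum(p * x for x, p in zip(xs, ps))          # E[X]   = sum of x * p(x)
E_X2 = sum(p * x ** 2 for x, p in zip(xs, ps))    # E[X^2] = sum of x^2 * p(x)
Var_X = E_X2 - E_X ** 2                           # Var(X) = E[X^2] - E[X]^2

a, b = 3.0, 7.0
E_aXb = sum(p * (a * x + b) for x, p in zip(xs, ps))
Var_aXb = sum(p * (a * x + b - E_aXb) ** 2 for x, p in zip(xs, ps))

assert abs(E_aXb - (a * E_X + b)) < 1e-9          # E[aX + b]   = a E[X] + b
assert abs(Var_aXb - a ** 2 * Var_X) < 1e-9       # Var(aX + b) = a^2 Var(X)
print(E_X, Var_X)                                 # about 2.1 and 0.49
```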

  6. All our (discrete) friends
• Ber(p): P(X = 1) = p, E[X] = p, Var(X) = p(1 − p) (1 experiment with probability p of success)
• Bin(n, p): P(X = k) = (n choose k) p^k (1 − p)^(n−k), E[X] = np, Var(X) = np(1 − p) (number of successes in n independent trials with probability p of success)
• Poi(λ): P(X = k) = λ^k e^(−λ) / k!, E[X] = λ, Var(X) = λ (number of successes over an experiment duration, λ rate of success)
• Geo(p): P(X = k) = (1 − p)^(k−1) p, E[X] = 1/p, Var(X) = (1 − p)/p² (number of independent trials until the first success)
• NegBin(r, p): P(X = k) = (k−1 choose r−1) p^r (1 − p)^(k−r), E[X] = r/p, Var(X) = r(1 − p)/p² (number of independent trials until r successes)
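A sanity check of a few of these PMF, mean, and variance formulas, assuming scipy is available; the parameter values are arbitrary examples:

```python
from scipy import stats

n, p, lam = 10, 0.3, 4.0   # arbitrary example parameters

# Binomial: E[X] = np, Var(X) = np(1 - p)
X = stats.binom(n, p)
assert abs(X.mean() - n * p) < 1e-9
assert abs(X.var() - n * p * (1 - p)) < 1e-9

# Poisson: E[X] = Var(X) = lambda
Y = stats.poisson(lam)
assert abs(Y.mean() - lam) < 1e-9 and abs(Y.var() - lam) < 1e-9

# Geometric (number of trials until the first success): E[X] = 1/p, Var(X) = (1 - p)/p^2
Z = stats.geom(p)
assert abs(Z.mean() - 1 / p) < 1e-9
assert abs(Z.var() - (1 - p) / p ** 2) < 1e-9

print(X.pmf(3), Y.pmf(2), Z.pmf(1))   # P(X = 3), P(Y = 2), P(Z = 1) = p
```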

  7. Probability Distributions

  8. All our (continuous) friends • For continuous RVs, we need to calculate the PDF instead of the PMF • PDF for RV X: f(x) ≥ 0 for −∞ < x < ∞, such that P(a ≤ X ≤ b) = ∫_a^b f(x) dx

  9. Expectation & Variance • Discrete definition: E[X] = Σ_{x : p(x) > 0} x · p(x) • Continuous definition: E[X] = ∫_{−∞}^{∞} x · f(x) dx

  10. Expectation & Variance • Discrete definition: E[X] = Σ_{x : p(x) > 0} x · p(x) • Continuous definition: E[X] = ∫_{−∞}^{∞} x · f(x) dx • Properties of Expectation: E[X + Y] = E[X] + E[Y], E[aX + b] = aE[X] + b, E[g(X)] = Σ_x g(x) p(x) • Properties of Variance: Var(X) = E[(X − μ)²] = E[X²] − E[X]², Var(aX + b) = a² Var(X)

  11. All our (continuous) friends
• Uni(α, β): f(x) = 1/(β − α) for α ≤ x ≤ β, E[X] = (α + β)/2, Var(X) = (β − α)²/12
• Exp(λ): f(x) = λe^(−λx), F(x) = 1 − e^(−λx), E[X] = 1/λ, Var(X) = 1/λ² (duration of time until success occurs; λ is the rate of success)
• N(μ, σ²): f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)), P(a ≤ X ≤ b) = Φ((b − μ)/σ) − Φ((a − μ)/σ), E[X] = μ, Var(X) = σ²
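The same kind of check for the continuous distributions, again assuming scipy and using arbitrary example parameters:

```python
import math
from scipy import stats

alpha, beta, lam, mu, sigma = 2.0, 6.0, 0.5, 1.0, 3.0   # arbitrary example parameters

# Uni(alpha, beta): E[X] = (alpha + beta)/2, Var(X) = (beta - alpha)^2 / 12
U = stats.uniform(loc=alpha, scale=beta - alpha)
assert abs(U.mean() - (alpha + beta) / 2) < 1e-9
assert abs(U.var() - (beta - alpha) ** 2 / 12) < 1e-9

# Exp(lam): F(x) = 1 - e^(-lam*x), E[X] = 1/lam, Var(X) = 1/lam^2
E = stats.expon(scale=1 / lam)
assert abs(E.cdf(3.0) - (1 - math.exp(-lam * 3.0))) < 1e-9
assert abs(E.var() - 1 / lam ** 2) < 1e-9

# N(mu, sigma^2): P(a <= X <= b) = Phi((b - mu)/sigma) - Phi((a - mu)/sigma)
N = stats.norm(mu, sigma)
a, b = 0.0, 2.0
std_normal = stats.norm()   # Phi is the standard normal CDF
lhs = N.cdf(b) - N.cdf(a)
rhs = std_normal.cdf((b - mu) / sigma) - std_normal.cdf((a - mu) / sigma)
assert abs(lhs - rhs) < 1e-9
print(U.mean(), E.mean(), N.var())   # 4.0  2.0  9.0
```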

  12. Approximations • When can we approximate a binomial? • Poisson: n > 20 and p is small, with λ = np moderate (e.g. n > 20 and p < 0.05, or n > 100 and p < 0.1); slight dependence between trials is ok • Normal: n > 20 and p is moderate, with np(1 − p) > 10; trials must be independent
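A small illustration of the Poisson approximation, with parameters chosen to satisfy the "large n, small p" rule of thumb (not taken from the slides):

```python
from scipy import stats

# Bin(n, p) with large n and small p, against Poi(lambda = np).
n, p = 100, 0.04
lam = n * p
B = stats.binom(n, p)
P = stats.poisson(lam)

for k in range(8):
    print(k, round(B.pmf(k), 4), round(P.pmf(k), 4))
# With n this large and p this small the two columns differ by only a few thousandths.
```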

  13. Continuity correction
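The slide presumably illustrates the correction with a figure; a minimal numeric sketch, using an arbitrary Bin(100, 0.5) example:

```python
import math
from scipy import stats

# X ~ Bin(100, 0.5) is approximately N(np, np(1 - p)); estimate P(X <= 55) both ways.
n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

exact = stats.binom(n, p).cdf(55)                # exact binomial probability
no_correction = stats.norm(mu, sigma).cdf(55)    # naive normal approximation
corrected = stats.norm(mu, sigma).cdf(55.5)      # continuity correction: P(X <= 55) ~ P(N <= 55.5)

print(round(exact, 4), round(no_correction, 4), round(corrected, 4))
# The corrected value (~0.864) is much closer to the exact answer than the naive one (~0.841).
```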

  14. Joint Distributions – Discrete • Joint PMF: p_{X,Y}(a, b) = P(X = a, Y = b) • Marginal PMF: p_X(a) = Σ_y p_{X,Y}(a, y)

  15. Multinomial RVs • Joint PMF: P(X_1 = c_1, X_2 = c_2, …, X_m = c_m) = (n choose c_1, c_2, …, c_m) · p_1^{c_1} p_2^{c_2} ⋯ p_m^{c_m}, where Σ_{i=1}^{m} c_i = n • Generalizes the Binomial RV
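A quick check of the multinomial PMF against scipy, with arbitrary example counts and probabilities:

```python
from math import factorial
from scipy import stats

# n trials over m = 3 outcomes; the counts and probabilities are arbitrary examples.
n = 6
probs = [0.5, 0.3, 0.2]
counts = [3, 2, 1]

# Direct formula: n! / (c1! c2! c3!) * p1^c1 * p2^c2 * p3^c3
coefficient = factorial(n) // (factorial(3) * factorial(2) * factorial(1))
direct = coefficient * 0.5 ** 3 * 0.3 ** 2 * 0.2 ** 1

assert abs(stats.multinomial(n, probs).pmf(counts) - direct) < 1e-9
print(direct)   # 0.135
```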

  16. Independent Discrete RVs • Two discrete random variables X and Y are independent if for all x, y: P(X = x, Y = y) = P(X = x) P(Y = y) • Sum of independent Binomials: X + Y ~ Bin(n_1 + n_2, p) • Sum of independent Poisson RVs: X + Y ~ Poi(λ_1 + λ_2)
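A numeric confirmation of the Poisson closure property by direct convolution (the rates are arbitrary):

```python
from scipy import stats

# If X ~ Poi(l1) and Y ~ Poi(l2) are independent, then X + Y ~ Poi(l1 + l2).
l1, l2 = 2.0, 3.0
X, Y, S = stats.poisson(l1), stats.poisson(l2), stats.poisson(l1 + l2)

for k in range(6):
    # P(X + Y = k) by summing over the ways to split k between X and Y
    convolution = sum(X.pmf(i) * Y.pmf(k - i) for i in range(k + 1))
    assert abs(convolution - S.pmf(k)) < 1e-9
print("X + Y matches Poi(l1 + l2) for k = 0..5")
```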

  17. Covariance • Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]

  18. Covariance • Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y] • How do you calculate the variance of a sum of two RVs? Var(X + Y) = Var(X) + 2 · Cov(X, Y) + Var(Y)

  19. Covariance • Var(X + Y) = Var(X) + 2 · Cov(X, Y) + Var(Y) • When X and Y are independent: Var(X + Y) = Var(X) + Var(Y) • Note: when we only know Cov(X, Y) = 0, we can't assume X and Y are independent

  20. Correlation • Correlation of X and Y: ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y) • Note: −1 ≤ ρ(X, Y) ≤ 1 • Measures the linear relationship between X and Y
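A short numpy sketch that estimates Cov, ρ, and the Var(X + Y) identity from simulated data; the particular choice of X and Y here is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative choice only: Y = X plus independent noise, so X and Y are positively correlated.
X = rng.normal(size=100_000)
Y = X + rng.normal(size=100_000)

cov = np.mean(X * Y) - X.mean() * Y.mean()     # Cov(X, Y) = E[XY] - E[X]E[Y]
rho = cov / (X.std() * Y.std())                # correlation rho(X, Y)

print(round(rho, 3))                           # close to 1/sqrt(2), about 0.707
# Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y) holds for the sample moments as well
assert abs(np.var(X + Y) - (np.var(X) + 2 * cov + np.var(Y))) < 1e-6
```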

  21. Conditional Distribution • Conditional PMF for discrete X given Y: P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) • Conditional Expectation: E[X | Y = y] = Σ_x x · P(X = x | Y = y)

  22. Conditional Distribution • Law of Total Expectation: E[E[X | Y]] = Σ_y P(Y = y) · E[X | Y = y] = E[X] • Stay tuned!
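A tiny worked example, on a made-up joint PMF, of the marginal (slide 14), the conditional PMF, the conditional expectation, and the law of total expectation:

```python
# A made-up joint PMF p(x, y); the numbers are arbitrary but sum to 1.
joint = {
    (1, 0): 0.10, (1, 1): 0.20,
    (2, 0): 0.30, (2, 1): 0.15,
    (3, 0): 0.05, (3, 1): 0.20,
}
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}            # marginal PMF of Y
E_X = sum(x * sum(joint[(x, y)] for y in ys) for x in xs)        # E[X] from the marginal of X

def cond_expectation(y):
    # E[X | Y = y] = sum over x of x * P(X = x | Y = y)
    return sum(x * joint[(x, y)] / p_Y[y] for x in xs)

# Law of total expectation: sum over y of P(Y = y) * E[X | Y = y] equals E[X]
total = sum(p_Y[y] * cond_expectation(y) for y in ys)
assert abs(total - E_X) < 1e-9
print(E_X)   # 1.95
```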

  23. General Inference

  24. General Inference

  25. General Inference Bayesian Networks

  26. Practice Time • Quiz Logistics and Coverage • Random Variables • Joint Distributions • Joint RV Statistics • Conditional Distribution • General Inference • Practice Problems!

  27. Practice Problems • 500-year floodplains (“a previous exam” on the website) • The Huffmeister floodplain in Houston has historically been estimated to flood at an average rate of 1 flood for every 500 years. • What is the probability of observing at least 3 floods in 500 years? • What is the probability that a flood will occur within the next 100 years? • What is the expected number of years until the next flood?

  28. Practice Problems • What is the probability of observing at least 3 floods in 500 years? • Poisson with λ = 1 (flood per 500 years) • P(X ≥ 3) = 1 – P(X < 3) = 1 – (sum of P(X = i) for i = 0 to 2) = 1 – 5/(2e) ≈ 0.08 • What is the probability that a flood will occur within the next 100 years? • Exponential with λ = 1/500 (per year) • F(100) = 1 – e^(-100/500) = 1 – e^(-0.2) ≈ 0.18 • What is the expected number of years until the next flood? • Expectation for an exponential RV is 1/λ = 500 years
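The same numbers computed directly (assuming scipy for the Poisson CDF):

```python
import math
from scipy import stats

# At least 3 floods in 500 years, with floods ~ Poi(lambda = 1 per 500 years)
p_at_least_3 = 1 - stats.poisson(1).cdf(2)
assert abs(p_at_least_3 - (1 - 5 / (2 * math.e))) < 1e-9
print(round(p_at_least_3, 4))               # about 0.0803

# A flood within the next 100 years, waiting time ~ Exp(lambda = 1/500 per year)
print(round(1 - math.exp(-100 / 500), 4))   # 1 - e^(-0.2), about 0.1813

# Expected number of years until the next flood: 1 / lambda
print(1 / (1 / 500))                        # 500.0
```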

  29. Practice Problems • Recursive Code Problem • Consider the following recursive function
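The code itself is not reproduced in this transcript; the sketch below is a hypothetical reconstruction inferred from the recurrences worked on the following slides, together with a Monte Carlo check of the answers:

```python
import random

# Hypothetical reconstruction: each branch below is read off the recurrences
# E[X] = 1/4(2 + 4 + E[6 + X] + E[8 + X]) and E[Y] = 1/3(2 + E[2 + X] + E[4 + Y]).
def near():
    # X: with probability 1/4 each, return 2, 4, 6 + Near(), or 8 + Near()
    branch = random.randint(1, 4)
    if branch == 1:
        return 2
    if branch == 2:
        return 4
    if branch == 3:
        return 6 + near()
    return 8 + near()

def far():
    # Y: with probability 1/3 each, return 2, 2 + Near(), or 4 + Far()
    branch = random.randint(1, 3)
    if branch == 1:
        return 2
    if branch == 2:
        return 2 + near()
    return 4 + far()

# Monte Carlo check against the answers derived on the following slides:
# E[X] = 10, E[Y] = 9, Var(Y) = 87.
ys = [far() for _ in range(200_000)]
mean_y = sum(ys) / len(ys)
var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)
mean_x = sum(near() for _ in range(200_000)) / 200_000
print(round(mean_x, 2), round(mean_y, 2), round(var_y, 1))   # roughly 10, 9, 87
```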

  30. Practice Problems What is E[Y]?

  31. Practice Problems What is E[Y]? First notice that Far() is calculated based on Near()

  32. Practice Problems Probability for Far() is based on Near(), so calculate E[X]
E[X] = 1/4 (2 + 4 + E[6 + X] + E[8 + X])
     = 1/4 (2 + 4 + 6 + E[X] + 8 + E[X])
     = 1/4 (20 + 2E[X])
     = 5 + 1/2 E[X]
So, E[X] = 10

  33. Practice Problems Now we are ready to calculate E[Y]
E[Y] = 1/3 (2 + E[2 + X] + E[4 + Y])
     = 1/3 (2 + 2 + E[X] + 4 + E[Y])
     = 1/3 (8 + E[X] + E[Y])
     = 1/3 (8 + 10 + E[Y])
     = 18/3 + 1/3 E[Y]
So, E[Y] = 9

  34. Practice Problems What is Var[Y]?

  35. Practice Problems Calculate E[X^2]
E[X^2] = 1/4 (2^2 + 4^2 + E[(6 + X)^2] + E[(8 + X)^2])
       = 1/4 (4 + 16 + 36 + 12E[X] + E[X^2] + 64 + 16E[X] + E[X^2])
       = 1/4 (120 + 28E[X] + 2E[X^2])
       = 1/4 (120 + 28(10) + 2E[X^2])
       = 1/4 (400 + 2E[X^2])
       = 100 + 1/2 E[X^2]
So, E[X^2] = 2(100) = 200

  36. Practice Problems Calculate E[Y^2]
E[Y^2] = 1/3 (2^2 + E[(2 + X)^2] + E[(4 + Y)^2])
       = 1/3 (4 + 4 + 4E[X] + E[X^2] + 16 + 8E[Y] + E[Y^2])
       = 1/3 (24 + 40 + E[X^2] + 8(9) + E[Y^2])
       = 1/3 (136 + 200 + E[Y^2])
       = 1/3 (336 + E[Y^2])
So, E[Y^2] = 336/2 = 168

  37. Practice Problems Now that we have E[X^2] and E[Y^2], we are ready to calculate Var(Y) Var(Y)= E[Y^2 ] – E[Y]^2 = 168 – (9)^2 = 168 – 81 = 87
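As a final check of the algebra, the same linear recurrences can be solved numerically:

```python
# Each quantity satisfies a linear equation of the form q = c + r * q, so q = c / (1 - r).
EX = 5 / (1 - 1 / 2)                                  # E[X]   = 5 + (1/2) E[X]
EX2 = (120 + 28 * EX) / 4 / (1 - 2 / 4)               # E[X^2] = (120 + 28 E[X] + 2 E[X^2]) / 4
EY = (8 + EX) / 3 / (1 - 1 / 3)                       # E[Y]   = (8 + E[X] + E[Y]) / 3
EY2 = (24 + 4 * EX + EX2 + 8 * EY) / 3 / (1 - 1 / 3)  # E[Y^2] = (24 + 4E[X] + E[X^2] + 8E[Y] + E[Y^2]) / 3
print(EX, EX2, EY, EY2, EY2 - EY ** 2)                # 10, 200, 9, 168, and Var(Y) = 87
```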

  38. Good Luck!!!
