Multiple Random Variables: Joint Probability Density


  1. Multiple Random Variables

  2. Joint Probability Density — Let X and Y be two random variables. Their joint distribution function is $F_{XY}(x,y) \equiv P[X \le x \cap Y \le y]$. Its properties:
     $0 \le F_{XY}(x,y) \le 1, \quad -\infty < x < \infty, \; -\infty < y < \infty$
     $F_{XY}(x,-\infty) = F_{XY}(-\infty,y) = F_{XY}(-\infty,-\infty) = 0$
     $F_{XY}(\infty,\infty) = 1$
     $F_{XY}(x,y)$ does not decrease if either $x$ or $y$ increases, or both increase
     $F_{XY}(\infty,y) = F_Y(y)$ and $F_{XY}(x,\infty) = F_X(x)$

  3. Joint Probability Density — Joint distribution function for tossing two dice.
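     A minimal sketch (assuming two fair, independent dice, the usual setup) that tabulates the joint PMF and evaluates the joint distribution function $F_{XY}(x,y) = P[X \le x \cap Y \le y]$ for this case:

```python
import numpy as np

faces = np.arange(1, 7)
pmf = np.full((6, 6), 1.0 / 36.0)   # every ordered pair (i, j) has probability 1/36

def F_XY(x, y):
    """Joint CDF: sum the PMF over all faces i <= x and j <= y."""
    mask = (faces[:, None] <= x) & (faces[None, :] <= y)
    return pmf[mask].sum()

print(F_XY(3, 4))     # 12/36 = 1/3
print(F_XY(6, 6))     # 1.0  (F_XY(inf, inf) = 1)
print(F_XY(0.5, 6))   # 0.0  (x below the smallest face gives probability 0)
```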

  4. Joint Probability Density
     $f_{XY}(x,y) = \dfrac{\partial^2}{\partial x\,\partial y} F_{XY}(x,y)$
     $f_{XY}(x,y) \ge 0, \quad -\infty < x < \infty, \; -\infty < y < \infty$
     $F_{XY}(x,y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(\alpha,\beta)\, d\beta\, d\alpha$ and $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\, dy\, dx = 1$
     $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\, dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\, dx$
     $P[x_1 < X \le x_2,\; y_1 < Y \le y_2] = \int_{y_1}^{y_2}\int_{x_1}^{x_2} f_{XY}(x,y)\, dx\, dy$
     $E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{XY}(x,y)\, dx\, dy$
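     As a numerical illustration of these relations, the sketch below uses an assumed joint PDF $f_{XY}(x,y) = e^{-x}e^{-y}$ for $x, y \ge 0$ (independent unit exponentials, not a density taken from this slide) and checks the normalization, a rectangle probability, and a marginal by quadrature:

```python
from scipy.integrate import dblquad, quad
import numpy as np

# Assumed example joint PDF: X, Y independent Exp(1), so f_XY(x, y) = e^-x e^-y.
f_XY = lambda x, y: np.exp(-x) * np.exp(-y)

# Total probability over the whole support should be 1.
total, _ = dblquad(lambda y, x: f_XY(x, y), 0, np.inf, lambda x: 0, lambda x: np.inf)

# P(1 < X <= 2, 0 < Y <= 3) as a double integral of the joint PDF.
p_rect, _ = dblquad(lambda y, x: f_XY(x, y), 1, 2, lambda x: 0, lambda x: 3)

# Marginal f_X(x) = integral over y of f_XY(x, y), checked at x = 1.
f_X_at_1, _ = quad(lambda y: f_XY(1.0, y), 0, np.inf)

print(total, p_rect, f_X_at_1)   # ~1.0, ~0.221, ~0.368 (= e^-1)
```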

  5. Combinations of Two Random Variables — Example: X and Y are independent, identically distributed (i.i.d.) random variables with common PDF
     $f_X(x) = e^{-x}\,u(x), \qquad f_Y(y) = e^{-y}\,u(y)$
     Find the PDF of $Z = X/Y$. Since X and Y are never negative, Z is never negative.
     $F_Z(z) = P[X/Y \le z] \;\Rightarrow\; F_Z(z) = P[X \le zY \cap Y > 0] + P[X \ge zY \cap Y < 0]$
     Since Y is never negative, $F_Z(z) = P[X \le zY \cap Y > 0]$.

  6. Combinations of Two Random Variables
     $F_Z(z) = \int_{-\infty}^{\infty}\int_{-\infty}^{zy} f_{XY}(x,y)\, dx\, dy = \int_{0}^{\infty}\int_{0}^{zy} e^{-x} e^{-y}\, dx\, dy$
     Using Leibniz's formula for differentiating an integral,
     $\dfrac{d}{dz}\int_{a(z)}^{b(z)} g(x,z)\, dx = g\big(b(z),z\big)\dfrac{db(z)}{dz} - g\big(a(z),z\big)\dfrac{da(z)}{dz} + \int_{a(z)}^{b(z)} \dfrac{\partial g(x,z)}{\partial z}\, dx$
     $f_Z(z) = \dfrac{\partial}{\partial z} F_Z(z) = \int_{0}^{\infty} y\, e^{-zy} e^{-y}\, dy, \; z > 0 \;\Rightarrow\; f_Z(z) = \dfrac{u(z)}{(z+1)^2}$
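     A quick Monte Carlo check of this result (a sketch, not part of the slides): the analytic CDF implied by $f_Z(z) = u(z)/(z+1)^2$ is $F_Z(z) = z/(z+1)$, which the empirical CDF of simulated $X/Y$ should match.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=1_000_000)   # X ~ Exp(1)
y = rng.exponential(size=1_000_000)   # Y ~ Exp(1), independent of X
z = x / y

# Compare the empirical CDF of Z = X/Y with the analytic F_Z(z) = z/(z+1).
for z0 in (0.5, 1.0, 3.0):
    print(z0, (z <= z0).mean(), z0 / (z0 + 1.0))
```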

  7. Combinations of Two Random Variables

  8. Combinations of Two Random Variables — Example: The joint PDF of X and Y is defined as
     $f_{XY}(x,y) = \begin{cases} 6x, & x \ge 0,\; y \ge 0,\; x + y \le 1 \\ 0, & \text{otherwise} \end{cases}$
     Define $Z = X - Y$. Find the PDF of Z.

  9. Combinations of Two Random Variables — Given the constraints on X and Y, $-1 \le Z \le 1$. The line $Z = X - Y$ intersects $X + Y = 1$ at $X = \dfrac{1+Z}{2}, \; Y = \dfrac{1-Z}{2}$.
     For $0 \le z \le 1$,
     $F_Z(z) = 1 - \int_{0}^{(1-z)/2}\int_{y+z}^{1-y} 6x\, dx\, dy = 1 - \int_{0}^{(1-z)/2} 3\big[(1-y)^2 - (y+z)^2\big]\, dy = 1 - \tfrac{3}{4}(1+z)(1-z)^2$
     $\Rightarrow\; f_Z(z) = \tfrac{3}{4}(1-z)(1+3z)$

  10. Combinations of Two Random Variables — For $-1 \le z \le 0$, the two pieces of the region $X - Y \le z$ inside the triangle (above and below $y = (1-z)/2$) contribute equally, so
     $F_Z(z) = 2\int_{-z}^{(1-z)/2}\int_{0}^{y+z} 6x\, dx\, dy = 6\int_{-z}^{(1-z)/2} (y+z)^2\, dy = \dfrac{(1+z)^3}{4}$
     $\Rightarrow\; f_Z(z) = \dfrac{3(1+z)^2}{4}$
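     A Monte Carlo check of the piecewise result from Slides 9-10 (a sketch; the sampling scheme uses the fact that the marginal of X is $6x(1-x)$ and that Y given $X = x$ is uniform on $[0, 1-x]$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.beta(2, 2, size=n)          # marginal f_X(x) = 6x(1 - x) on [0, 1]
y = rng.uniform(0.0, 1.0 - x)       # Y | X = x is uniform on [0, 1 - x]
z = x - y

def F_Z(z0):
    """Piecewise analytic CDF from Slides 9-10."""
    if z0 <= 0.0:
        return (1.0 + z0) ** 3 / 4.0
    return 1.0 - 0.75 * (1.0 + z0) * (1.0 - z0) ** 2

for z0 in (-0.5, 0.0, 0.5):
    print(z0, (z <= z0).mean(), F_Z(z0))
```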

  11. Joint Probability Density
     Let $f_{XY}(x,y) = \dfrac{1}{w_X w_Y}\,\mathrm{rect}\!\left(\dfrac{x - X_0}{w_X}\right)\mathrm{rect}\!\left(\dfrac{y - Y_0}{w_Y}\right)$. Then
     $E[X] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\, f_{XY}(x,y)\, dx\, dy = X_0, \qquad E[Y] = Y_0$
     $E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{XY}(x,y)\, dx\, dy = X_0 Y_0$
     $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\, dy = \dfrac{1}{w_X}\,\mathrm{rect}\!\left(\dfrac{x - X_0}{w_X}\right)$
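     Since this joint PDF factors into two uniform densities, X and Y are independent uniforms and the expectations above follow directly. A brief simulation sketch, with example values of $X_0$, $Y_0$, $w_X$, $w_Y$ chosen for illustration (not taken from the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
X0, Y0, wX, wY = 2.0, -1.0, 4.0, 2.0     # assumed example values
x = rng.uniform(X0 - wX / 2, X0 + wX / 2, size=1_000_000)
y = rng.uniform(Y0 - wY / 2, Y0 + wY / 2, size=1_000_000)

print(x.mean())          # ~ X0 = 2
print(y.mean())          # ~ Y0 = -1
print((x * y).mean())    # ~ X0 * Y0 = -2
```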

  12. Joint Probability Density — Conditional probability:
     $F_{X|A}(x) = \dfrac{P[(X \le x) \cap A]}{P[A]}$
     Let $A = \{Y \le y\}$:
     $F_{X|Y \le y}(x) = \dfrac{P[X \le x \cap Y \le y]}{P[Y \le y]} = \dfrac{F_{XY}(x,y)}{F_Y(y)}$
     Let $A = \{y_1 < Y \le y_2\}$:
     $F_{X|y_1 < Y \le y_2}(x) = \dfrac{F_{XY}(x,y_2) - F_{XY}(x,y_1)}{F_Y(y_2) - F_Y(y_1)}$

  13. Joint Probability Density — Let $A = \{Y = y\}$:
     $F_{X|Y=y}(x) = \lim_{\Delta y \to 0} \dfrac{F_{XY}(x, y + \Delta y) - F_{XY}(x,y)}{F_Y(y + \Delta y) - F_Y(y)} = \dfrac{\partial F_{XY}(x,y)/\partial y}{dF_Y(y)/dy} = \dfrac{\partial F_{XY}(x,y)/\partial y}{f_Y(y)}$
     $f_{X|Y=y}(x) = \dfrac{\partial}{\partial x} F_{X|Y=y}(x) = \dfrac{f_{XY}(x,y)}{f_Y(y)}$
     Similarly, $f_{Y|X=x}(y) = \dfrac{f_{XY}(x,y)}{f_X(x)}$

  14. Joint Probability Density — In a simplified notation,
     $f_{X|Y}(x) = \dfrac{f_{XY}(x,y)}{f_Y(y)}$ and $f_{Y|X}(y) = \dfrac{f_{XY}(x,y)}{f_X(x)}$
     Bayes' theorem: $f_{X|Y}(x)\, f_Y(y) = f_{Y|X}(y)\, f_X(x)$
     Marginal PDFs from joint or conditional PDFs:
     $f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\, dy = \int_{-\infty}^{\infty} f_{X|Y}(x)\, f_Y(y)\, dy$
     $f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\, dx = \int_{-\infty}^{\infty} f_{Y|X}(y)\, f_X(x)\, dx$
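     A small numerical sketch of recovering a marginal from a conditional PDF, reusing the triangle density $f_{XY}(x,y) = 6x$ from Slides 8-10 as a test case (there $f_{Y|X}(y) = 1/(1-x)$, $f_X(x) = 6x(1-x)$, and analytically $f_Y(y) = 3(1-y)^2$):

```python
import numpy as np
from scipy.integrate import quad

f_X = lambda x: 6.0 * x * (1.0 - x)          # marginal of X for f_XY = 6x on the triangle
f_Y_given_X = lambda y, x: 1.0 / (1.0 - x)   # Y | X = x is uniform on [0, 1 - x]

def f_Y(y):
    # f_Y(y) = integral of f_Y|X(y) f_X(x) dx over x where y <= 1 - x,
    # i.e. x in [0, 1 - y]; analytically this equals 3 (1 - y)^2.
    return quad(lambda x: f_Y_given_X(y, x) * f_X(x), 0.0, 1.0 - y)[0]

for y in (0.0, 0.25, 0.5):
    print(y, f_Y(y), 3.0 * (1.0 - y) ** 2)
```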

  15. Joint Probability Density — Example: Let a message X with a known PDF be corrupted by additive noise N, also with a known PDF, and received as $Y = X + N$. Then the best estimate that can be made of the message X is the value at the peak of the conditional PDF
     $f_{X|Y}(x) = \dfrac{f_{Y|X}(y)\, f_X(x)}{f_Y(y)}$

  16. Joint Probability Density — Let N have the PDF shown in the figure. Then, for any known value of X, the PDF of Y is that same shape shifted to be centered on X. Therefore, if the PDF of N is $f_N(n)$, the conditional PDF of Y given X is $f_{Y|X}(y) = f_N(y - X)$.

  17. Joint Probability Density — Using Bayes' theorem,
     $f_{X|Y}(x) = \dfrac{f_{Y|X}(y)\, f_X(x)}{f_Y(y)} = \dfrac{f_N(y-x)\, f_X(x)}{f_Y(y)} = \dfrac{f_N(y-x)\, f_X(x)}{\int_{-\infty}^{\infty} f_{Y|X}(y)\, f_X(x)\, dx} = \dfrac{f_N(y-x)\, f_X(x)}{\int_{-\infty}^{\infty} f_N(y-x)\, f_X(x)\, dx}$
     Now the conditional PDF of X given Y can be computed.

  18. Joint Probability Density — To make the example concrete, let
     $f_X(x) = \dfrac{e^{-x/E[X]}}{E[X]}\, u(x), \qquad f_N(n) = \dfrac{1}{\sigma_N\sqrt{2\pi}}\, e^{-n^2/2\sigma_N^2}$
     Then the denominator of the conditional PDF of X given Y is found to be
     $f_Y(y) = \dfrac{\exp\!\left(\dfrac{\sigma_N^2}{2E^2[X]} - \dfrac{y}{E[X]}\right)}{2E[X]} \left[1 + \mathrm{erf}\!\left(\dfrac{y - \sigma_N^2/E[X]}{\sqrt{2}\,\sigma_N}\right)\right]$
     where erf is the error function.
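     A hedged numerical check of this closed form, with assumed example values $E[X] = 1$ and $\sigma_N = 0.5$ (the slide does not specify numbers); the convolution integral for $f_Y(y)$ is evaluated by quadrature and compared with the erf expression:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

EX, sN = 1.0, 0.5    # assumed values for E[X] and sigma_N
f_X = lambda x: np.exp(-x / EX) / EX                              # for x >= 0
f_N = lambda n: np.exp(-n**2 / (2 * sN**2)) / (sN * np.sqrt(2 * np.pi))

def f_Y_numeric(y):
    # f_Y(y) = integral over x >= 0 of f_N(y - x) f_X(x) dx
    return quad(lambda x: f_N(y - x) * f_X(x), 0.0, np.inf)[0]

def f_Y_closed(y):
    # Closed form from Slide 18 (exponential prior convolved with Gaussian noise).
    return (np.exp(sN**2 / (2 * EX**2) - y / EX) / (2 * EX)
            * (1 + erf((y - sN**2 / EX) / (np.sqrt(2) * sN))))

for y in (0.0, 0.5, 2.0):
    print(y, f_Y_numeric(y), f_Y_closed(y))   # the two columns should agree
```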

  19. Joint Probability Density

  20. Independent Random Variables — If two random variables X and Y are independent, then
     $f_{X|Y}(x) = \dfrac{f_{XY}(x,y)}{f_Y(y)} = f_X(x)$ and $f_{Y|X}(y) = \dfrac{f_{XY}(x,y)}{f_X(x)} = f_Y(y)$
     Therefore $f_{XY}(x,y) = f_X(x)\, f_Y(y)$, and their correlation is the product of their expected values:
     $E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{XY}(x,y)\, dx\, dy = \int_{-\infty}^{\infty} x\, f_X(x)\, dx \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = E[X]\, E[Y]$

  21. Independent Random Variables — Covariance:
     $\sigma_{XY} \equiv E\big[(X - E[X])(Y - E[Y])^*\big]$
     $\sigma_{XY} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - E[X])\,(y^* - E[Y^*])\, f_{XY}(x,y)\, dx\, dy$
     $\sigma_{XY} = E[XY^*] - E[X]\, E[Y^*]$
     If X and Y are independent, $\sigma_{XY} = E[X]\, E[Y^*] - E[X]\, E[Y^*] = 0$.

  22. Independent Random Variables — Correlation coefficient:
     $\rho_{XY} = E\!\left[\dfrac{X - E[X]}{\sigma_X} \times \dfrac{(Y - E[Y])^*}{\sigma_Y}\right]$
     $\rho_{XY} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \left(\dfrac{x - E[X]}{\sigma_X}\right)\left(\dfrac{y^* - E[Y^*]}{\sigma_Y}\right) f_{XY}(x,y)\, dx\, dy$
     $\rho_{XY} = \dfrac{E[XY^*] - E[X]\, E[Y^*]}{\sigma_X \sigma_Y} = \dfrac{\sigma_{XY}}{\sigma_X \sigma_Y}$
     If X and Y are independent, $\rho = 0$. If they are perfectly positively correlated, $\rho = +1$; if they are perfectly negatively correlated, $\rho = -1$.
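     A short sketch estimating $\sigma_{XY}$ and $\rho_{XY}$ from samples, using an assumed example in which $Y = X$ plus independent noise, so the correlation coefficient is positive but less than 1:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + 0.5 * rng.normal(size=100_000)   # positively correlated with X by construction

cov_xy = ((x - x.mean()) * (y - y.mean())).mean()   # sample sigma_XY
rho = cov_xy / (x.std() * y.std())                  # rho_XY = sigma_XY / (sigma_X sigma_Y)

print(cov_xy)                        # ~ 1.0
print(rho, np.corrcoef(x, y)[0, 1])  # both ~ 1/sqrt(1.25) ~ 0.894
```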

  23. Independent Random Variables — If two random variables are independent, their covariance is zero. However, if two random variables have a zero covariance, that does not mean they are necessarily independent.
     Independence $\Rightarrow$ Zero Covariance
     Zero Covariance $\nRightarrow$ Independence
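     A classic counterexample (not from the slides) illustrating the second implication: with X uniform on $[-1, 1]$ and $Y = X^2$, the covariance is zero by symmetry even though Y is completely determined by X.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x ** 2                                   # fully determined by X, hence not independent

cov_xy = ((x - x.mean()) * (y - y.mean())).mean()
print(cov_xy)    # ~ 0: zero covariance despite the strong dependence
```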

  24. Independent Random Variables — In the traditional jargon of random variable analysis, two "uncorrelated" random variables have a covariance of zero. Unfortunately, this does not also imply that their correlation $E[XY]$ is zero. If their correlation is zero, they are said to be orthogonal.
     X and Y are "uncorrelated" $\Rightarrow \sigma_{XY} = 0$
     X and Y are "uncorrelated" $\nRightarrow E[XY] = 0$

  25. Independent Random Variables — The variance of a sum of random variables X and Y is
     $\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2 + 2\sigma_{XY} = \sigma_X^2 + \sigma_Y^2 + 2\rho_{XY}\sigma_X\sigma_Y$
     If Z is a linear combination of random variables $X_i$,
     $Z = a_0 + \sum_{i=1}^{N} a_i X_i$
     then $E[Z] = a_0 + \sum_{i=1}^{N} a_i E[X_i]$
     $\sigma_Z^2 = \sum_{i=1}^{N}\sum_{j=1}^{N} a_i a_j \sigma_{X_i X_j} = \sum_{i=1}^{N} a_i^2 \sigma_{X_i}^2 + \sum_{i=1}^{N}\sum_{\substack{j=1 \\ j \ne i}}^{N} a_i a_j \sigma_{X_i X_j}$

  26. Independent Random Variables — If the X's are all independent of each other, the variance of the linear combination is a linear combination of the variances:
     $\sigma_Z^2 = \sum_{i=1}^{N} a_i^2 \sigma_{X_i}^2$
     If Z is simply the sum of the X's, and the X's are all independent of each other, then the variance of the sum is the sum of the variances:
     $\sigma_Z^2 = \sum_{i=1}^{N} \sigma_{X_i}^2$
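     A simulation sketch of the linear-combination result, with assumed coefficients $a_0 = 3$, $a = (0.5, 2, -1)$ and independent unit-variance $X_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
a0, a = 3.0, np.array([0.5, 2.0, -1.0])    # assumed example coefficients
X = rng.normal(size=(1_000_000, 3))        # independent X_i, each with variance 1

Z = a0 + X @ a                             # Z = a0 + sum_i a_i X_i
print(Z.var(), np.sum(a ** 2))             # both ~ 5.25 = 0.25 + 4 + 1
```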
