1. joint distributions

Often, several random variables are simultaneously observed:
- X = height and Y = weight
- X = cholesterol and Y = blood pressure
- X1, X2, X3 = work loads on servers A, B, C

Joint probability mass function:
  f_XY(x, y) = P({X = x} & {Y = y})

Joint cumulative distribution function:
  F_XY(x, y) = P({X ≤ x} & {Y ≤ y})
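A minimal sketch in Python of these two definitions (the toy two-coin table is an assumed example, not from the slides): store the joint PMF keyed by value pairs, and evaluate the CDF by summing the PMF over the lower-left region.

```python
from fractions import Fraction

# Assumed toy joint PMF: X, Y = two independent fair coin flips, encoded 0/1.
f_XY = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

def F_XY(x, y):
    """Joint CDF: F_XY(x, y) = P({X <= x} & {Y <= y})."""
    return sum(p for (a, b), p in f_XY.items() if a <= x and b <= y)

assert sum(f_XY.values()) == 1   # any joint PMF sums to 1
print(F_XY(0, 1))                # P({X <= 0} & {Y <= 1}) = 1/2
```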

2. examples

Two joint PMFs:

  W\Z    1     2     3          X\Y    1     2     3
   1    2/24  2/24  2/24         1    4/24  1/24  1/24
   2    2/24  2/24  2/24         2     0    3/24  3/24
   3    2/24  2/24  2/24         3     0    4/24  2/24
   4    2/24  2/24  2/24         4    4/24   0    2/24

P(W = Z) = 3 · 2/24 = 6/24
P(X = Y) = (4 + 3 + 2)/24 = 9/24

Arbitrary relationships among variables can be examined this way.
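Both probabilities are sums of a joint PMF over an event. A sketch of the same computation in Python, with the tables above entered as dictionaries (an encoding I am assuming; pairs omitted from f_XY have probability 0):

```python
from fractions import Fraction

f_WZ = {(w, z): Fraction(2, 24) for w in range(1, 5) for z in range(1, 4)}
f_XY = {(1, 1): Fraction(4, 24), (1, 2): Fraction(1, 24), (1, 3): Fraction(1, 24),
        (2, 2): Fraction(3, 24), (2, 3): Fraction(3, 24),
        (3, 2): Fraction(4, 24), (3, 3): Fraction(2, 24),
        (4, 1): Fraction(4, 24), (4, 3): Fraction(2, 24)}

# Sum the joint PMF over the event {W = Z} (resp. {X = Y}).
print(sum(p for (w, z), p in f_WZ.items() if w == z))   # 1/4  (= 6/24)
print(sum(p for (x, y), p in f_XY.items() if x == y))   # 3/8  (= 9/24)
```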

3. marginal distributions

The same two joint PMFs, with row and column sums appended:

  W\Z     1     2     3   | f_W(w)      X\Y     1     2     3   | f_X(x)
   1     2/24  2/24  2/24 |  6/24        1     4/24  1/24  1/24 |  6/24
   2     2/24  2/24  2/24 |  6/24        2      0    3/24  3/24 |  6/24
   3     2/24  2/24  2/24 |  6/24        3      0    4/24  2/24 |  6/24
   4     2/24  2/24  2/24 |  6/24        4     4/24   0    2/24 |  6/24
  f_Z(z) 8/24  8/24  8/24             f_Y(y)  8/24  8/24  8/24

Question: Are W & Z independent? Are X & Y independent?

4. marginal distributions

Marginal PMF of one r.v.: sum the joint PMF over the other (law of total probability):

  f_X(x) = Σ_y f_XY(x, y)        f_Y(y) = Σ_x f_XY(x, y)

(Same two tables, with their marginals, as on slide 3.)

Question: Are W & Z independent? Are X & Y independent?
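A sketch of these marginalization sums applied to the X, Y table (dictionary encoding as assumed earlier; omitted pairs are probability 0):

```python
from collections import defaultdict
from fractions import Fraction

f_XY = {(1, 1): Fraction(4, 24), (1, 2): Fraction(1, 24), (1, 3): Fraction(1, 24),
        (2, 2): Fraction(3, 24), (2, 3): Fraction(3, 24),
        (3, 2): Fraction(4, 24), (3, 3): Fraction(2, 24),
        (4, 1): Fraction(4, 24), (4, 3): Fraction(2, 24)}

# f_X(x) = sum over y of f_XY(x, y); f_Y(y) = sum over x of f_XY(x, y).
f_X, f_Y = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in f_XY.items():
    f_X[x] += p
    f_Y[y] += p

print(dict(f_X))   # every value is Fraction(1, 4), i.e. 6/24 each
print(dict(f_Y))   # every value is Fraction(1, 3), i.e. 8/24 each
```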

5. joint, marginals and independence

Repeating the definition: two random variables X and Y are independent if the events {X = x} and {Y = y} are independent (for any fixed x, y), i.e.

  ∀x, y:  P({X = x} & {Y = y}) = P({X = x}) · P({Y = y})

Equivalent definition: two random variables X and Y are independent if their joint probability mass function is the product of their marginal distributions, i.e.

  ∀x, y:  f_XY(x, y) = f_X(x) · f_Y(y)

Exercise: show that the same is true of their cumulative distribution functions.
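A sketch that checks the equivalent definition cell by cell for both tables, answering the question left open on the previous slides (the helper name is mine, not from the slides):

```python
from fractions import Fraction

def independent(joint, xs, ys):
    """True iff f(x, y) == f1(x) * f2(y) for every pair (missing keys are 0)."""
    f1 = {x: sum(joint.get((x, y), Fraction(0)) for y in ys) for x in xs}
    f2 = {y: sum(joint.get((x, y), Fraction(0)) for x in xs) for y in ys}
    return all(joint.get((x, y), Fraction(0)) == f1[x] * f2[y]
               for x in xs for y in ys)

f_WZ = {(w, z): Fraction(2, 24) for w in range(1, 5) for z in range(1, 4)}
f_XY = {(1, 1): Fraction(4, 24), (1, 2): Fraction(1, 24), (1, 3): Fraction(1, 24),
        (2, 2): Fraction(3, 24), (2, 3): Fraction(3, 24),
        (3, 2): Fraction(4, 24), (3, 3): Fraction(2, 24),
        (4, 1): Fraction(4, 24), (4, 3): Fraction(2, 24)}

print(independent(f_WZ, range(1, 5), range(1, 4)))  # True:  2/24 == 6/24 * 8/24
print(independent(f_XY, range(1, 5), range(1, 4)))  # False: e.g. 4/24 != 6/24 * 8/24
```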

6. expectation of a function of 2 r.v.'s

A function g(X, Y) defines a new random variable. Its expectation is:

  E[g(X, Y)] = Σ_x Σ_y g(x, y) · f_XY(x, y)    ☜ like slide 17

Expectation is linear. E.g., if g is linear:

  E[g(X, Y)] = E[aX + bY + c] = a·E[X] + b·E[Y] + c

Example, with g(X, Y) = 2X - Y; each cell shows g(x, y) · f_XY(x, y):

  X\Y      1          2          3
   1    1 · 4/24   0 · 1/24  -1 · 1/24
   2    3 · 0/24   2 · 3/24   1 · 3/24
   3    5 · 0/24   4 · 4/24   3 · 2/24
   4    7 · 4/24   6 · 0/24   5 · 2/24

Summing the cells: E[g(X, Y)] = 72/24 = 3

By linearity (recall both marginals are uniform):
  E[g(X, Y)] = 2·E[X] - E[Y] = 2·2.5 - 2 = 3
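Both routes to E[g(X, Y)] = 3, sketched against the same assumed encoding of the X, Y table:

```python
from fractions import Fraction

f_XY = {(1, 1): Fraction(4, 24), (1, 2): Fraction(1, 24), (1, 3): Fraction(1, 24),
        (2, 2): Fraction(3, 24), (2, 3): Fraction(3, 24),
        (3, 2): Fraction(4, 24), (3, 3): Fraction(2, 24),
        (4, 1): Fraction(4, 24), (4, 3): Fraction(2, 24)}

g = lambda x, y: 2 * x - y

# Direct definition: sum g(x, y) * f_XY(x, y) over all pairs.
E_g = sum(g(x, y) * p for (x, y), p in f_XY.items())

# Linearity: E[2X - Y] = 2 E[X] - E[Y], computed from the marginals.
E_X = sum(x * p for (x, _), p in f_XY.items())   # 5/2
E_Y = sum(y * p for (_, y), p in f_XY.items())   # 2
print(E_g, 2 * E_X - E_Y)                        # 3 3
```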

7. sampling from a joint distribution

[Figure: scatter plots of samples from two joint distributions. Top row: independent variables. Bottom row: dependent variables (a simple linear dependence).]
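A sketch of the kind of sampling behind those plots, under assumed distributions (standard normals; the slide does not specify): draw pairs independently, then with a simple linear dependence, and compare sample correlations.

```python
import random

random.seed(0)
n = 1000

# Top-row style: X and Y drawn independently.
indep = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

# Bottom-row style: Y = X + small noise (a simple linear dependence).
dep = [(x, x + random.gauss(0, 0.3))
       for x in (random.gauss(0, 1) for _ in range(n))]

def sample_corr(pairs):
    """Pearson sample correlation of a list of (x, y) pairs."""
    mx = sum(x for x, _ in pairs) / len(pairs)
    my = sum(y for _, y in pairs) / len(pairs)
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    syy = sum((y - my) ** 2 for _, y in pairs)
    return sxy / (sxx * syy) ** 0.5

print(sample_corr(indep))  # near 0
print(sample_corr(dep))    # near 1
```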

8. another example

A nonlinear dependence: flip n fair coins, and let

  X = # heads seen in the first n/2 + k flips
  Y = # heads seen in the last n/2 + k flips

[Figure: scatter plots of (X, Y) for n = 1000 with k = 0, 200, 400, and a plot of (X - E[X])·(Y - E[Y]) against the total number of heads.]
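A simulation sketch of this example (number of trials and seed are my choice). Since X and Y share the middle 2k flips, Cov(X, Y) equals the variance of that shared head count, 2k · 1/4: 0 for k = 0, 100 for k = 200, 200 for k = 400.

```python
import random

random.seed(1)

def sample_XY(n=1000, k=200):
    """Flip n fair coins; X = heads in the first n/2 + k, Y = heads in the last n/2 + k."""
    flips = [random.randint(0, 1) for _ in range(n)]
    return sum(flips[: n // 2 + k]), sum(flips[n // 2 - k :])

def sample_cov(pairs):
    mx = sum(x for x, _ in pairs) / len(pairs)
    my = sum(y for _, y in pairs) / len(pairs)
    return sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)

for k in (0, 200, 400):
    pairs = [sample_XY(k=k) for _ in range(5000)]
    print(k, sample_cov(pairs))   # roughly 0, 100, 200
```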
