Lecture 7. Conditional Distributions with Applications Igor Rychlik Chalmers Department of Mathematical Sciences Probability, Statistics and Risk, MVE300 • Chalmers • April 2013
Random variables:
◮ Joint distribution of X, Y.
◮ Dependent random variables:
  ◮ correlated normal variables,
  ◮ expectation of h(X, Y), covariance.
◮ Conditional pdf and cdf.
◮ Law of total probability.
◮ Bayes' formula.
Joint probability distribution function of X, Y:

Example. Experiment: select a person at random in the classroom and measure his or her height x [m] and weight y [kg]. Such an experiment results in two r.v. X, Y.

◮ The joint distribution of X, Y is the function

    F_XY(x, y) = P(X ≤ x and Y ≤ y) = P(X ≤ x, Y ≤ y).¹

◮ X, Y are independent if

    F_XY(x, y) = F_X(x) F_Y(y).   (1)

◮ If X, Y are independent then any statement A about X is independent of any statement B about Y, i.e. P(A ∩ B) = P(A) P(B).

¹ As in the one-dimensional case, the probability of any statement about the random variables X, Y is computable (at least in theory) once F_XY(x, y) is known.
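The factorization (1) can be illustrated numerically. The sketch below (not from the lecture; the uniform samples and evaluation point are chosen only for illustration) draws two independent samples and checks that the empirical joint cdf is close to the product of the empirical marginal cdfs:

```python
import random

# For independent X, Y the joint cdf factorizes:
# F_XY(x, y) = F_X(x) * F_Y(y).  Check empirically with
# independent uniform(0,1) samples.
random.seed(1)
n = 20000
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

def emp_cdf_joint(x, y):
    """Empirical joint cdf: fraction of pairs with X <= x and Y <= y."""
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n

def emp_cdf(data, t):
    """Empirical marginal cdf: fraction of observations <= t."""
    return sum(1 for a in data if a <= t) / n

x0, y0 = 0.3, 0.7
joint = emp_cdf_joint(x0, y0)
product = emp_cdf(xs, x0) * emp_cdf(ys, y0)
print(joint, product)  # both should be close to 0.3 * 0.7 = 0.21
```

For uniforms F_X(0.3) F_Y(0.7) = 0.21 exactly, so both printed values should agree with 0.21 up to sampling error of order 1/√n.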
[Figure: Wave data from the North Sea. Scatter plots of crest period Tc [s] against crest amplitude Ac [m]: the original data (left) and crest periods and amplitudes resampled from the original data (right).]

Are Tc, Ac independent? Very unlikely!

There were n = 199 waves measured. In order to get independent observations of (Tc, Ac) we choose 100 waves at random out of the 199. Next we split the data into four groups defined by the events A = {Tc ≤ 1} and B = {Ac ≤ 2}, and let p = P(A), q = P(B). Data:

          B     B^c
   A     16      2
   A^c   49     33

If Tc and Ac are independent then the probabilities of the four events A∩B, A^c∩B, A∩B^c and A^c∩B^c are determined by the two parameters p, q. The estimates are p* = 0.18, q* = 0.65. Now we can use a χ² test to test the hypothesis of independence, see blackboard.
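The blackboard computation can be sketched as follows: under independence the expected count in cell (i, j) is n times the product of the estimated marginal probabilities, and the χ² statistic sums the squared standardized deviations. This is a sketch of that standard test applied to the table above, not the lecturer's own code:

```python
# Chi-square test of independence for the 2x2 wave-data table,
# with A = {Tc <= 1} (rows) and B = {Ac <= 2} (columns).
obs = [[16, 2],    # row A:   counts of A∩B, A∩B^c
       [49, 33]]   # row A^c: counts of A^c∩B, A^c∩B^c
n = sum(sum(row) for row in obs)                                 # 100 waves
row_tot = [sum(row) for row in obs]                              # [18, 82]
col_tot = [sum(obs[i][j] for i in range(2)) for j in range(2)]   # [65, 35]

p_star = row_tot[0] / n   # estimate of p = P(A) -> 0.18
q_star = col_tot[0] / n   # estimate of q = P(B) -> 0.65

# Expected counts under independence and the chi-square statistic
chi2 = 0.0
for i in range(2):
    for j in range(2):
        e = row_tot[i] * col_tot[j] / n
        chi2 += (obs[i][j] - e) ** 2 / e

print(round(chi2, 2))  # 5.51
```

The statistic (about 5.51) exceeds 3.84, the 5% critical value of the χ² distribution with (2−1)(2−1) = 1 degree of freedom, so the hypothesis that Tc and Ac are independent is rejected.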