Probability Review for Final Exam
18.05 Spring 2014
Jeremy Orloff and Jonathan Bloom

Unit 1: Probability
1. Sets.
2. Counting.
3. Sample space, outcome, event, probability function.
4. Probability: conditional probability, independence, Bayes' theorem.
5. Discrete random variables: events, pmf, cdf.
6. Bernoulli(p), binomial(n, p), geometric(p), uniform(n).
7. E(X), Var(X), σ.
8. Continuous random variables: pdf, cdf.
9. uniform(a, b), exponential(λ), normal(µ, σ).
10. Transforming random variables.
11. Quantiles.
12. Central limit theorem, law of large numbers, histograms.
13. Joint distributions: pmf, pdf, cdf, covariance and correlation.

Sets and counting
• Sets: ∅, union, intersection, complement, Venn diagrams, products
• Counting: inclusion-exclusion, rule of product, permutations nPk, combinations nCk = (n choose k) = n!/(k!(n−k)!)

Problem 1. Consider the nucleotides A, G, C, T.
(a) How many ways are there to make a sequence of 5 nucleotides?
(b) How many sequences of length 5 are there where no adjacent nucleotides are the same?
(c) How many sequences of length 5 have exactly one A?
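The counting rules above (rule of product, permutations, combinations) can be sanity-checked by brute-force enumeration on small cases. The Python sketch below is an illustrative aside with small made-up numbers, not a solution to the problems; math.perm and math.comb are the standard library's permutation and combination counters.

```python
# Brute-force check of the rule of product on a small case (length-3 sequences),
# plus the built-in permutation/combination counts.
import itertools
import math

nucleotides = ["A", "G", "C", "T"]

# Rule of product: 4^3 sequences of length 3.
all_seqs = list(itertools.product(nucleotides, repeat=3))
print(len(all_seqs), 4 ** 3)                    # 64 64

# No two adjacent letters equal: 4 * 3 * 3 choices by the rule of product.
no_adjacent = [s for s in all_seqs if all(a != b for a, b in zip(s, s[1:]))]
print(len(no_adjacent), 4 * 3 * 3)              # 36 36

# Permutations nPk and combinations nCk.
print(math.perm(10, 3), math.comb(10, 3))       # 720 120
```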

Problem 2.
(a) How many 5 card poker hands are there?
(b) How many ways are there to get a full house (3 of one rank and 2 of another)?
(c) What's the probability of getting a full house?

Problem 3. How many arrangements of the letters in the word probability are there?

Probability
• Sample space, outcome, event, probability function.
  Rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Special case: P(A^c) = 1 − P(A).
  (A and B disjoint ⇒ P(A ∪ B) = P(A) + P(B).)
• Conditional probability, multiplication rule, trees, law of total probability, independence
• Bayes' theorem, base rate fallacy

Problem 4. Let E and F be two events for which one knows that the probability that at least one of them occurs is 3/4. What is the probability that neither E nor F occurs?

Problem 5. Let C and D be two events for which one knows that P(C) = 0.3, P(D) = 0.4, and P(C ∩ D) = 0.2. What is P(C^c ∩ D)?

Problem 6. We toss a coin three times. For this experiment we choose the sample space
Ω = {HHH, THH, HTH, HHT, TTH, THT, HTT, TTT}.
(a) Write down the set of outcomes corresponding to each of the following events:
A = 'we throw tails exactly two times'
B = 'we throw tails at least two times'
C = 'tails did not appear before a head appeared'
D = 'the first throw results in tails'
(b) Write down the set of outcomes corresponding to each of the following events: A^c, A ∪ (C ∩ D), and A ∩ D^c.
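Bayes' theorem and the base rate fallacy listed above can be illustrated numerically: even a fairly accurate test leaves a small posterior probability when the prior (base rate) is small. The numbers in the sketch below are assumptions chosen for illustration, not taken from any of the problems.

```python
# Bayes' theorem via the law of total probability (illustrative numbers).
# D = has the condition, T = positive test.
p_D = 0.01            # prior / base rate (assumed)
p_T_given_D = 0.90    # P(T | D), sensitivity (assumed)
p_T_given_Dc = 0.05   # P(T | D^c), false-positive rate (assumed)

# Law of total probability: P(T) = P(T|D) P(D) + P(T|D^c) P(D^c).
p_T = p_T_given_D * p_D + p_T_given_Dc * (1 - p_D)

# Bayes' theorem: P(D|T) = P(T|D) P(D) / P(T).
p_D_given_T = p_T_given_D * p_D / p_T
print(round(p_D_given_T, 3))   # 0.154 -- small despite the accurate test (base rate fallacy)
```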

Problem 7. Suppose we have 8 teams labeled T1, ..., T8. Suppose they are ordered by placing their names in a hat and drawing the names out one at a time.
(a) How many ways can it happen that all the odd numbered teams are in the odd numbered slots and all the even numbered teams are in the even numbered slots?
(b) What is the probability of this happening?

Problem 8. Suppose you want to divide a 52 card deck into four hands with 13 cards each. What is the probability that each hand has a king?

Problem 9. A fair die is thrown twice. A is the event 'sum of throws equals 4', B is the event 'at least one of the throws is 3'.
(a) Calculate P(A | B).
(b) Are A and B independent?

Problem 10. A Dutch cow is tested for BSE (mad cow disease), using a test with P(T | B) = 0.70 and P(T | B^c) = 0.10. Here T is the event of a positive test and B is the event of having BSE. The risk of BSE is P(B) = 1.3 × 10^−5. Compute P(B | T) and P(B | T^c).

Problem 11. A student takes a multiple-choice exam. Suppose for each question he either knows the answer or gambles and chooses an option at random. Further suppose that if he knows the answer, the probability of a correct answer is 1, and if he gambles this probability is 1/4. To pass, students need to answer at least 60% of the questions correctly. The student has "studied for a minimal pass," i.e., with probability 0.6 he knows the answer to a question. Given that he answers a question correctly, what is the probability that he actually knows the answer?

Problem 12. Suppose you have an urn containing 7 red and 3 blue balls. You draw three balls at random. On each draw, if the ball is red you set it aside and if the ball is blue you put it back in the urn. What is the probability that the third draw is blue? (If you get a blue ball it counts as a draw even though you put it back in the urn.)

Problem 13. (Independence) Suppose that P(A) = 0.4, P(B) = 0.3 and P((A ∪ B)^c) = 0.42. Are A and B independent?

Problem 14. Suppose that events A, B and C are mutually independent with P(A) = 0.3, P(B) = 0.4, P(C) = 0.5. Compute the following (Hint: use a Venn diagram):
(i) P(A ∩ B ∩ C^c)
(ii) P(A ∩ B^c ∩ C)
(iii) P(A^c ∩ B ∩ C)
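Conditional probabilities of the kind appearing in Problems 9–14 can also be estimated by simulation, which is a useful check on hand calculations. The sketch below is a made-up example (sum at least 10, given at least one six), not one of the problems; it estimates P(A | B) as a ratio of counts.

```python
# Monte Carlo estimate of a conditional probability: P(A | B) = P(A and B) / P(B).
import random

random.seed(0)
N = 100_000
count_B = 0          # B: at least one die shows a 6
count_A_and_B = 0    # A and B: ... and the sum is at least 10

for _ in range(N):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    if d1 == 6 or d2 == 6:
        count_B += 1
        if d1 + d2 >= 10:
            count_A_and_B += 1

print(count_A_and_B / count_B)   # exact answer is 5/11 ≈ 0.4545
```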

Problem 15. We choose a month of the year, in such a manner that each month has the same probability. Find out whether the following events are independent:
(a) The event 'outcome is an even numbered month' and the event 'outcome is in the first half of the year.'
(b) The event 'outcome is an even numbered month' and the event 'outcome is a summer month' (i.e., June, July, August).

Problem 16. Suppose A and B are events with 0 < P(A) < 1 and 0 < P(B) < 1.
(a) If A and B are disjoint, can they be independent?
(b) If A and B are independent, can they be disjoint?
(c) If A ⊂ B, can A and B be independent?
(d) If A and B are independent, can A and A ∪ B be independent?

Random variables, expectation and variance
• Discrete random variables: events, pmf, cdf
• Bernoulli(p), binomial(n, p), geometric(p), uniform(n)
• E(X), meaning, algebraic properties, E(h(X))
• Var(X), meaning, algebraic properties
• Continuous random variables: pdf, cdf
• uniform(a, b), exponential(λ), normal(µ, σ)
• Transforming random variables
• Quantiles

Problem 17. Directly from the definitions of expected value and variance, compute E(X) and Var(X) when X has probability mass function given by the following table:

X      −2     −1     0      1      2
p(X)   1/15   2/15   3/15   4/15   5/15

Problem 18. Suppose that X takes values between 0 and 1 and has probability density function 2x. Compute Var(X) and Var(X^2).
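Definition-based computations like the ones asked for in Problems 17 and 18 follow a fixed pattern: multiply each value by its probability and sum, then use Var(X) = E(X^2) − E(X)^2. A minimal sketch with an assumed pmf (deliberately not the table from Problem 17):

```python
# E(X) and Var(X) computed straight from the definitions, for an assumed pmf.
values = [0, 1, 2]
probs  = [0.2, 0.5, 0.3]     # assumed pmf; the probabilities sum to 1

EX   = sum(x * p for x, p in zip(values, probs))        # E(X)   = sum of x * p(x)
EX2  = sum(x**2 * p for x, p in zip(values, probs))     # E(X^2) = sum of x^2 * p(x)
VarX = EX2 - EX**2                                      # Var(X) = E(X^2) - E(X)^2

print(EX, EX2, VarX)   # 1.1 1.7 0.49 (up to floating-point rounding)
```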

Problem 19. The pmf of X is given by P(X = −1) = 1/5, P(X = 0) = 2/5, P(X = 1) = 2/5.
(a) Compute E(X).
(b) Give the pmf of Y = X^2 and use it to compute E(Y).
(c) Instead, compute E(X^2) directly from an extended table.
(d) Determine Var(X).

Problem 20. For a certain random variable X it is known that E(X) = 2 and Var(X) = 3. What is E(X^2)?

Problem 21. Determine the expectation and variance of a Bernoulli(p) random variable.

Problem 22. Suppose 100 people all toss a hat into a box and then proceed to randomly pick out a hat. What is the expected number of people to get their own hat back? Hint: express the number of people who get their own hat as a sum of random variables whose expected value is easy to compute.

pmf, pdf, cdf
Probability Mass Functions, Probability Density Functions and Cumulative Distribution Functions

Problem 23. Suppose that X ∼ Bin(n, 0.5). Find the probability mass function of Y = 2X.

Problem 24. Suppose that X is uniform on [0, 1]. Compute the pdf and cdf of X. If Y = 2X + 5, compute the pdf and cdf of Y.

Problem 25. Now suppose that X has probability density function f_X(x) = λe^(−λx) for x ≥ 0. Compute the cdf, F_X(x). If Y = X^2, compute the pdf and cdf of Y.

Problem 26. Suppose that X is a random variable that takes on values 0, 2 and 3 with probabilities 0.3, 0.1, 0.6 respectively. Let Y = 3(X − 1)^2.
(a) What is the expectation of X?
(b) What is the variance of X?
(c) What is the expectation of Y?
(d) Let F_Y(t) be the cumulative distribution function of Y. What is F_Y(7)?
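Transformation problems like 23–26 all come down to rewriting the event: F_Y(y) = P(g(X) ≤ y), then solving the inequality for X. The sketch below checks this pattern by simulation for an assumed transformation (Y = X^3 with X uniform on [0, 1], which is not one of the problems).

```python
# cdf of a transformed random variable, checked by simulation.
# If X ~ uniform(0, 1) and Y = X**3, then
#   F_Y(y) = P(X**3 <= y) = P(X <= y**(1/3)) = y**(1/3)   for 0 <= y <= 1.
import random

random.seed(1)
samples = [random.random() ** 3 for _ in range(100_000)]

y = 0.2
empirical = sum(s <= y for s in samples) / len(samples)
theoretical = y ** (1 / 3)
print(round(empirical, 3), round(theoretical, 3))   # both close to 0.585
```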

Problem 27. Suppose you roll a fair 6-sided die 25 times (independently), and you get $3 every time you roll a 6. Let X be the total number of dollars you win.
(a) What is the pmf of X?
(b) Find E(X) and Var(X).
(c) Let Y be the total won on another 25 independent rolls. Compute and compare E(X + Y), E(2X), Var(X + Y), Var(2X). Explain briefly why this makes sense.

Problem 28. A continuous random variable X has pdf f(x) = x + ax^2 on [0, 1]. Find a, the cdf, and P(0.5 < X < 1).

Problem 29. (pmf of a sum) Let X and Y be two independent random variables, where X ∼ Ber(p) and Y ∼ Ber(q). When p = q = r, we know that X + Y has a Bin(2, r) distribution. Suppose p = 1/2 and q = 1/4. Determine P(X + Y = k) for k = 0, 1, 2, and conclude that X + Y does not have a binomial distribution.

Problem 30. Let X be a discrete random variable with pmf p given by the following table:

a       −1    0     1     2
p(a)    1/4   1/8   1/8   1/2

(a) Let Y = X^2. Calculate the pmf of Y.
(b) Calculate the value of the cdfs of X and Y at a = 1, a = 3/4, and a = π − 3.

Problem 31. Suppose that the cdf of X is given by:

F(a) = 0     for a < 0
       1/3   for 0 ≤ a < 1/2
       1/2   for 1/2 ≤ a < 3/4
       1     for a ≥ 3/4

Determine the pmf of X.

Problem 32. For each of the following say whether it can be the graph of a cdf. If it can be, say whether the variable is discrete or continuous.
(i), (ii): [graphs of F(x) against x, with values 0.5 and 1 marked on the vertical axis; plots not reproduced]
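For a discrete random variable the cdf is a step function, and the pmf can be read off as the jump sizes; that is the key idea behind Problems 30–32. A minimal sketch with assumed jump points and cdf values (not those of Problem 31):

```python
# Recover a pmf from a discrete (step-function) cdf: P(X = a) is the jump of F at a.
jump_points = [0, 1, 2]          # assumed values X can take
cdf_values  = [0.25, 0.75, 1.0]  # assumed F(a) just after each jump; must end at 1

pmf = {}
prev = 0.0
for a, F in zip(jump_points, cdf_values):
    pmf[a] = F - prev            # jump size at a
    prev = F

print(pmf)   # {0: 0.25, 1: 0.5, 2: 0.25}
```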
