CS70: Jean Walrand: Lecture 36. Continuous Probability 3

Outline:
1. Review: CDF, PDF
2. Review: Expectation
3. Review: Independence


Independent Continuous Random Variables

Definition: The continuous RVs X and Y are independent if
Pr[X ∈ A, Y ∈ B] = Pr[X ∈ A] Pr[Y ∈ B], ∀ A, B.

Theorem: The continuous RVs X and Y are independent if and only if
f_{X,Y}(x, y) = f_X(x) f_Y(y).

Proof: As in the discrete case.

Definition: The continuous RVs X_1, ..., X_n are mutually independent if
Pr[X_1 ∈ A_1, ..., X_n ∈ A_n] = Pr[X_1 ∈ A_1] ··· Pr[X_n ∈ A_n], ∀ A_1, ..., A_n.

Theorem: The continuous RVs X_1, ..., X_n are mutually independent if and only if
f_X(x_1, ..., x_n) = f_{X_1}(x_1) ··· f_{X_n}(x_n).

Proof: As in the discrete case.
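
As an illustration (not part of the original slides), here is a minimal Monte Carlo sketch, assuming Python with NumPy: for two independent Uniform(0, 1) variables it estimates Pr[X ∈ A, Y ∈ B] and compares it with Pr[X ∈ A] Pr[Y ∈ B] for the example sets A = [0, 0.3] and B = [0.5, 1].

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.random(n)   # X ~ Uniform(0, 1)
    y = rng.random(n)   # Y ~ Uniform(0, 1), drawn independently of X

    in_A = x < 0.3      # event {X in A}, A = [0, 0.3]
    in_B = y > 0.5      # event {Y in B}, B = [0.5, 1]

    joint = np.mean(in_A & in_B)             # estimate of Pr[X in A, Y in B]
    product = np.mean(in_A) * np.mean(in_B)  # estimate of Pr[X in A] Pr[Y in B]
    print(joint, product)                    # both should be close to 0.3 * 0.5 = 0.15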

Meeting at a Restaurant

Two friends go to a restaurant independently, uniformly at random between noon and 1pm. They agree to wait for 10 minutes. What is the probability they meet?

Here, (X, Y) are the times (in hours after noon) when the friends reach the restaurant, so the outcomes fill the unit square. The pairs where |X − Y| < 1/6, i.e., such that they meet, form a band around the diagonal (the shaded region in the slide's figure). The complement consists of two triangles; when you put them together, they form a square with sides 5/6.

Thus, Pr[meet] = 1 − (5/6)² = 11/36.
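
A quick numerical check (a sketch, not from the slides, assuming Python with NumPy): simulate the two arrival times and estimate the probability of meeting; it should come out near 11/36 ≈ 0.306.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.random(n)   # arrival time of friend 1, in hours after noon
    y = rng.random(n)   # arrival time of friend 2, independent of the first

    meet = np.abs(x - y) < 1 / 6   # they meet iff the arrivals are within 10 minutes
    print(meet.mean(), 11 / 36)    # Monte Carlo estimate vs. exact value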

Breaking a Stick

You break a stick at two points chosen independently, uniformly at random. What is the probability you can make a triangle with the three pieces?

Let X, Y be the two break points along the [0, 1] stick, and let A, B, C be the lengths of the three pieces. You can make a triangle if A < B + C, B < A + C, and C < A + B; since A + B + C = 1, this says every piece has length less than 0.5.

If X < Y, this means X < 0.5, Y < X + 0.5, and Y > 0.5. This is the blue triangle in the (X, Y) square. If X > Y, we get the red triangle, by symmetry.

Thus, Pr[make triangle] = 1/4.
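
As a sanity check (again a sketch, not from the slides, assuming Python with NumPy), simulate the two break points; the fraction of samples in which all three pieces are shorter than 1/2 should approach 1/4.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.random(n)
    y = rng.random(n)

    lo = np.minimum(x, y)
    hi = np.maximum(x, y)
    a, b, c = lo, hi - lo, 1 - hi                   # lengths of the three pieces
    triangle = (a < 0.5) & (b < 0.5) & (c < 0.5)    # equivalent to the triangle inequalities
    print(triangle.mean())                          # should be close to 0.25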

Maximum of Two Exponentials

Let X = Expo(λ) and Y = Expo(µ) be independent. Define Z = max{X, Y}. Calculate E[Z].

We compute f_Z, then integrate. One has

Pr[Z < z] = Pr[X < z, Y < z]
          = Pr[X < z] Pr[Y < z]
          = (1 − e^{−λz})(1 − e^{−µz})
          = 1 − e^{−λz} − e^{−µz} + e^{−(λ+µ)z}.

Thus, f_Z(z) = λe^{−λz} + µe^{−µz} − (λ + µ)e^{−(λ+µ)z}, ∀ z > 0.

Hence,

E[Z] = ∫_0^∞ z f_Z(z) dz = 1/λ + 1/µ − 1/(λ + µ).
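
As a check on the formula (a sketch, not from the slides, assuming Python with NumPy), simulate Z = max{X, Y} with, say, λ = 1 and µ = 2; the sample mean should be close to 1/λ + 1/µ − 1/(λ + µ) = 1 + 1/2 − 1/3 ≈ 1.167.

    import numpy as np

    rng = np.random.default_rng(0)
    lam, mu = 1.0, 2.0
    n = 1_000_000
    x = rng.exponential(1 / lam, n)   # X ~ Expo(lam); NumPy takes the scale 1/rate
    y = rng.exponential(1 / mu, n)    # Y ~ Expo(mu), independent of X
    z = np.maximum(x, y)

    exact = 1 / lam + 1 / mu - 1 / (lam + mu)
    print(z.mean(), exact)            # both should be close to 1.1667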

Maximum of n i.i.d. Exponentials

Let X_1, ..., X_n be i.i.d. Expo(1). Define Z = max{X_1, X_2, ..., X_n}. Calculate E[Z].

We use a recursion. The key idea is as follows: Z = min{X_1, ..., X_n} + V, where V is the maximum of n − 1 i.i.d. Expo(1) random variables. This follows from the memoryless property of the exponential.

Let A_n = E[Z]. We see that

A_n = E[min{X_1, ..., X_n}] + A_{n−1} = 1/n + A_{n−1},

because the minimum of independent exponentials is exponential with rate the sum of the rates, here rate n, so its mean is 1/n.

Hence, E[Z] = A_n = 1 + 1/2 + ··· + 1/n = H(n).
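
A small check (a sketch, not from the slides, assuming Python with NumPy): compare the Monte Carlo mean of max{X_1, ..., X_n} for n = 10 with the harmonic number H(10) = 1 + 1/2 + ··· + 1/10 ≈ 2.929.

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 10, 200_000
    samples = rng.exponential(1.0, size=(trials, n))   # each row: n i.i.d. Expo(1) variables
    z = samples.max(axis=1)                            # max of each row

    H_n = sum(1 / k for k in range(1, n + 1))          # harmonic number H(n)
    print(z.mean(), H_n)                               # both should be close to 2.929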

Quantization Noise

In digital video and audio, one represents a continuous value by a finite number of bits. This introduces an error.
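
As a hedged illustration of that error (not from the slides, assuming Python with NumPy), the following sketch quantizes values in [0, 1) to 8 bits and measures the resulting error, which stays within half of one quantization step.

    import numpy as np

    rng = np.random.default_rng(0)
    bits = 8
    step = 1.0 / 2**bits                 # width of one quantization cell on [0, 1)
    x = rng.random(1_000_000)            # continuous values in [0, 1)

    x_quantized = (np.floor(x / step) + 0.5) * step   # snap each value to the middle of its cell
    err = x - x_quantized
    print(np.abs(err).max(), step / 2)   # the maximum error is at most half a step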
