Motivation III

Look at intervals instead of specific times.
Probability that you come in between 14:00 and 14:10? 1.
Probability that you come in between 14:00 and 14:05? 1/2.
Probability that you come in between 14:03 and 14:04? 1/10.
Probability that you come in some time interval of 10/k minutes? 1/k.
What happens when you take k → ∞? Probability goes to 0.

What do we do so that this doesn't disappear? If we split our sample space into k pieces, multiply each one by k.

The resulting curve as k → ∞ is the probability density function (PDF) (no, not the file format).
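This limiting process can be sketched numerically. The snippet below is a minimal simulation, assuming for concreteness that the arrival time is uniform on a 10-minute window [0, 10]: the probability of landing in one of k pieces shrinks like 1/k, but multiplying by k keeps the rescaled value from vanishing.

```python
import random

# Assumption for illustration: arrival time uniform on [0, 10] minutes.
random.seed(5)
samples = [random.uniform(0, 10) for _ in range(500_000)]

scaled = {}
for k in (10, 100, 1000):
    width = 10 / k
    # Probability of landing in the first piece [0, 10/k): roughly 1/k.
    p = sum(s < width for s in samples) / len(samples)
    scaled[k] = k * p  # multiply by k, as on the slide: stays near 1
    print(k, p, scaled[k])
```

The raw probabilities shrink toward 0 as k grows, while the rescaled values hover around a constant, which is exactly the curve the slide calls the PDF.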
Formally speaking...

The PDF f_X(t) of a random variable X is defined so that the probability of X taking on a value in [t, t + δ] is δ·f_X(t) for infinitesimally small δ:

f_X(t) = lim_{δ→0} Pr[X ∈ [t, t + δ]] / δ

Another way of looking at it:

Pr[X ∈ [a, b]] = ∫_a^b f_X(t) dt

Total probability is 1: ∫_{−∞}^{∞} f_X(t) dt = 1.

f is nonnegative (negative probability doesn't make much sense).
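The limit definition above can be checked numerically. This is a sketch using an exponential PDF (rate λ = 2, an arbitrary choice for illustration), whose CDF is known in closed form:

```python
import math

# Example PDF/CDF: exponential with rate lam (chosen for illustration).
lam = 2.0
pdf = lambda t: lam * math.exp(-lam * t)
cdf = lambda t: 1 - math.exp(-lam * t)

t = 0.7
for delta in (1e-1, 1e-3, 1e-5):
    # Pr[X in [t, t+delta]] / delta, with the probability taken from the CDF.
    # This quotient should approach pdf(t) as delta -> 0.
    quotient = (cdf(t + delta) - cdf(t)) / delta
    print(delta, quotient, pdf(t))
```

As δ shrinks, the quotient converges to the PDF value, matching the limit in the definition.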
CDF

Cumulative distribution function (CDF): F_X(t) = Pr[X ≤ t].

Or, in terms of the PDF:

F_X(t) = ∫_{−∞}^t f_X(z) dz

Pr[X ∈ (a, b]] = Pr[X ≤ b] − Pr[X ≤ a] = F_X(b) − F_X(a)

F_X(t) ∈ [0, 1]

lim_{t→−∞} F_X(t) = 0 and lim_{t→∞} F_X(t) = 1
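The identity Pr[X ∈ (a, b]] = F_X(b) − F_X(a) can be sanity-checked by integrating the PDF numerically. A sketch, again using the exponential distribution (rate 1) as the example:

```python
import math

# Exponential distribution with rate 1: pdf(t) = e^{-t}, cdf(t) = 1 - e^{-t}.
pdf = lambda t: math.exp(-t)
cdf = lambda t: 1 - math.exp(-t)

a, b, n = 0.5, 2.0, 10_000
h = (b - a) / n
# Trapezoidal rule for the integral of the PDF over [a, b].
integral = h * (pdf(a) / 2 + sum(pdf(a + i * h) for i in range(1, n)) + pdf(b) / 2)
print(integral, cdf(b) - cdf(a))  # the two values should agree closely
```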
In Pictures
Expectation

Discrete case: E[X] = ∑_{t=−∞}^{∞} (Pr[X = t] · t)

Continuous case? Sum → integral.

E[X] = ∫_{−∞}^{∞} t·f_X(t) dt

Expectation of a function?

E[g(X)] = ∫_{−∞}^{∞} g(t)·f_X(t) dt

Linearity of expectation: E[aX + bY] = aE[X] + bE[Y].
Proof: similar to discrete case.

If X, Y, Z are mutually independent, then E[XYZ] = E[X]·E[Y]·E[Z].
Proof: also similar to discrete case.

Exercise: try proving these yourself.
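Both views of E[g(X)] can be compared numerically. A sketch, assuming X uniform on [0, 1] and g(t) = t², so the integral works out to 1/3:

```python
import random

# E[g(X)] for X uniform on [0, 1], g(t) = t^2 (true value: 1/3).
n = 100_000

# Approach 1: approximate the integral of t^2 * f(t) by a Riemann sum
# (f = 1 on [0, 1] for the uniform distribution).
riemann = sum((i / n) ** 2 for i in range(n)) / n

# Approach 2: Monte Carlo average of g over random samples of X.
random.seed(0)
monte_carlo = sum(random.random() ** 2 for _ in range(n)) / n

print(riemann, monte_carlo)  # both near 1/3
```

The agreement between the two estimates illustrates that the sum-to-integral replacement gives the same expectation that averaging samples does.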
Variance

Variance is defined exactly like it is for the discrete case:

Var(X) = E[(X − E[X])²] = E[X²] − E[X]²

The standard properties of variance hold in the continuous case as well:

Var(aX) = a² Var(X)

For independent r.v. X, Y: Var(X + Y) = Var(X) + Var(Y).
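The scaling property Var(aX) = a² Var(X) can be checked by simulation. A sketch with X uniform on [0, 1] (true variance 1/12) and a = 3, both arbitrary choices:

```python
import random

random.seed(1)
n, a = 200_000, 3.0
xs = [random.random() for _ in range(n)]  # X uniform on [0, 1]

def var(samples):
    # Plain sample variance (dividing by n, fine for this sanity check).
    m = sum(samples) / len(samples)
    return sum((s - m) ** 2 for s in samples) / len(samples)

v_x = var(xs)
v_ax = var([a * x for x in xs])
print(v_x, v_ax, a ** 2 * v_x)  # v_ax should match a^2 * v_x
```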
Target shooting

Suppose an archer always hits a circular target with 1 meter radius, and the exact point that he hits is distributed uniformly across the target. What is the distribution of the distance between his arrow and the center (call this r.v. X)?

Probability that the arrow is closer than t to the center?

Pr[X ≤ t] = (area of small circle) / (area of dartboard) = πt² / π = t².
Target shooting II

CDF:

F_X(t) = Pr[X ≤ t] = 0 for t < 0, t² for 0 ≤ t ≤ 1, and 1 for t > 1.

PDF?

f_X(t) = F_X'(t) = 2t for 0 ≤ t ≤ 1, and 0 otherwise.
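The CDF t² can be confirmed empirically: sample points uniformly in the unit disk (here by rejection from the enclosing square, one standard way to do it) and compare the empirical distribution of the distance to the center against t²:

```python
import random

random.seed(2)
dists = []
while len(dists) < 100_000:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:                 # keep only points inside the disk
        dists.append((x * x + y * y) ** 0.5)

emp = {}
for t in (0.3, 0.5, 0.9):
    emp[t] = sum(d <= t for d in dists) / len(dists)
    print(t, emp[t], t * t)                # empirical value should be near t^2
```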
Target shooting III

Another way of attacking the same problem: what's the probability of hitting some ring with inner radius t and outer radius t + δ, for small δ?

Area of circle: π

Area of ring: π((t + δ)² − t²) = π(t² + 2tδ + δ² − t²) = π(2tδ + δ²) ≈ π·2tδ

Probability of hitting the ring: 2tδ.

PDF for t ≤ 1: 2t.
Shifting & Scaling

Let f_X(x) be the pdf of X and Y = a + bX, where b > 0. Then

Pr[Y ∈ (y, y + δ)] = Pr[a + bX ∈ (y, y + δ)]
  = Pr[X ∈ ((y − a)/b, (y + δ − a)/b)]
  = Pr[X ∈ ((y − a)/b, (y − a)/b + δ/b)]
  ≈ f_X((y − a)/b) · δ/b.

Left-hand side is f_Y(y)·δ. Hence,

f_Y(y) = (1/b) · f_X((y − a)/b).
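The shift-and-scale formula can be tested by simulation. A sketch, assuming X exponential with rate 1 (so f_X(x) = e^{−x}) and a = 2, b = 3, all chosen just for illustration:

```python
import math
import random

# Y = a + bX with X ~ exponential(1); check f_Y(y) = (1/b) f_X((y - a)/b).
random.seed(3)
a, b, n = 2.0, 3.0, 400_000
ys = [a + b * random.expovariate(1.0) for _ in range(n)]

y, delta = 5.0, 0.1
# Empirical density: fraction of samples in (y, y + delta), divided by delta.
empirical = sum(y < s <= y + delta for s in ys) / (n * delta)
formula = (1 / b) * math.exp(-(y - a) / b)
print(empirical, formula)  # the two should be close
```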
Continuous Distributions
Uniform Distribution: CDF and PDF

PDF is constant over some interval [a, b], zero outside the interval.

What's the value of the constant k in the interval?

∫_{−∞}^{∞} k dt = ∫_a^b k dt = k(b − a) = 1

so the PDF is 1/(b − a) in [a, b] and 0 otherwise.

CDF?

F_X(t) = ∫_{−∞}^t f_X(z) dz = 0 for t < a, (t − a)/(b − a) for a ≤ t ≤ b, and 1 for t > b.
Uniform Distribution: CDF and PDF, Graphically

f_X(t) = { 0 for t < a,  1/(b − a) for a < t < b,  0 otherwise }

F_X(t) = { 0 for t < a,  (t − a)/(b − a) for a < t < b,  1 for b < t }
Uniform Distribution: Expectation and Variance

Expectation?

E[X] = ∫_a^b t · 1/(b − a) dt = [t²/2]_a^b · 1/(b − a) = (b² − a²)/(2(b − a)) = (b + a)/2

Variance?

Var[X] = E[X²] − E[X]²
  = ∫_a^b t² · 1/(b − a) dt − ((b + a)/2)²
  = [t³/3]_a^b · 1/(b − a) − ((b + a)/2)²
  = (b³ − a³)/(3(b − a)) − ((b + a)/2)²
  = (b − a)²/12
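The closed forms (b + a)/2 and (b − a)²/12 can be verified by Monte Carlo. A sketch with a = 1, b = 4 (an arbitrary example interval):

```python
import random

# X uniform on [a, b]; expected mean (a + b)/2, expected variance (b - a)^2 / 12.
random.seed(4)
a, b, n = 1.0, 4.0, 300_000
xs = [random.uniform(a, b) for _ in range(n)]

mean = sum(xs) / n
variance = sum((x - mean) ** 2 for x in xs) / n
print(mean, (a + b) / 2)             # near 2.5
print(variance, (b - a) ** 2 / 12)   # near 0.75
```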