  1. Probability and Statistics for Computer Science. "I have now used each of the terms mean, variance, covariance and standard deviation in two slightly different ways." ---Prof. Forsyth. Credit: Wikipedia. Hongye Liu, Teaching Assistant Prof, CS361, UIUC, 9.17.2020

  2. No pairs in a hand of 5 from 52 cards: the probability of drawing a 5-card hand that has no pairs (no replacement; the order doesn't matter). One approach considers the order: |E| = 52 × 48 × 44 × 40 × 36, |Ω| = 52 × 51 × 50 × 49 × 48.

  3. No pairs in a hand of 5 from 52 cards, counted without order: choose the 5 ranks out of 13, C(13,5), then decide on a suit for each of the 5 cards, 4^5, so |E| = C(13,5) × 4^5 and |Ω| = C(52,5).
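As a check, both counts from the slide can be computed with Python's `math.comb`; this is a small sketch and the variable names are my own:

```python
from math import comb

# |Omega|: all 5-card hands from 52 cards (order doesn't matter)
n_hands = comb(52, 5)

# |E|: hands with no pair -- choose 5 of the 13 ranks, then a suit for each card
n_no_pair = comb(13, 5) * 4**5

p_no_pair = n_no_pair / n_hands
print(f"P(no pair in 5 cards) = {p_no_pair:.4f}")  # 0.5071

# The ordered count from the previous slide gives the same probability
p_ordered = (52 * 48 * 44 * 40 * 36) / (52 * 51 * 50 * 49 * 48)
print(abs(p_no_pair - p_ordered) < 1e-12)  # True
```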

  4. (Handwritten scratch work; largely illegible.)

  5. Picking a card that happens to be a heart: there are 13 hearts to choose from out of the 52 cards, and hearts are 1 of the 4 suits, so p = 13/52 = 1/4.

  6. Last time: Random variable X(ω). Definition, e.g. X = 1 if ω = head, X = 0 if ω = tail. Probability distribution P(X = x): PDF and CDF. Conditional probability distribution P(X | Y), i.e. P({ω s.t. X(ω) = x} | y).

  7. Bet 50 dollars on a coin flip: X(ω) = 50 if ω = head, −50 if ω = tail, so P(X = 50) = 1/2 and P(X = −50) = 1/2. How do we summarize such a random variable?

  8. Objectives: Random variable (R.V.) X: definition. Expected value and its properties, E[f(X)]. Variance & covariance. Markov's inequality.

  9. Expected value (discrete case): The expected value (or expectation) of a random variable X is E[X] = Σ_x x P(X = x). The expected value is a weighted sum over all the values X can take.

  10. Expected value: The expected value of a random variable X is E[X] = Σ_x x P(X = x), a weighted sum over all the values X can take. Example: X = 1 if head, X = 0 if tail, each with probability 1/2, so E[X] = 1 × 1/2 + 0 × 1/2 = 1/2.

  11. Expected value: profit. A company has a project that has probability p of earning 10 million and probability 1 − p of losing 10 million. Let X be the return of the project, with P(X = 10) = p and P(X = −10) = 1 − p. Then E[X] = 10p + (−10)(1 − p) = 20p − 10, which is positive iff p > 1/2.
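The expected-return formula above can be sketched as a one-line Python check (the function name `expected_return` is my own):

```python
def expected_return(p):
    """E[X] for a project earning 10 (prob p) or losing 10 (prob 1-p), in millions."""
    return 10 * p + (-10) * (1 - p)  # simplifies to 20p - 10

# Break-even exactly at p = 1/2; positive expected return only for p > 1/2
print(expected_return(0.5))   # 0.0
print(expected_return(0.75))  # 5.0
```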

  12. Cookies (solve at home): A) Randomly draw 1 item from 4 (some worth $2 each, the others $1 each); what is the expected value? B) Randomly draw twice with replacement; if the two draws are the same, you get the prize. What is the expected value?

  13. Linearity of Expectation: For random variables X and Y and constants k, c: Scaling property: E[kX] = kE[X]. Additivity: E[X + Y] = E[X] + E[Y]. And E[kX + c] = kE[X] + c.

  14. Linearity of Expectation: proof of the additive property E[X + Y] = E[X] + E[Y]. Let S = X + Y; then E[X + Y] = E[S] = Σ_s s P(S = s) = Σ_{x,y} (x + y) P(x, y).

  15. E[X + Y] = E[S] = Σ_s s P(s) = Σ_{x,y} (x + y) P(x, y) = Σ_x Σ_y x P(x, y) + Σ_x Σ_y y P(x, y) = Σ_x x P(x) + Σ_y y P(y) = E[X] + E[Y], using the marginals P(x) = Σ_y P(x, y) and P(y) = Σ_x P(x, y).

  16. Q. What's the value? What is E[E[X] + 1]? A. E[X] + 1. B. 1. C. 0. Since E[X] is a constant, E[E[X] + 1] = E[X] + 1, so the answer is A.

  17. Expected value of a function of X: If f is a function of a random variable X, then Y = f(X) is a random variable too. The expected value of Y = f(X) is E[Y] = E[f(X)] = Σ_x f(x) P(x).

  18. The change-of-variable theorem: If Y = f(X), then E[Y] = Σ_y y P(y). Each y may come from multiple x's (or a single x), with P(y) = Σ_{x: f(x) = y} P(x). So E[Y] = Σ_y y Σ_{x: f(x) = y} P(x) = Σ_x f(x) P(x).
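The two routes to E[f(X)] can be compared numerically. A minimal sketch, assuming a made-up pmf for X and f(x) = x²:

```python
# E[f(X)] two ways, for f(x) = x^2 and an assumed pmf of X
pmf_x = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}
f = lambda x: x * x

# Direct: sum_x f(x) P(x)
e_direct = sum(f(x) * p for x, p in pmf_x.items())

# Via the pmf of Y = f(X): P(y) sums P(x) over all x with f(x) = y
pmf_y = {}
for x, p in pmf_x.items():
    pmf_y[f(x)] = pmf_y.get(f(x), 0.0) + p
e_via_y = sum(y * p for y, p in pmf_y.items())

print(e_direct, e_via_y)  # 1.5 1.5
```

Note that y = 1 collects two x values (−1 and 1), which is exactly the regrouping the proof performs.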

  19. Expected time of a cat: A cat moves with random constant speed V, either 5 miles/hr or 20 miles/hr with equal probability. What's the expected time for it to travel 50 miles? T = 50/V, so E[T] = (50/5) P(V = 5) + (50/20) P(V = 20) = 10 × 1/2 + 2.5 × 1/2 = 6.25 hours.
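The cat example is a direct application of E[f(V)] = Σ_v f(v) P(v); a short sketch:

```python
# Expected travel time: T = 50 / V, with V = 5 or 20 mph with equal probability
pmf_v = {5: 0.5, 20: 0.5}
e_time = sum((50 / v) * p for v, p in pmf_v.items())
print(e_time)  # 6.25

# Note E[50 / V] is NOT 50 / E[V]: the latter would give 50 / 12.5 = 4.0
e_v = sum(v * p for v, p in pmf_v.items())
print(50 / e_v)  # 4.0
```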

  20. Q: Is this statement true? If there exists a constant a such that P(X ≥ a) = 1, then E[X] ≥ a. A. True. B. False. It is true: E[X] = Σ_x x P(x) ≥ Σ_x a P(x) = a Σ_x P(x) = a, since P(x) = 0 for every x < a.

  21. Variance and standard deviation: The variance of a random variable X is var[X] = E[(X − E[X])²], i.e. the expectation of f(X) = (X − E[X])² (compare the sample variance (1/N) Σ_i (x_i − mean)² of a data set {x_i}). The standard deviation of a random variable X is std[X] = √var[X].

  22. Properties of variance: For random variable X and constant k: var[X] ≥ 0, and var[kX] = k² var[X].

  23. A neater expression for variance: The variance of random variable X is defined as var[X] = E[(X − E[X])²]; writing Y = (X − E[X])², this is E[Y] = Σ_y y P(y) = Σ_x (x − E[X])² P(x). It's the same as var[X] = E[X²] − E[X]².

  24. Example: X(ω) = 1 if ω = head, 0 if ω = tail, each with probability 1/2. Then E[X] = 1/2 and var[X] = E[(X − 1/2)²] = (1/2)² × 1/2 + (1/2)² × 1/2 = 1/4.

  25. A neater expression for variance: var[X] = E[(X − E[X])²]

  26. A neater expression for variance: var[X] = E[(X − E[X])²] = E[(X − µ)²], where µ = E[X] is a number. E[(X − µ)²] = E[X² − 2µX + µ²] = E[X²] − 2µE[X] + µ² = E[X²] − 2E[X]² + E[X]² = E[X²] − E[X]².

  27. A neater expression for variance: var[X] = E[(X − E[X])²] = E[(X − µ)²] where µ = E[X]; = E[X² − 2Xµ + µ²] = E[X²] − E[X]².
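The identity var[X] = E[X²] − E[X]² can be verified against the definition on any small pmf; a sketch with an assumed example distribution:

```python
# Verify var[X] = E[X^2] - E[X]^2 on an assumed discrete distribution
pmf = {0: 0.2, 1: 0.5, 3: 0.3}

mu = sum(x * p for x, p in pmf.items())                       # E[X]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())      # E[(X - mu)^2]
var_neat = sum(x * x * p for x, p in pmf.items()) - mu ** 2   # E[X^2] - E[X]^2

print(mu)                                # 1.4
print(abs(var_def - var_neat) < 1e-12)   # True
```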

  28. Variance: the profit example. For the profit example, what is the variance of the return? We know E[X] = 20p − 10, and with P(X = 10) = p, P(X = −10) = 1 − p, E[X²] = 10²p + (−10)²(1 − p). So var[X] = E[X²] − (E[X])² = (10²p + (−10)²(1 − p)) − (20p − 10)² = 100 − (400p² − 400p + 100) = 400p(1 − p).
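The algebra above can be double-checked numerically against the closed form 400p(1 − p); the function name `profit_variance` is my own:

```python
def profit_variance(p):
    """var[X] for the profit example: X = 10 w.p. p, -10 w.p. 1-p."""
    e_x = 20 * p - 10                   # E[X]
    e_x2 = 100 * p + 100 * (1 - p)      # E[X^2], which is always 100
    return e_x2 - e_x ** 2

# Matches 400 p (1 - p) for every p; maximal uncertainty at p = 1/2
for p in (0.0, 0.25, 0.5, 1.0):
    assert abs(profit_variance(p) - 400 * p * (1 - p)) < 1e-9
print(profit_variance(0.5))  # 100.0
```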

  29. Motivation for covariance: Study the relationship between random variables. Note that it's the un-normalized correlation. Applications include the fire control of radar and communicating in the presence of noise.

  30. Covariance: The covariance of random variables X and Y is cov(X, Y) = E[(X − E[X])(Y − E[Y])]. Note that cov(X, X) = E[(X − E[X])²] = var[X].

  31. A neater form for covariance: A neater expression for covariance (similar derivation as for variance): cov(X, Y) = E[XY] − E[X]E[Y].
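Both covariance expressions can be evaluated from a joint pmf and compared; a sketch with an assumed joint distribution over {0, 1} × {0, 1}:

```python
# cov(X, Y) two ways from an assumed joint pmf P(x, y)
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())

cov_def = sum((x - e_x) * (y - e_y) * p for (x, y), p in joint.items())
cov_neat = e_xy - e_x * e_y
print(abs(cov_def - cov_neat) < 1e-12)  # True
```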

  32. Correlation coefficient is normalized covariance: The correlation coefficient is corr(X, Y) = cov(X, Y) / (σ_X σ_Y). When X, Y take on values with equal probability to generate data sets {(x, y)}, the correlation coefficient will be the same as the one seen in Chapter 2.

  33. Correlation coefficient is normalized covariance: The correlation coefficient can also be written as corr(X, Y) = (E[XY] − E[X]E[Y]) / (σ_X σ_Y).

  34. Covariance seen from scatter plots: Zero Covariance, Positive Covariance, Negative Covariance. Credit: Prof. Forsyth

  35. When the correlation coefficient or covariance is zero: The covariance is 0! That is, cov(X, Y) = E[XY] − E[X]E[Y] = 0, i.e. E[XY] = E[X]E[Y]. This is a necessary property of independence of random variables, but not a sufficient one (uncorrelated is not equal to independent).

  36. Variance of the sum of two random variables: var[X + Y] = var[X] + var[Y] + 2 cov(X, Y). (Proof in HW.)
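The sum rule can be verified exactly on a small joint pmf (the pmf below is an assumed example, not from the slides):

```python
# Check var[X + Y] = var[X] + var[Y] + 2 cov(X, Y) on an assumed joint pmf
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

var_x = expect(lambda x, y: x * x) - expect(lambda x, y: x) ** 2
var_y = expect(lambda x, y: y * y) - expect(lambda x, y: y) ** 2
cov_xy = expect(lambda x, y: x * y) - expect(lambda x, y: x) * expect(lambda x, y: y)
var_sum = expect(lambda x, y: (x + y) ** 2) - expect(lambda x, y: x + y) ** 2

print(abs(var_sum - (var_x + var_y + 2 * cov_xy)) < 1e-12)  # True
```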

  37. These are equivalent: (I) cov(X, Y) = 0, i.e. corr(X, Y) = 0. (II) E[XY] = E[X]E[Y]. (III) var[X + Y] = var[X] + var[Y]. In every case X and Y are uncorrelated!

  38. Properties of independence in terms of expectations: If X and Y are independent, then E[XY] = E[X]E[Y].

  39. If X, Y are independent, then cov(X, Y) = 0 and corr(X, Y) = 0, E[XY] = E[X]E[Y], and var[X + Y] = var[X] + var[Y].

  40. Q: What is this expectation? We toss two identical coins A & B independently, three times and four times respectively; for each head we earn $1. Define X as the earning from A and Y as the earning from B. What is E[XY]? A. $2 B. $3 C. $4
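By independence, E[XY] = E[X]E[Y] = 1.5 × 2 = 3, answer B. Brute-force enumeration of all 2⁷ equally likely outcomes confirms it; a sketch:

```python
from itertools import product

# X = number of heads in 3 fair tosses, Y = number of heads in 4 fair tosses
e_xy = 0.0
for a in product([0, 1], repeat=3):      # outcomes of coin A
    for b in product([0, 1], repeat=4):  # outcomes of coin B
        e_xy += sum(a) * sum(b) / 2**7   # each joint outcome has prob 1/128

print(e_xy)  # 3.0
```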

  41. Uncorrelated vs Independent: If two random variables are uncorrelated, does this mean they are independent? Investigate the case where X takes −1, 0, 1 with equal probability and Y = X².

  42. X takes −1, 0, 1 each with probability 1/3, and Y = X². Then E[X] = 0, E[Y] = 2/3, and E[XY] = E[X³] = 0 = E[X]E[Y], so X and Y are uncorrelated. But P(X = 1, Y = 1) = 1/3 ≠ P(X = 1)P(Y = 1) = (1/3)(2/3), so X and Y are not independent.
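The uncorrelated-but-dependent case from slide 41 can be checked directly; a sketch with X uniform on {−1, 0, 1} and Y = X²:

```python
# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated yet clearly dependent
pmf_x = {-1: 1/3, 0: 1/3, 1: 1/3}

e_x = sum(x * p for x, p in pmf_x.items())              # 0
e_y = sum(x * x * p for x, p in pmf_x.items())          # 2/3
e_xy = sum(x * (x * x) * p for x, p in pmf_x.items())   # E[X^3] = 0

cov = e_xy - e_x * e_y
print(cov)  # 0.0 -> uncorrelated

# ...but P(X=1, Y=1) = 1/3 while P(X=1) P(Y=1) = (1/3)(2/3)
p_joint = 1/3
p_prod = (1/3) * (2/3)
print(p_joint != p_prod)  # True -> not independent
```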

  43. Assignments: Finish Chapter 4 of the textbook. Next time: proof of Chebyshev's inequality & the weak law of large numbers; continuous random variables.

  44. Additional References: Charles M. Grinstead and J. Laurie Snell, "Introduction to Probability". Morris H. DeGroot and Mark J. Schervish, "Probability and Statistics".

  45. See you next time!
