CPSC 531: System Modeling and Simulation, Carey Williamson


  1. CPSC 531: System Modeling and Simulation
     Carey Williamson
     Department of Computer Science
     University of Calgary
     Fall 2017

  2. Overview
     ▪ The world a model-builder sees is probabilistic rather than deterministic:
       — Some probability model might well describe the variations
     ▪ Goals:
       — Review the fundamental concepts of probability
       — Understand the difference between discrete and continuous random variables
       — Review the most common probability models

  3. Outline
     ▪ Probability and random variables
       — Random experiment and random variable
       — Probability mass/density functions
       — Expectation, variance, covariance, correlation
     ▪ Probability distributions
       — Discrete probability distributions
       — Continuous probability distributions
       — Empirical probability distributions

  5. Probability
     ▪ Probability is widely used in mathematics, science, engineering, finance, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems
     ▪ Probability is a measure of how likely it is for an event to happen
     ▪ We measure probability with a number between 0 and 1
     ▪ If an event is certain to happen, then the probability of the event is 1
     ▪ If an event is certain not to happen, then the probability of the event is 0

  6. Random Experiment
     ▪ An experiment is called random if the outcome of the experiment is uncertain
     ▪ For a random experiment:
       — The set of all possible outcomes is known before the experiment
       — The outcome of the experiment is not known in advance
     ▪ The sample space Ω of an experiment is the set of all possible outcomes of the experiment
     ▪ Example: consider the random experiment of tossing a coin twice. The sample space is:
       Ω = {(H,H), (H,T), (T,H), (T,T)}
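As a quick illustration (a hypothetical Python sketch, not part of the original slides), the sample space of the two-coin experiment can be enumerated before any toss is made:

```python
from itertools import product

# Hypothetical sketch: enumerate the sample space Ω for tossing
# a coin twice; all possible outcomes are known before the experiment runs.
omega = list(product("HT", repeat=2))

print(omega)       # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]
print(len(omega))  # 4 possible outcomes
```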

  7. Probability of Events
     ▪ An event is a subset of the sample space
       — Example 1: in tossing a coin twice, E = {(H,H)} is the event of having two heads
       — Example 2: in tossing a coin twice, E = {(H,H), (H,T)} is the event of having a head in the first toss
     ▪ The probability of an event E is a numerical measure of the likelihood that event E will occur, expressed as a number between 0 and 1: 0 ≤ ℙ(E) ≤ 1
       — If all possible outcomes are equally likely: ℙ(E) = |E|/|Ω|
       — Probability of the sample space is 1: ℙ(Ω) = 1

  8. Joint Probability
     ▪ Probability that two events A and B occur in a single experiment:
       ℙ(A and B) = ℙ(A ∩ B)
     ▪ Example: drawing a single card at random from a regular deck of cards, probability of getting a red king
       — A: getting a red card
       — B: getting a king
       — ℙ(A ∩ B) = 2/52

  9. Independent Events
     ▪ Two events A and B are independent if the occurrence of one does not affect the occurrence of the other:
       ℙ(A ∩ B) = ℙ(A)ℙ(B)
     ▪ Example: drawing a single card at random from a regular deck of cards, probability of getting a red king
       — A: getting a red card ⇒ ℙ(A) = 26/52
       — B: getting a king ⇒ ℙ(B) = 4/52
       — ℙ(A ∩ B) = 2/52 = ℙ(A)ℙ(B) ⇒ A and B are independent
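The independence condition can be checked by brute-force counting. The following Python sketch (the deck representation and event names are illustrative, not from the slides) enumerates a 52-card deck exactly:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: verify P(A ∩ B) = P(A)·P(B) for the
# red-card/king example by exact counting over a 52-card deck.
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]  # hearts, diamonds are red
deck = list(product(ranks, suits))                 # 52 equally likely outcomes

red = {c for c in deck if c[1] in ("hearts", "diamonds")}  # event A
kings = {c for c in deck if c[0] == "K"}                   # event B

p_a = Fraction(len(red), len(deck))            # 26/52
p_b = Fraction(len(kings), len(deck))          # 4/52
p_ab = Fraction(len(red & kings), len(deck))   # 2/52

print(p_ab == p_a * p_b)  # True: the two events are independent
```

Using `Fraction` keeps every probability exact, so the equality test is not subject to floating-point rounding.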

  10. Mutually Exclusive Events
     ▪ Events A and B are mutually exclusive if the occurrence of one implies the non-occurrence of the other, i.e., A ∩ B = ∅:
       ℙ(A ∩ B) = 0
     ▪ Example: drawing a single card at random from a regular deck of cards, probability of getting a red club
       — A: getting a red card
       — B: getting a club
       — ℙ(A ∩ B) = 0
     ▪ The complementary event of event A is the event [not A], i.e., the event that A does not occur, denoted by Ā
       — Events A and Ā are mutually exclusive
       — ℙ(Ā) = 1 − ℙ(A)

  11. Union Probability
     ▪ Union of events A and B:
       ℙ(A or B) = ℙ(A ∪ B) = ℙ(A) + ℙ(B) − ℙ(A ∩ B)
     ▪ If A and B are mutually exclusive:
       ℙ(A ∪ B) = ℙ(A) + ℙ(B)
     ▪ Example: drawing a single card at random from a regular deck of cards, probability of getting a red card or a king
       — A: getting a red card ⇒ ℙ(A) = 26/52
       — B: getting a king ⇒ ℙ(B) = 4/52
       — ℙ(A ∩ B) = 2/52
       — ℙ(A ∪ B) = ℙ(A) + ℙ(B) − ℙ(A ∩ B) = 26/52 + 4/52 − 2/52 = 28/52
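The inclusion-exclusion identity can likewise be confirmed by counting. In this hypothetical sketch (the `prob` helper is an invented name), the union probability computed directly agrees with the formula:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: check P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
# for the red-card-or-king example.
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))

red = {c for c in deck if c[1] in ("hearts", "diamonds")}  # A
kings = {c for c in deck if c[0] == "K"}                   # B

def prob(event):
    """P(E) = |E| / |Ω| under equally likely outcomes."""
    return Fraction(len(event), len(deck))

direct = prob(red | kings)                              # count the union: 28/52
incl_excl = prob(red) + prob(kings) - prob(red & kings)

print(direct, incl_excl)  # 7/13 7/13 (i.e., 28/52 in lowest terms)
```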

  12. Conditional Probability
     ▪ Probability of event A given the occurrence of some event B:
       ℙ(A|B) = ℙ(A ∩ B) / ℙ(B)
     ▪ If events A and B are independent:
       ℙ(A|B) = ℙ(A)ℙ(B) / ℙ(B) = ℙ(A)
     ▪ Example: drawing a single card at random from a regular deck of cards, probability of getting a king given that the card is red
       — A: getting a red card ⇒ ℙ(A) = 26/52
       — B: getting a king ⇒ ℙ(B) = 4/52
       — ℙ(A ∩ B) = 2/52
       — ℙ(B|A) = ℙ(B ∩ A) / ℙ(A) = (2/52) / (26/52) = 2/26 = ℙ(B)
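Conditioning amounts to restricting the count to the conditioning event. A hypothetical Python sketch of the same example:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: P(B|A) = P(A ∩ B)/P(A), computed by counting
# kings only among the 26 red cards (the conditioning event).
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))

red = {c for c in deck if c[1] in ("hearts", "diamonds")}  # A
kings = {c for c in deck if c[0] == "K"}                   # B

p_b_given_a = Fraction(len(red & kings), len(red))  # 2/26
p_b = Fraction(len(kings), len(deck))               # 4/52

print(p_b_given_a == p_b)  # True: knowing the card is red changes nothing
```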

  13. Random Variable
     ▪ A numerical value can be associated with each outcome of an experiment
     ▪ A random variable X is a function from the sample space Ω to the real line that assigns a real number X(s) to each element s of Ω:
       X: Ω → ℝ
     ▪ A random variable takes on its values with some probability

  14. Random Variable
     ▪ Example: consider the random experiment of tossing a coin twice. The sample space is:
       Ω = {(H,H), (H,T), (T,H), (T,T)}
       Define random variable X as the number of heads in the experiment:
       X((T,T)) = 0, X((H,T)) = 1, X((T,H)) = 1, X((H,H)) = 2
     ▪ Example: rolling a die. Sample space Ω = {1,2,3,4,5,6}. Define random variable X as the number rolled:
       X(j) = j, 1 ≤ j ≤ 6

  15. Random Variable
     ▪ Example: roll two fair dice and observe the outcome
       Sample space Ω = {(i,j) | 1 ≤ i ≤ 6, 1 ≤ j ≤ 6}
       i: integer from the first die
       j: integer from the second die
     ▪ Possible outcomes:
       (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
       (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
       (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
       (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
       (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
       (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

  16. Random Variable
     ▪ Random variable X: sum of the two faces of the dice
       X(i,j) = i + j
       — ℙ(X = 12) = ℙ({(6,6)}) = 1/36
       — ℙ(X = 10) = ℙ({(5,5), (4,6), (6,4)}) = 3/36
     ▪ Random variable Y: value of the first die
       — ℙ(Y = 1) = 1/6
       — ℙ(Y = i) = 1/6, 1 ≤ i ≤ 6
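The probabilities above can be derived mechanically by tabulating X over all 36 outcomes. A hypothetical Python sketch:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: build the distribution of X(i, j) = i + j by
# enumerating the 36 equally likely outcomes of two fair dice.
pmf = {}
for i, j in product(range(1, 7), repeat=2):
    s = i + j
    pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 36)

print(pmf[12])  # 1/36: only (6,6)
print(pmf[10])  # 1/12: (5,5), (4,6), (6,4), i.e., 3/36
```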

  17. Types of Random Variables
     ▪ Discrete
       — Random variables whose set of possible values can be written as a finite or infinite sequence
       — Example: number of requests sent to a web server
     ▪ Continuous
       — Random variables that take a continuum of possible values
       — Example: time between requests sent to a web server

  18. Probability Mass Function (PMF)
     ▪ X: discrete random variable
     ▪ p(x_j): probability mass function of X, where
       p(x_j) = ℙ(X = x_j)
     ▪ Properties:
       — 0 ≤ p(x_j) ≤ 1
       — Σ_j p(x_j) = 1

  19. PMF Examples
     ▪ Number of heads in tossing three coins:
       x_j    : 0    1    2    3
       p(x_j) : 1/8  3/8  3/8  1/8
       Σ_j p(x_j) = 1/8 + 3/8 + 3/8 + 1/8 = 1
     ▪ Number rolled in rolling a fair die:
       p(x_j) = 1/6, for x_j = 1, …, 6
       Σ_j p(x_j) = 1/6 + 1/6 + 1/6 + 1/6 + 1/6 + 1/6 = 1
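The three-coin table can be reproduced by enumeration; this hypothetical Python sketch counts heads over all 8 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: reproduce the three-coin PMF table by
# enumerating the 8 equally likely outcomes and counting heads.
pmf = {k: Fraction(0) for k in range(4)}
for toss in product("HT", repeat=3):
    pmf[toss.count("H")] += Fraction(1, 8)

for k in sorted(pmf):
    print(k, pmf[k])      # each row of the table: 0 1/8, 1 3/8, 2 3/8, 3 1/8

print(sum(pmf.values()))  # 1, as every PMF must sum to
```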

  20. Probability Density Function (PDF)
     ▪ X: continuous random variable
     ▪ f(x): probability density function of X
       f(x) = d/dx F(x), where F(x) is the CDF of X
     ▪ Note:
       — ℙ(X = x) = 0 !!
       — f(x) ≠ ℙ(X = x)
       — ℙ(x ≤ X ≤ x + Δx) ≈ f(x) Δx
     ▪ Properties:
       — ℙ(a ≤ X ≤ b) = ∫_a^b f(x) dx
       — ∫_{−∞}^{+∞} f(x) dx = 1

  21. Probability Density Function
     ▪ Example: the life of an inspection device is given by X, a continuous random variable with PDF:
       f(x) = (1/2) e^{−x/2}, for x ≥ 0
       — X has an exponential distribution with mean 2 years
       — Probability that the device's life is between 2 and 3 years:
         ℙ(2 ≤ X ≤ 3) = ∫_2^3 (1/2) e^{−x/2} dx ≈ 0.14
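The slide's integral can be evaluated in closed form, since f(x) = (1/2)e^{−x/2} has antiderivative F(x) = 1 − e^{−x/2}. A hypothetical Python sketch computes it both ways, with a midpoint Riemann sum as an independent check:

```python
import math

# Hypothetical sketch: P(2 ≤ X ≤ 3) for the exponential density
# f(x) = (1/2) e^{-x/2}, via the antiderivative F(x) = 1 - e^{-x/2}
# and via a midpoint-rule numerical sum.
def cdf(x):
    return 1.0 - math.exp(-x / 2.0)

exact = cdf(3.0) - cdf(2.0)  # e^{-1} - e^{-3/2}

n = 100_000                  # midpoint Riemann sum over [2, 3]
dx = 1.0 / n
approx = sum(0.5 * math.exp(-(2.0 + (k + 0.5) * dx) / 2.0) * dx
             for k in range(n))

print(round(exact, 2))             # 0.14, matching the slide
print(abs(exact - approx) < 1e-9)  # True: the numerical check agrees
```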

  22. Cumulative Distribution Function (CDF)
     ▪ X: discrete or continuous random variable
     ▪ F(x): cumulative probability distribution function of X, or simply, probability distribution function of X
       F(x) = ℙ(X ≤ x)
       — If X is discrete, then F(x) = Σ_{x_j ≤ x} p(x_j)
       — If X is continuous, then F(x) = ∫_{−∞}^x f(t) dt
     ▪ Properties:
       — F(x) is a non-decreasing function, i.e., if a < b, then F(a) ≤ F(b)
       — lim_{x→+∞} F(x) = 1, and lim_{x→−∞} F(x) = 0
     ▪ All probability questions about X can be answered in terms of the CDF, e.g.:
       ℙ(a < X ≤ b) = F(b) − F(a), for all a ≤ b

  23. Cumulative Distribution Function
     ▪ Discrete random variable example: rolling a die, X is the number rolled
       — p(i) = ℙ(X = i) = 1/6, 1 ≤ i ≤ 6
       — F(i) = ℙ(X ≤ i) = p(1) + … + p(i) = i/6
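For a discrete random variable the CDF is a step function obtained by summing the PMF. A hypothetical Python sketch for the fair die (the `F` helper is an invented name):

```python
from fractions import Fraction

# Hypothetical sketch: the CDF of a fair die as a step function,
# F(i) = p(1) + ... + p(i) = i/6.
p = {i: Fraction(1, 6) for i in range(1, 7)}

def F(x):
    """F(x) = P(X <= x): sum the PMF over all values x_j <= x."""
    return sum((prob for i, prob in p.items() if i <= x), Fraction(0))

print(F(3))    # 1/2, i.e., 3/6
print(F(2.5))  # 1/3: between the jumps the CDF stays flat at 2/6
print(F(0))    # 0 below the support; F(6) is 1 at the top
```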
