  1. Intelligente Systeme WS 18/19 Dr. Benjamin Guthier Professur für Bildverarbeitung Intelligente Systeme – Dr. Benjamin Guthier

  2. 2. PROBABILISTIC MODELS

  3. Need for Probabilistic Reasoning
  • Human reasoning is based on uncertain evidence
  • In classical logic, conclusions are true or false
    – Does not account for uncertainty
    – Cannot handle conflicting evidence
  • Probability theory can model uncertainty in the real world

  4. PROBABILITY THEORY
  • Recommended reading:
    – S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach. Chapter 13 “Quantifying Uncertainty”

  5. Random Variable
  • A random variable takes on values with a certain probability
  • Boolean random variables: Cavity (do I have a cavity?)
  • Discrete random variables: Weather – one of <sunny, rainy, cloudy, snow>
    – Values must be exhaustive and mutually exclusive
  • Construct propositions by assigning a value to a variable
    – Weather = sunny
    – Cavity = false
    – Complex propositions: Weather = sunny ∨ Cavity = false

  6. Events
  • Event: complete specification of the state of the world about which the agent is uncertain
  • If the world consists of only two Boolean variables Cavity and Toothache, then there are 4 distinct events:
    – Cavity = true ∧ Toothache = false (short: cavity ∧ ¬toothache)
    – Cavity = false ∧ Toothache = false
    – Cavity = true ∧ Toothache = true
    – Cavity = false ∧ Toothache = true
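As an illustrative sketch (not part of the slides), the distinct events for a set of Boolean variables can be enumerated programmatically; with n variables there are 2^n complete assignments:

```python
import itertools

# Illustrative sketch: enumerate all distinct events (complete
# assignments) for the two Boolean variables Cavity and Toothache.
variables = ["Cavity", "Toothache"]
events = [dict(zip(variables, values))
          for values in itertools.product([True, False], repeat=len(variables))]

for event in events:
    print(event)

print(len(events))  # 4 distinct events, i.e. 2^2
```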

  7. Prior Probability
  • Prior probability of an event, prior to arrival of any new evidence:
    – 𝑃(Cavity = true) = 0.2
    – 𝑃(Weather = sunny) = 0.72
  • Probability distribution: gives values for all possible assignments:
    – 𝑷(Weather) = <0.72, 0.1, 0.08, 0.1> (for sunny, rainy, cloudy, snow)
    – Must sum to 1
  • Joint probability distribution: gives the probabilities of all combinations of events
    – 𝑷(Weather, Cavity) is a 2 × 4 matrix of values:

                   sunny    rainy    cloudy   snow
      Cavity=true  0.144    0.02     0.016    0.02
      Cavity=false 0.576    0.08     0.064    0.08
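A minimal sketch (the dict layout is my own choice, the numbers are from the slide's joint table): prior probabilities fall out of the joint distribution by summing over the other variable.

```python
# Joint probabilities P(Weather = w, Cavity = c), values from the slide.
weathers = ["sunny", "rainy", "cloudy", "snow"]
joint = {
    ("sunny", True): 0.144, ("rainy", True): 0.02,
    ("cloudy", True): 0.016, ("snow", True): 0.02,
    ("sunny", False): 0.576, ("rainy", False): 0.08,
    ("cloudy", False): 0.064, ("snow", False): 0.08,
}

# Marginalize: P(Cavity = true) = sum over all weather values.
p_cavity = sum(joint[(w, True)] for w in weathers)

# Marginalize: P(Weather = sunny) = sum over both cavity values.
p_sunny = joint[("sunny", True)] + joint[("sunny", False)]

print(round(p_cavity, 3))  # 0.2
print(round(p_sunny, 3))   # 0.72
```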

  8. A Note on Notation
  • Logical operators: ∧ (and), ∨ (or), ¬ (not)
  • Random variables uppercase: 𝑋, Weather, Cavity
  • Values of random variables lowercase: 𝑥, sunny, cavity, true
  • Lists of values in boldface or with < ⋯ >
    – List of probabilities: 𝑷(Weather) = <0.72, 0.1, 0.08, 0.1>
    – List of variables: 𝑿 = <𝑋₁, 𝑋₂, …, 𝑋ₙ>
    – Corresponding list of values: 𝒙 = <true, sunny, …, cavity>
    – List of equations: 𝑷(𝑋, 𝑌) = 𝑷(𝑋)𝑷(𝑌) (one for each combination of values of 𝑋 and 𝑌)

  9. Conditional Probability
  • Also called posterior probability
    – Probability after more information becomes available
    – Or: “Probability conditioned on a prior event”
  • 𝑃(Cavity = true | Toothache = true) = 0.8
    – Probability of having a cavity, given that one has a toothache
    – In this case, higher than the prior probability 𝑃(Cavity = true) = 0.2
  • Posterior probability is often the answer in probabilistic reasoning
    – Chance of rain, given a cloudy sky and high pressure
    – Prob. of someone voting for a party, given age, gender, location

  10. Conditional Probability (2)
  • Definition of conditional probability: 𝑃(𝐴|𝐵) = 𝑃(𝐴 ∧ 𝐵) / 𝑃(𝐵)
  • Or as an alternative formulation (product rule): 𝑃(𝐴 ∧ 𝐵) = 𝑃(𝐴|𝐵) 𝑃(𝐵)
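A hedged numeric check of the two formulations, using the cavity/toothache values that appear later in this chapter (𝑃(cavity ∧ toothache) = 0.12, 𝑃(toothache) = 0.2):

```python
p_a_and_b = 0.12  # P(cavity ∧ toothache), from the joint table later on
p_b = 0.2         # P(toothache)

# Definition: P(A|B) = P(A ∧ B) / P(B)
p_a_given_b = p_a_and_b / p_b

# Product rule: P(A|B) * P(B) recovers the joint probability P(A ∧ B)
assert abs(p_a_given_b * p_b - p_a_and_b) < 1e-12

print(round(p_a_given_b, 2))  # 0.6
```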

  11. Conditional Probability (Example)
  • A mother comes home and finds a broken flower vase (b). She suspects that her child did it (c).
  • The (incredibly smart) child argues: “But mom, the probability that on any given day, a child breaks a flower vase is only 𝑃(c, b) = 1:3650, i.e., it only happens like once every ten years! It’s pretty unlikely that it was me.”
  • The mother retorts: “Vases don’t just break. The probability of a vase breaking on any given day is only 𝑃(b) = 0.034%.”
  • Both calculate the probability of it being the child’s fault, given that the vase is already broken: 𝑃(c|b) = 𝑃(c, b) / 𝑃(b) = (1:3650) / 0.00034 ≈ 80%
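The arithmetic in the vase example can be re-checked directly (variable names are mine; the probabilities are the ones quoted on the slide):

```python
# P(c, b): a child breaks a vase on any given day, "1 in 3650"
p_c_and_b = 1 / 3650
# P(b): any vase breaks on any given day, 0.034%
p_b = 0.00034

# Posterior: P(c|b) = P(c, b) / P(b)
p_c_given_b = p_c_and_b / p_b
print(round(p_c_given_b, 2))  # 0.81, i.e. about 80%
```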

  12. Conditional Probability (Example)
  • (figure: Venn-style illustration) 𝑃(true) = 1; 𝑃(c, b) is very small; 𝑃(b) is also small ⇒ 𝑃(c|b) is large

  13. Numerical Random Variables
  • Sometimes we need to model numerical values
  • If the values are discrete, nothing changes
  • Number of leaves on a clover: 𝐶 ∈ {3, 4, 5}
    – 𝑷(𝐶) = <0.999, 0.0008, 0.0002>
  • Fair die: 𝐷 ∈ {1, 2, 3, 4, 5, 6}
    – 𝑷(𝐷) = <1/6, 1/6, 1/6, 1/6, 1/6, 1/6>

  14. Expected Value
  • Expected value (mean, average) of a numerical random variable: 𝐸(𝑋) = Σᵢ 𝑥ᵢ 𝑃(𝑋 = 𝑥ᵢ)
    – 𝑥ᵢ are the values the variable can take on
  • Clover example:
    – 𝐸(𝐶) = 3 ∗ 0.999 + 4 ∗ 0.0008 + 5 ∗ 0.0002 ≈ 3
  • Fair die:
    – 𝐸(𝐷) = Σᵢ 𝑥ᵢ ∗ 1/6 = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 3.5
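A small sketch of the expectation formula applied to both slide examples (the helper name `expected_value` is my own):

```python
def expected_value(values, probs):
    """E(X) = sum_i x_i * P(X = x_i) for a discrete numerical variable."""
    return sum(x * p for x, p in zip(values, probs))

# Clover leaves: C in {3, 4, 5}
e_clover = expected_value([3, 4, 5], [0.999, 0.0008, 0.0002])
# Fair die: D in {1, ..., 6}
e_die = expected_value(range(1, 7), [1 / 6] * 6)

print(round(e_clover, 3))  # 3.001, i.e. approximately 3
print(round(e_die, 2))     # 3.5
```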

  15. Continuous Random Variables
  • Continuous random variables can take on an infinite number of different values
    – E.g., temperature in a room 𝑋 ∈ [18, 26] can be any value in the interval
  • Give the probability of the variable being in a sub-interval
    – E.g., probability of the temperature being between 18 and 26 is 1.0
    – For any 4 degree interval, the probability is 0.5
    – Probability that the temperature is exactly 21.384 degrees is 0

  16. Probability Distribution Function
  • Probabilities of continuous variables are defined by a probability distribution function (pdf) 𝑝(𝑥)
    – E.g., 𝑝(𝑥) = 1/8 if 18 ≤ 𝑥 ≤ 26, and 𝑝(𝑥) = 0 otherwise
  • Calculate probabilities by integrating over intervals: 𝐹(𝑎 ≤ 𝑋 ≤ 𝑏) = ∫ₐᵇ 𝑝(𝑥) d𝑥
    – E.g., for a 4-degree interval: ∫₂₂²⁶ 1/8 d𝑥 = 4 ∗ 1/8 = 1/2
  • (figure: uniform pdf of height 1/8 over the interval [18, 26])
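The uniform-pdf example can be checked numerically. This is an illustrative sketch: `integrate` is a hand-rolled midpoint rule, not a library function, and the pdf is the slide's 𝑝(𝑥) = 1/8 on [18, 26].

```python
def p(x):
    """Uniform pdf from the slide: 1/8 on [18, 26], 0 elsewhere."""
    return 1 / 8 if 18 <= x <= 26 else 0.0

def integrate(f, a, b, n=10000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

print(round(integrate(p, 18, 26), 4))  # 1.0 (the whole interval)
print(round(integrate(p, 22, 26), 4))  # 0.5 (any 4-degree sub-interval)
```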

  17. Gaussian Distribution
  • Real variables are often centered around the average, and values are less likely the further from the average they are
    – Example: height of an adult male. 177cm is average. 169cm and 185cm are still fairly common. 200cm is very rare.
  • Gaussian distribution: 𝑝(𝑥) = 1/(𝜎√(2𝜋)) ∗ 𝑒^(−(𝑥−𝜇)²/(2𝜎²))
    – 𝜇 is the mean
    – 𝜎 is the standard deviation (from the mean)
  • (figure: Gaussian curve with 𝜇 = 177, 𝜎 = 8)
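A minimal sketch of the Gaussian density, evaluated with the slide's height parameters 𝜇 = 177 and 𝜎 = 8:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian density: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Density is highest at the mean and drops off toward the tails.
print(round(gaussian_pdf(177, 177, 8), 4))  # 0.0499 (the peak)
print(round(gaussian_pdf(185, 177, 8), 4))  # one sigma away, still fairly likely
print(round(gaussian_pdf(200, 177, 8), 4))  # far from the mean, very unlikely
```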

  18. PROBABILISTIC INFERENCE

  19. Inference Using Joint Probability
  • Assume we have three random variables
    – Toothache: my tooth is hurting
    – Catch: the dentist’s hook catches on to something (possibly a cavity)
    – Cavity: my tooth has a cavity
  • Also assume the joint probability distribution is given:

                  toothache            ¬toothache
                  catch    ¬catch      catch    ¬catch
      cavity      0.108    0.012       0.072    0.008
      ¬cavity     0.016    0.064       0.144    0.576

  20. Inference Using Joint Probability (2)
  • Calculate probabilities by summing up elements of the joint table (slide 19)
  • Prior probability of having a cavity:
    – 𝑃(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
  • Prior probability of having a toothache:
    – 𝑃(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2

  21. Inference Using Joint Probability (3)
  • Also calculate conditional probabilities from the joint table (slide 19)
  • If I have a toothache, what is the probability of having a cavity?
    – 𝑃(cavity|toothache) = 𝑃(cavity ∧ toothache) / 𝑃(toothache) = (0.108 + 0.012) / 0.2 = 0.6
  • What if additionally the dentist’s hook caught on?
    – 𝑃(cavity|toothache, catch) = 𝑃(cavity, toothache, catch) / 𝑃(toothache, catch) = 0.108 / (0.108 + 0.016) ≈ 0.87
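The same inference can be sketched in code: store the joint table, then compute any marginal or conditional by summing matching entries (the dict layout and the helper `prob` are my own; the probabilities are the slide's).

```python
# Joint probabilities keyed by (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(pred):
    """Sum the joint probabilities of all events matching the predicate."""
    return sum(p for event, p in joint.items() if pred(*event))

# P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
p1 = prob(lambda t, c, cav: t and cav) / prob(lambda t, c, cav: t)
# P(cavity | toothache, catch)
p2 = prob(lambda t, c, cav: t and c and cav) / prob(lambda t, c, cav: t and c)

print(round(p1, 2))  # 0.6
print(round(p2, 2))  # 0.87
```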
