

  1. CS325 Artificial Intelligence Ch 14b – Probabilistic Inference
     Cengiz Günay, Spring 2013

  2. Inference tasks
     - Simple queries: compute the posterior marginal P(X_i | E = e)
     - Conjunctive queries: P(X_i, X_j | E = e) = P(X_i | E = e) P(X_j | X_i, E = e)
     - Optimal decisions: decision networks include utility information; probabilistic inference is required for P(outcome | action, evidence)
     - Value of information: which evidence to seek next?
     - Sensitivity analysis: which probability values are most critical?
     - Explanation: why do I need a new starter motor?

  3. Inference by Enumeration
     With no dependency information, we need 2^n entries in the joint distribution (for n Boolean variables):

                      toothache            ¬toothache
                   catch    ¬catch      catch    ¬catch
      cavity       0.108    0.012      0.072    0.008
      ¬cavity      0.016    0.064      0.144    0.576

  4. Inference by Enumeration
     (joint distribution table as in slide 3)
     For any proposition φ, sum the probabilities of the worlds in which it is true:
     P(φ) = Σ_{w : w ⊨ φ} P(w)

  5. Inference by Enumeration
     (joint distribution table as in slide 3)
     P(φ) = Σ_{w : w ⊨ φ} P(w)
     P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2

  6. Inference by Enumeration
     (joint distribution table as in slide 3)
     P(cavity ∨ toothache) = ?

  7. Inference by Enumeration
     (joint distribution table as in slide 3)
     P(cavity ∨ toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28

  8. Inference by Enumeration
     (joint distribution table as in slide 3)
     We can also compute conditional probabilities:
     P(¬cavity | toothache) = ?

  9. Inference by Enumeration
     (joint distribution table as in slide 3)
     P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
                            = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
                            = 0.4
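     These enumeration queries are easy to check in code. Below is a minimal Python sketch (the dictionary layout and the helper name prob are my own; the probabilities are exactly the table entries):

         # Full joint distribution P(Cavity, Toothache, Catch),
         # keyed by (cavity, toothache, catch) truth values.
         joint = {
             (True,  True,  True):  0.108, (True,  True,  False): 0.012,
             (True,  False, True):  0.072, (True,  False, False): 0.008,
             (False, True,  True):  0.016, (False, True,  False): 0.064,
             (False, False, True):  0.144, (False, False, False): 0.576,
         }

         def prob(phi):
             """P(phi): sum P(w) over all worlds w in which phi is true."""
             return sum(p for w, p in joint.items() if phi(*w))

         print(prob(lambda cavity, toothache, catch: toothache))            # 0.2
         print(prob(lambda cavity, toothache, catch: cavity or toothache))  # 0.28

         # P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
         p_evidence = prob(lambda cavity, toothache, catch: toothache)
         p_both = prob(lambda cavity, toothache, catch: not cavity and toothache)
         print(p_both / p_evidence)                                         # 0.4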

  10. Joint probability with known dependencies
      Network: Cavity → Toothache, Cavity → Catch
      P(toothache, catch, cavity) = P(toothache | cavity) P(catch | cavity) P(cavity)
      In general, P(x_1, ..., x_n) = Π_{i=1}^n P(x_i | parents(X_i))

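      The factorization can be checked in code against the joint table. In the sketch below the CPT values are not given on the slide; they are derived from the table on slides 3-9 (e.g. P(cavity) = 0.2, P(toothache | cavity) = 0.12 / 0.2 = 0.6), and the helper names are mine:

          # CPTs for the Cavity -> {Toothache, Catch} network, derived
          # from the joint table earlier in the deck (an assumption,
          # since the slide itself shows no numbers).
          p_cavity    = 0.2
          p_toothache = {True: 0.6, False: 0.1}   # P(toothache | Cavity = key)
          p_catch     = {True: 0.9, False: 0.2}   # P(catch | Cavity = key)

          def joint_tcc(toothache, catch, cavity):
              """P(toothache, catch, cavity) via the chain-rule factorization."""
              pt = p_toothache[cavity] if toothache else 1 - p_toothache[cavity]
              pk = p_catch[cavity] if catch else 1 - p_catch[cavity]
              pc = p_cavity if cavity else 1 - p_cavity
              return pt * pk * pc

          print(joint_tcc(True, True, True))   # ≈ 0.108, matching the table entry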

  11. Burglary or Earthquake: inference from the joint
      Network: Burglary → Alarm ← Earthquake; Alarm → JohnCalls, Alarm → MaryCalls
      P(B) = 0.001    P(E) = 0.002
      P(A | B, E):  B=T,E=T: 0.95   B=T,E=F: 0.94   B=F,E=T: 0.29   B=F,E=F: 0.001
      P(J | A):  A=T: 0.90   A=F: 0.05
      P(M | A):  A=T: 0.70   A=F: 0.01
      P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = ?

  12. Burglary or Earthquake: inference from the joint
      (network and CPTs as in slide 11)
      P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
                             = 0.9 × 0.7 × 0.001 × 0.999 × 0.998
                             ≈ 0.00063
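      The same one-line computation as a Python sketch (the dictionary names P_b, P_a, etc. are mine; the numbers are the CPTs on the slide):

          # CPTs from the burglary network.
          P_b, P_e = 0.001, 0.002                              # P(B), P(E)
          P_a = {(True, True): 0.95, (True, False): 0.94,
                 (False, True): 0.29, (False, False): 0.001}   # P(a | B, E)
          P_j = {True: 0.90, False: 0.05}                      # P(j | A)
          P_m = {True: 0.70, False: 0.01}                      # P(m | A)

          # P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
          p = P_j[True] * P_m[True] * P_a[(False, False)] * (1 - P_b) * (1 - P_e)
          print(p)   # ≈ 0.00063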

  13. Burglary or Earthquake: inference by enumeration
      (network and CPTs as in slide 11)
      P(B | j, m) = P(B, j, m) / P(j, m)
                  = α P(B, j, m)
                  = α Σ_e Σ_a P(B, j, m, e, a)

  14. Burglary or Earthquake: inference by enumeration
      P(B | j, m) = α Σ_e Σ_a P(B) P(e) P(a | B, e) P(j | a) P(m | a)
                  = α P(B) Σ_e P(e) Σ_a P(a | B, e) P(j | a) P(m | a)
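      The nested-sum formula transcribes directly into code. A sketch reusing the CPT dictionaries above (the function name and the explicit normalization are mine):

          def enumerate_B_given_jm():
              """P(B | j, m): sum out e and a, then normalize by alpha."""
              unnorm = {}
              for b in (True, False):
                  total = 0.0
                  for e in (True, False):
                      pe = P_e if e else 1 - P_e
                      for a in (True, False):
                          pa = P_a[(b, e)] if a else 1 - P_a[(b, e)]
                          total += pe * pa * P_j[a] * P_m[a]
                  unnorm[b] = (P_b if b else 1 - P_b) * total
              alpha = 1.0 / sum(unnorm.values())
              return {b: alpha * p for b, p in unnorm.items()}

          # The standard result for this network: P(b | j, m) ≈ 0.284.
          print(enumerate_B_given_jm())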

  15. Burglary or Earthquake: inference by enumeration
      Variable elimination: speed up enumeration by repeatedly joining factors and eliminating (summing out) hidden variables.
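      "Joining and elimination" can be made concrete with two factor operations. Below is a minimal, Boolean-only sketch (the Factor class and the helper names are my own, not from the course code):

          from itertools import product

          class Factor:
              """Variable names plus a table mapping tuples of truth
              values (in variable order) to numbers."""
              def __init__(self, variables, table):
                  self.variables = list(variables)
                  self.table = dict(table)

          def join(f1, f2):
              """Pointwise product over the union of the two variable sets."""
              variables = f1.variables + [v for v in f2.variables
                                          if v not in f1.variables]
              table = {}
              for vals in product((True, False), repeat=len(variables)):
                  assign = dict(zip(variables, vals))
                  p1 = f1.table[tuple(assign[v] for v in f1.variables)]
                  p2 = f2.table[tuple(assign[v] for v in f2.variables)]
                  table[vals] = p1 * p2
              return Factor(variables, table)

          def sum_out(f, var):
              """Eliminate var from a factor by summing over its values."""
              i = f.variables.index(var)
              table = {}
              for vals, p in f.table.items():
                  key = vals[:i] + vals[i + 1:]
                  table[key] = table.get(key, 0.0) + p
              return Factor([v for v in f.variables if v != var], table)

          # Example: join P(A|B,E), P(j|A), P(m|A), then sum out A, giving
          # the inner sum Σ_a P(a|B,E) P(j|a) P(m|a) as a factor over (B, E).
          fa = Factor(["B", "E", "A"],
                      {(b, e, a): (P_a[(b, e)] if a else 1 - P_a[(b, e)])
                       for b, e, a in product((True, False), repeat=3)})
          fj = Factor(["A"], {(a,): P_j[a] for a in (True, False)})
          fm = Factor(["A"], {(a,): P_m[a] for a in (True, False)})
          g = sum_out(join(join(fa, fj), fm), "A")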

  16. What if we cannot infer exactly?
      Exact inference is expensive. What else can we do?
      - Observe random events and record outcomes to approximate probabilities; also called a Monte Carlo method. With infinitely many samples the estimates are consistent.
      - Rejection sampling: discard samples that disagree with the evidence (inefficient when the evidence is rare)
      - Likelihood weighting: fix the evidence variables and weight each sample, so no inconsistent samples are generated
      - Gibbs sampling: a random walk through the state space
      - Monty Hall letter

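      A minimal rejection-sampling sketch for P(B | j, m) in the burglary network, reusing the CPT dictionaries from the earlier sketches (function names are mine). It also shows why rare evidence hurts: almost every sample is rejected, which is exactly what likelihood weighting is designed to avoid:

          import random

          def prior_sample():
              """Sample (b, j, m) from the network in topological order."""
              b = random.random() < P_b
              e = random.random() < P_e
              a = random.random() < P_a[(b, e)]
              j = random.random() < P_j[a]
              m = random.random() < P_m[a]
              return b, j, m

          def rejection_sample(n=1_000_000):
              """Estimate P(b | j, m), keeping only samples where j and m hold."""
              kept = hits = 0
              for _ in range(n):
                  b, j, m = prior_sample()
                  if j and m:        # reject samples inconsistent with evidence
                      kept += 1
                      hits += b
              return hits / kept if kept else float("nan")

          # Converges to ≈ 0.284, but only ~0.2% of samples survive rejection.
          print(rejection_sample())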
