  1. Introduction to Artificial Intelligence: Inference in Bayesian Networks. Janyl Jumadinova, September 28-30, 2016

  2. Inference tasks
     ◮ Simple queries: compute posterior probabilities
     ◮ Optimal decisions: decision networks include utility information; probabilistic inference required for P(outcome | action, evidence)
     ◮ Value of information: which evidence to seek next?
     ◮ Sensitivity analysis: which probability values are most critical?

  3. Bayesian Networks
     ◮ T: The lecture started by 10:35
     ◮ L: The lecturer arrives late
     ◮ R: The lecture concerns robots
     ◮ M: The lecturer is Masha
     ◮ S: It is sunny

  4. Bayesian Networks. Let's say we want to find: P(R | T, S)?

  5. Joint Probability: General Case
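
     In general, the joint distribution of a Bayesian network factors into one conditional probability per variable given its parents:

        P(x_1, ..., x_n) = Π_i P(x_i | parents(X_i))

     For the example network on slide 7, this product is θ_{e|c} θ_{d|b,c} θ_{c|a} θ_{b|a} θ_a, the expression used on the variable-elimination slides below.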

  6. Computing conditional probabilities by enumerating all matching entries in the joint distribution is expensive: exponential in the number of variables.
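
     A minimal sketch (not from the slides) of why this is exponential: with the joint stored as a table mapping each full assignment of n Boolean variables to its probability, answering a conditional query means scanning all 2^n rows.

        # Hypothetical representation: `joint` maps a tuple of n Boolean
        # values (one per variable, in a fixed order) to its probability.
        def query_by_enumeration(joint, query_index, query_value, evidence):
            """P(X_query = query_value | evidence), where evidence maps
            tuple indices to observed values; scans every row: O(2^n)."""
            numerator = denominator = 0.0
            for assignment, p in joint.items():
                if all(assignment[i] == v for i, v in evidence.items()):
                    denominator += p
                    if assignment[query_index] == query_value:
                        numerator += p
            return numerator / denominator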

  7. Example Bayesian network. Structure: Winter? (A) is the root; Sprinkler? (B) and Rain? (C) each have parent A; Wet Grass? (D) has parents B and C; Slippery Road? (E) has parent C.

     A      Θ_A         A      B      Θ_B|A       A      C      Θ_C|A
     true   .6          true   true   .2          true   true   .8
     false  .4          true   false  .8          true   false  .2
                        false  true   .75         false  true   .1
                        false  false  .25         false  false  .9

     B      C      D      Θ_D|B,C                 C      E      Θ_E|C
     true   true   true   .95                     true   true   .7
     true   true   false  .05                     true   false  .3
     true   false  true   .9                      false  true   0
     true   false  false  .1                      false  false  1
     false  true   true   .8
     false  true   false  .2
     false  false  true   0
     false  false  false  1
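
     To make the later sketches concrete, here are the same five CPTs as Python dictionaries, one entry per table row above; each key lists the parent values first and the child value last.

        # CPTs of the Winter (A) / Sprinkler (B) / Rain (C) / Wet Grass (D) /
        # Slippery Road (E) network; keys are (parents..., child) -> probability.
        theta_A = {(True,): .6, (False,): .4}
        theta_B_A = {(True, True): .2, (True, False): .8,
                     (False, True): .75, (False, False): .25}
        theta_C_A = {(True, True): .8, (True, False): .2,
                     (False, True): .1, (False, False): .9}
        theta_D_BC = {(True, True, True): .95, (True, True, False): .05,
                      (True, False, True): .9, (True, False, False): .1,
                      (False, True, True): .8, (False, True, False): .2,
                      (False, False, True): 0.0, (False, False, False): 1.0}
        theta_E_C = {(True, True): .7, (True, False): .3,
                     (False, True): 0.0, (False, False): 1.0}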

  8. Variable Elimination: Pr(D, E)?
     ◮ We can sum out variables without having to construct the joint probability distribution explicitly.
     ◮ Variables can be summed out while keeping the original distribution, and all successive distributions, in factored form (θ: prior probability):

        P(d, e) = Σ_{a,b,c} θ_{e|c} θ_{d|b,c} θ_{c|a} θ_{b|a} θ_a

     ◮ This allows the procedure to sometimes escape the exponential complexity of the brute-force method.
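
     A direct transcription of this sum into Python, using the CPT dictionaries defined after the network slide (it enumerates a, b, c for clarity rather than exploiting the factored form):

        from itertools import product

        def p_d_e(d, e):
            """P(D=d, E=e) as the sum over a, b, c of
            theta_{e|c} theta_{d|b,c} theta_{c|a} theta_{b|a} theta_a."""
            return sum(theta_A[(a,)] * theta_B_A[(a, b)] * theta_C_A[(a, c)]
                       * theta_D_BC[(b, c, d)] * theta_E_C[(c, e)]
                       for a, b, c in product([True, False], repeat=3))

     As a sanity check, the four values p_d_e(d, e) for d, e ranging over {True, False} sum to 1.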

  9. Variable elimination: carry out summations right-to-left, storing intermediate results (factors) to avoid recomputation. Factors are matrices indexed by the values of their argument variables.

  10. Variable Elimination. Summing out a variable X from a product of factors:
      - move any constant factors outside the summation
      - add up submatrices in the pointwise product of the remaining factors

        Σ_x f_1 × ··· × f_k = f_1 × ··· × f_i × (Σ_x f_{i+1} × ··· × f_k) = f_1 × ··· × f_i × f_X̄

      assuming f_1, ..., f_i do not depend on X.

      Pointwise product of factors f_1 and f_2:
        f_1(x_1, ..., x_j, y_1, ..., y_k) × f_2(y_1, ..., y_k, z_1, ..., z_l) = f(x_1, ..., x_j, y_1, ..., y_k, z_1, ..., z_l)
      E.g., f_1(a, b) × f_2(b, c) = f(a, b, c)
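
      A minimal sketch of these two operations, representing a factor as a dict from value tuples to numbers together with an ordered list of its variable names (the function and variable names are illustrative, not from the slides):

        from itertools import product

        def pointwise_product(f1, vars1, f2, vars2):
            """f1(vars1) x f2(vars2); the result ranges over the union of
            the variables, e.g. f1(a, b) x f2(b, c) -> f(a, b, c)."""
            out_vars = list(vars1) + [v for v in vars2 if v not in vars1]
            out = {}
            for vals in product([True, False], repeat=len(out_vars)):
                row = dict(zip(out_vars, vals))
                out[vals] = (f1[tuple(row[v] for v in vars1)]
                             * f2[tuple(row[v] for v in vars2)])
            return out, out_vars

        def sum_out(var, f, vars_):
            """Sum `var` out of factor f: add up the submatrices that
            agree on every remaining variable."""
            keep = [v for v in vars_ if v != var]
            out = {}
            for vals, p in f.items():
                row = dict(zip(vars_, vals))
                key = tuple(row[v] for v in keep)
                out[key] = out.get(key, 0.0) + p
            return out, keep

      For example, pointwise_product(theta_B_A, ['A', 'B'], theta_C_A, ['A', 'C']) yields a factor over (A, B, C), and summing A out of that result gives the factor written f_X̄ above.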

  11. Review: Bayesian Inference. Bayesian inference is about the quantification and propagation of uncertainty, defined via a probability, in light of observations of the system. From Prior → Posterior.
      Reminder: A posterior probability is the probability of the event's outcome given the data (observations). A prior probability is the probability of the event's outcome before you collect the data (make observations).
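
      In symbols, the update from prior to posterior is Bayes' rule:

        P(hypothesis | data) = P(data | hypothesis) · P(hypothesis) / P(data)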

  12. Bayesian Inference: determining posterior distributions in belief networks
      ◮ Exact inference by enumeration
      ◮ Exact inference by variable elimination
      ◮ Approximate inference by stochastic simulation

  13. Bayesian Inference
      ◮ Exact inference by variable elimination:
        - Exploit the structure of the network to eliminate (sum out) the non-observed, non-query variables one at a time
        - Finding an elimination ordering that results in the smallest tree-width is NP-hard
      ◮ Approximate inference by stochastic simulation

  14. Inference by stochastic simulation. Basic idea:
      1. Draw N samples from a sampling distribution
      2. Compute an approximate posterior probability
      3. Show this converges to the true probability
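
      A sketch of step 1 for the example network, again using the CPT dictionaries from earlier: sample each variable in topological order from its CPT row given the already-sampled parents, so the samples are drawn from the prior.

        import random

        def prior_sample():
            """One joint sample (a, b, c, d, e), drawn root-to-leaf."""
            a = random.random() < theta_A[(True,)]
            b = random.random() < theta_B_A[(a, True)]
            c = random.random() < theta_C_A[(a, True)]
            d = random.random() < theta_D_BC[(b, c, True)]
            e = random.random() < theta_E_C[(c, True)]
            return a, b, c, d, e

        # Step 2: e.g. the fraction of samples with D = true estimates P(D = true).
        samples = [prior_sample() for _ in range(10000)]
        print(sum(s[3] for s in samples) / len(samples))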

  15. Inference by stochastic simulation
      ◮ Rejection sampling: reject samples disagreeing with evidence
      ◮ Likelihood weighting: use evidence to weight samples
      ◮ Markov chain Monte Carlo (MCMC): sample from a stochastic process whose stationary distribution is the true posterior
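
      As one concrete instance, a rejection-sampling estimate of P(D = true | E = true) on the example network, reusing prior_sample() from the previous sketch; the discarded samples are what make this method expensive when the evidence is improbable.

        def estimate_D_given_E(n=100_000):
            """P(D = true | E = true): draw n prior samples, keep only
            those agreeing with the evidence E = true, average D."""
            kept = [s for s in (prior_sample() for _ in range(n)) if s[4]]
            return sum(s[3] for s in kept) / len(kept)

      Likelihood weighting would instead fix E = true in every sample and weight each sample by θ_{E=true|c}, so no samples are thrown away.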

  16. Bayesian Inference Class Exercise
      1. Find the NetLogo models titled Bayes1D, Drift, Spatial, and Monte Carlo Pi in the "cs370f2016-share/in-class/sep30 BaysianNetlogoModels" directory, and explore them in this order.
      2. For the first three models, give (in the Google form) a 1-2 sentence description of the model in your own words and comment on how it uses Bayesian inference.
      3. For the last model (Monte Carlo), complete the documentation inside NetLogo's "Info" tab and submit the updated model to your own repository.
