  1. Introduction to Artificial Intelligence: Bayesian Networks. Janyl Jumadinova, September 26, 2016

  2. Bayesian Networks
     ◮ A simple, graphical notation for conditional independence assertions
     ◮ Syntax:
       - a set of nodes, one per variable
       - a directed, acyclic graph (link ≈ “directly influences”)
       - a conditional distribution for each node given its parents: P(X_i | Parents(X_i))
     ◮ In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over X_i for each combination of parent values
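     A CPT’s size is set by the parent count: a Boolean node with k Boolean parents needs one row per combination of parent values,

         |CPT(X_i)| = 2^k rows, each giving P(X_i = true | one parent assignment),

     so storage grows with the maximum number of parents, not with the total number of variables (the summary slide near the end returns to this point).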

  3. Bayesian Networks: Uses

  4. Example
     ◮ T: the lecture started by 10:35
     ◮ L: the lecturer arrives late
     ◮ R: the lecture concerns robots
     ◮ M: the lecturer is Masha
     ◮ S: it is sunny
     ◮ T is only directly influenced by L (i.e., T is conditionally independent of R, M, S given L)
     ◮ L is only directly influenced by M and S (i.e., L is conditionally independent of R given M and S)
     ◮ R is only directly influenced by M (i.e., R is conditionally independent of L and S given M)
     ◮ M and S are independent
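     These assertions pin down the graph: M and S are roots, L has parents M and S, R has parent M, and T has parent L. As a plain-Python sketch (the dict encoding is ours, not the slides’):

         # Parent sets implied by the independence assertions above.
         parents = {
             "M": [],          # the lecturer is Masha (root)
             "S": [],          # it is sunny (root)
             "L": ["M", "S"],  # lateness depends on Masha and the weather
             "R": ["M"],       # robot topic depends on Masha
             "T": ["L"],       # start time depends only on lateness
         }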

  5. Making a Bayes Net (for the variables T, L, R, M, S above)
     ◮ Step one: add variables. Just choose the variables you’d like included in the net.
     ◮ Step two: add links. The link structure must be acyclic.
     ◮ Step three: add a probability table for each node. The table for node X must list P(X | ParentValues) for each possible combination of parent values (see the sketch below).
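     For step three, each node’s table is indexed by its parents’ values. A minimal sketch with invented numbers, since the slides give no actual probabilities:

         # Hypothetical CPT for P(L | M, S), keyed by (M, S) truth values.
         cpt_L = {
             (True, True):   0.05,  # P(L=true | M=true,  S=true)
             (True, False):  0.10,
             (False, True):  0.10,
             (False, False): 0.20,
         }
         # A root node stores its prior under the empty parent tuple.
         cpt_M = {(): 0.5}  # hypothetical P(M=true)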

  6. Conditional Probability
     ◮ Two unconnected variables may still be correlated.
     ◮ Each node is conditionally independent of all its non-descendants in the graph, given its parents.
     ◮ Many other conditional independence relations can be deduced from a Bayes net.
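     The second bullet is the local Markov property. In symbols:

         X_i ⊥ NonDescendants(X_i) | Parents(X_i)

     For the running example, this is exactly the earlier claim that T is independent of {R, M, S} given its parent L.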

  7. Bayes Net Formalized
     ◮ A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where:
       - V is a set of vertices
       - E is a set of directed edges joining vertices; no loops of any length are allowed
     ◮ Each vertex in V contains the following information:
       - the name of a random variable
       - a probability distribution table indicating how the probability of this variable’s values depends on all possible combinations of parental values
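     This definition pairs naturally with a small record type per vertex. A sketch (our naming, not the slides’):

         from dataclasses import dataclass, field

         @dataclass
         class Vertex:
             name: str       # the random variable's name
             parents: list   # names of this vertex's parent vertices
             cpt: dict = field(default_factory=dict)  # parent-value tuple -> P(X=true | parents)

         # The edge set E is then {(p, v.name) for each vertex v and each p
         # in v.parents}, and it must stay acyclic.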

  8. Building a Bayes Net
     1. Choose a set of relevant variables.
     2. Choose an ordering for them.
     3. Assume they are called X_1 .. X_m (where X_1 is the first in the ordering, X_2 is the second, etc.).
     4. For i = 1 to m:
        4.1 Add the X_i node to the network.
        4.2 Set Parents(X_i) to be a minimal subset of {X_1, ..., X_(i-1)} such that X_i is conditionally independent of all other members of {X_1, ..., X_(i-1)} given Parents(X_i).
        4.3 Define the probability table P(X_i = k | assignments of Parents(X_i)).
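     Step 4 as code. The conditional-independence test in 4.2 is the hard part in practice; here it is an assumed oracle supplied by the caller, so this is a sketch of the loop’s shape, not a complete algorithm:

         from itertools import combinations

         def build_net(variables, is_cond_independent):
             """variables: list X_1..X_m in the chosen ordering.
             is_cond_independent(x, others, given): assumed oracle for
             the independence test in step 4.2."""
             parents = {}
             for i, x in enumerate(variables):
                 predecessors = variables[:i]
                 chosen = predecessors  # the full set always works
                 # 4.2: try smallest subsets first, so the result is minimal.
                 for r in range(len(predecessors) + 1):
                     found = None
                     for subset in combinations(predecessors, r):
                         rest = [p for p in predecessors if p not in subset]
                         if is_cond_independent(x, rest, list(subset)):
                             found = list(subset)
                             break
                     if found is not None:
                         chosen = found
                         break
                 parents[x] = chosen
                 # 4.3 would fill in P(x | assignments of parents[x]) here.
             return parents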

  9. Computing a Joint Entry (for the variables T, L, R, M, S above)
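     With the structure from the Example, any joint entry factors into one CPT lookup per node:

         P(T, L, R, M, S) = P(M) · P(S) · P(L | M, S) · P(R | M) · P(T | L)

     For instance, P(T, ¬L, R, M, ¬S) = P(M) · (1 - P(S)) · (1 - P(L | M, ¬S)) · P(R | M) · P(T | ¬L).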

  10. General Case
      ◮ Any entry in the joint distribution table can be computed, and so any conditional probability can be computed.
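      In general, for a net over X_1, ..., X_n,

          P(x_1, ..., x_n) = ∏_{i=1}^{n} P(x_i | parents(X_i)),

      and a conditional probability is a ratio of sums of such entries: P(Q | E) = P(Q, E) / P(E).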

  11. Bayes nets so far...
      ◮ We have a methodology for building Bayes nets.
      ◮ We don’t require exponential storage to hold our probability tables: storage is exponential only in the maximum number of parents of any node.
      ◮ We can compute the probability of any given assignment of truth values to the variables, in time linear in the number of nodes.
      ◮ So we can also compute answers to any query.
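      The linear-time claim is just one lookup and one multiplication per node. A sketch using the dict-based CPT convention from the earlier sketches (our convention, not the slides’):

          def joint_prob(assignment, parents, cpts):
              """P(full truth assignment) = product over nodes of P(x | its parents).
              assignment: {name: bool}; cpts[name] maps a tuple of parent
              values to P(name=True | those values)."""
              p = 1.0
              for x, value in assignment.items():
                  key = tuple(assignment[u] for u in parents[x])
                  p_true = cpts[x][key]
                  p *= p_true if value else 1.0 - p_true
              return p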

  12. Example
      ◮ Problem: when somebody reports people leaving a building because a fire alarm went off, did the alarm go off because of tampering, or is there really a fire?
      ◮ Variables: Tampering, Fire, Alarm, Smoke, Leaving, Report
      ◮ Network topology reflects “causal” knowledge:
        - tampering can set the alarm off
        - a fire can set the alarm off
        - the alarm causes people to leave the building
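      Answering the question means conditioning on the evidence and comparing posteriors. A brute-force enumeration sketch; the edges for Smoke and Report and every number below are our assumptions for illustration, since the slides give only the three causal statements:

          from itertools import product

          # Assumed topology: Tampering -> Alarm <- Fire -> Smoke,
          # and Alarm -> Leaving -> Report.
          parents = {
              "Tampering": [], "Fire": [],
              "Alarm": ["Tampering", "Fire"],
              "Smoke": ["Fire"],
              "Leaving": ["Alarm"],
              "Report": ["Leaving"],
          }
          cpts = {  # hypothetical numbers
              "Tampering": {(): 0.02},
              "Fire":      {(): 0.01},
              "Alarm":     {(True, True): 0.50, (True, False): 0.85,
                            (False, True): 0.99, (False, False): 0.0001},
              "Smoke":     {(True,): 0.90, (False,): 0.01},
              "Leaving":   {(True,): 0.88, (False,): 0.001},
              "Report":    {(True,): 0.75, (False,): 0.01},
          }

          def joint(a):
              # One CPT lookup per node, multiplied together.
              p = 1.0
              for x in parents:
                  p_true = cpts[x][tuple(a[u] for u in parents[x])]
                  p *= p_true if a[x] else 1.0 - p_true
              return p

          def query(var, evidence):
              """P(var=True | evidence) by summing matching joint entries."""
              names = list(parents)
              num = den = 0.0
              for values in product([False, True], repeat=len(names)):
                  a = dict(zip(names, values))
                  if any(a[k] != v for k, v in evidence.items()):
                      continue
                  p = joint(a)
                  den += p
                  if a[var]:
                      num += p
              return num / den

          # Compare query("Fire", {"Report": True}) with
          # query("Tampering", {"Report": True}).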
