
Graphical Models: Conditional Independence, d-Separation, Belief Propagation - PowerPoint PPT Presentation

Graphical Models: Conditional Independence, d-Separation, Belief Propagation
Steven J. Zeil, Old Dominion Univ., Fall 2010


  1. Outline: Conditional Independence; d-Separation; Belief Propagation.

     Graphical Models: a.k.a. Bayesian networks, probabilistic networks. Nodes are hypotheses (random variables), and their values are the probabilities of the observed value of that variable. Arcs are direct influences between hypotheses. The structure forms a directed acyclic graph (DAG). The parameters are the conditional probabilities on the arcs.

     Example: Knowing that the grass is wet, what is the probability that rain is the cause?

         P(R|W) = P(W|R) P(R) / P(W)
                = P(W|R) P(R) / (P(W|R) P(R) + P(W|¬R) P(¬R))
                = (0.9 × 0.4) / (0.9 × 0.4 + 0.2 × 0.6)
                = 0.75
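A minimal Python sketch of this computation, using the slide's own numbers (P(W|R) = 0.9, P(W|¬R) = 0.2, P(R) = 0.4):

```python
# Wet-grass Bayes inversion from the slide above.
p_r = 0.4              # P(R): prior probability of rain
p_w_given_r = 0.9      # P(W|R): grass wet given rain
p_w_given_not_r = 0.2  # P(W|~R): grass wet given no rain

# P(W) by marginalizing over R
p_w = p_w_given_r * p_r + p_w_given_not_r * (1 - p_r)

# Bayes' rule: P(R|W) = P(W|R) P(R) / P(W)
p_r_given_w = p_w_given_r * p_r / p_w
print(p_r_given_w)  # 0.75
```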

  2. Causes & Diagnoses: The graph shows a causal relationship; Bayes' rule "reverses" the arc to give a diagnosis.

     Conditional Independence: X and Y are independent if P(X,Y) = P(X) P(Y). X and Y are conditionally independent given Z if P(X,Y|Z) = P(X|Z) P(Y|Z), or equivalently P(X|Y,Z) = P(X|Z). There are three canonical cases: head-to-tail, tail-to-tail, and head-to-head.

     Head-to-Tail: P(X,Y,Z) = P(X) P(Y|X) P(Z|Y). Example:

         P(W|C) = P(W|R) P(R|C) + P(W|¬R) P(¬R|C)

     Blocking: If we know the state of Y, we know everything we can about Z without knowing the state of X. We say that Y blocks the path from X to Z, or that Y separates X and Z.
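The head-to-tail case can be sketched directly in code. The CPT values below are hypothetical; only the form P(W|C) = Σ_R P(W|R) P(R|C) comes from the slide:

```python
# Head-to-tail inference on the chain C -> R -> W.
p_r_given_c = 0.8      # P(R|C): hypothetical value
p_w_given_r = 0.9      # P(W|R): hypothetical value
p_w_given_not_r = 0.2  # P(W|~R): hypothetical value

# Marginalize out the intermediate node R:
p_w_given_c = p_w_given_r * p_r_given_c + p_w_given_not_r * (1 - p_r_given_c)
print(p_w_given_c)
```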

  3. Tail-to-Tail: P(X,Y,Z) = P(X) P(Y|X) P(Z|X). An observed X blocks the path between Y and Z:

         P(Y,Z|X) = P(X,Y,Z) / P(X)
                  = P(X) P(Y|X) P(Z|X) / P(X)
                  = P(Y|X) P(Z|X)

     Head-to-Head: P(X,Y,Z) = P(X) P(Y) P(Z|X,Y). Z blocks the path between X and Y when it is unobserved.

     Causal vs Diagnostic: Causal inference: if the sprinkler is on, what is the probability that the grass is wet? (P(W|S)). Diagnostic inference: if the grass is wet, what is the probability that the sprinkler is on?

         P(S|W) = P(W|S) P(S) / P(W) = 0.35

     Explaining Away: Suppose that we know that it rained:

         P(S|R,W) = P(W|R,S) P(S|R) / P(W|R)
                  = P(W|R,S) P(S) / P(W|R)     (S and R are marginally independent, so P(S|R) = P(S))
                  = 0.21

     Note that P(S|R,W) < P(S|W). Explaining away: knowing that it rained, the probability that the sprinkler caused the wet grass is decreased.
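A sketch of both computations follows. The CPT values are assumptions chosen to reproduce the quoted results (they match the standard textbook sprinkler example: P(R) = 0.4, P(S) = 0.2, and P(W|R,S) as tabulated below):

```python
# Diagnostic inference and explaining away in the rain/sprinkler/wet-grass net.
p_r, p_s = 0.4, 0.2
# P(W | R, S), indexed by (rain, sprinkler); assumed values
p_w = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}

# Diagnostic inference: P(S|W) = P(S,W) / P(W)
p_sw = sum(p_w[(r, True)] * (p_r if r else 1 - p_r) * p_s
           for r in (True, False))
p_w_total = sum(p_w[(r, s)] * (p_r if r else 1 - p_r) * (p_s if s else 1 - p_s)
                for r in (True, False) for s in (True, False))
print(p_sw / p_w_total)  # ~0.35

# Explaining away: P(S|R,W) = P(W|R,S) P(S) / P(W|R)
p_w_given_r = p_w[(True, True)] * p_s + p_w[(True, False)] * (1 - p_s)
print(p_w[(True, True)] * p_s / p_w_given_r)  # ~0.21, smaller than P(S|W)
```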

  4. Larger Systems: Larger systems are formed by combining the three basic subgraphs. The graph provides a structure and explanation of complicated relationships; this graph describes P(C,S,R,W,F). How would you compute P(F|C)?

     Example: Classification: The causal relation is P(x|C); Bayes' rule inverts it:

         P(C|x) = p(x|C) P(C) / p(x)

     Example: Naive Bayes Classification: Given C, the components x_j of the feature vector x are independent:

         p(x|C) = p(x_1|C) p(x_2|C) ... p(x_d|C)

     Example: Hidden Markov Model: The state at time t depends only on the state at time t−1, and the output depends only on the current state.
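A minimal sketch of the naive Bayes factorization for binary features; the class priors and per-feature probabilities below are hypothetical:

```python
# Naive Bayes: p(x|C) = prod_j p(x_j|C), combined with Bayes' rule.
import math

# p_j[c][j] = P(x_j = 1 | C = c): hypothetical values, 2 classes, 3 features
p_j = {0: [0.1, 0.7, 0.4], 1: [0.8, 0.3, 0.5]}
prior = {0: 0.6, 1: 0.4}

def log_joint(x, c):
    """log P(C=c) + sum_j log p(x_j|C=c), the naive Bayes independence assumption."""
    lp = math.log(prior[c])
    for xj, pj in zip(x, p_j[c]):
        lp += math.log(pj if xj else 1 - pj)
    return lp

x = [1, 0, 1]
# Posterior P(C|x) by normalizing the joint over classes
joint = {c: math.exp(log_joint(x, c)) for c in prior}
z = sum(joint.values())
print({c: v / z for c, v in joint.items()})
```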

  5. Path Blocking: A path from node A to node B is blocked given {C} if:
     - the directions of the edges on the path meet head-to-tail (case 1) or tail-to-tail (case 2) at a node in C, or
     - the directions of the edges meet head-to-head (case 3) and neither that node nor any of its descendants is in C.

     Examples: BCDF is blocked given C. BEFG is blocked given E or F. BEFD is blocked unless F (or G) is given.

     d-Separation: If all paths from A to B are blocked given C, then A and B are d-separated (conditionally independent) given C. (A code sketch of this blocking test appears after this slide group.)

     Belief Propagation: Use graph-based algorithms to answer queries of the form P(X|E), where the query node X is any node in the graph and E is a set of evidence nodes whose values are known.

     Chains: Evidence E+ in ancestors of X flows forward as causal inference; evidence E− in descendants of X flows back as diagnostic inference. E+ and E− separate X from any other nodes in the chain, so we have at most two pieces of evidence to consider.
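Here is a sketch of the blocking test for a single path in a DAG given as a parent map. The example network at the end is hypothetical (the slides' figure with nodes B through G is not reproduced here):

```python
# Test whether one path in a DAG is blocked given a set of observed nodes.
from collections import defaultdict

def build_children(parents):
    children = defaultdict(set)
    for node, pars in parents.items():
        for p in pars:
            children[p].add(node)
    return children

def descendants(children, node):
    """All descendants of `node`, by depth-first search over child edges."""
    seen, stack = set(), [node]
    while stack:
        for c in children[stack.pop()]:
            if c not in seen:
                seen.add(c)
                stack.append(c)
    return seen

def path_blocked(parents, path, given):
    """True if `path` (a list of nodes) is blocked given the observed set `given`."""
    children = build_children(parents)
    for i in range(1, len(path) - 1):
        a, n, b = path[i - 1], path[i], path[i + 1]
        head_to_head = a in parents.get(n, set()) and b in parents.get(n, set())
        if head_to_head:
            # Case 3: blocked unless n or one of its descendants is observed
            if n not in given and not (descendants(children, n) & given):
                return True
        elif n in given:
            # Cases 1 and 2 (head-to-tail, tail-to-tail): blocked if n is observed
            return True
    return False

# Hypothetical sprinkler net: C -> R, C -> S, R -> W, S -> W
parents = {'R': {'C'}, 'S': {'C'}, 'W': {'R', 'S'}}
print(path_blocked(parents, ['S', 'W', 'R'], given=set()))  # True: head-to-head at W
print(path_blocked(parents, ['S', 'W', 'R'], given={'W'}))  # False: observing W unblocks
print(path_blocked(parents, ['S', 'C', 'R'], given={'C'}))  # True: tail-to-tail at observed C
```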

  6. Chains: Propagated Info: For each node N, define

         λ(N) = P(E−|N)
         π(N) = P(N|E+)

     Chains: Meeting at the Middle:

         P(X|E) = P(E|X) P(X) / P(E)
                = P(E+, E−|X) P(X) / P(E)
                = P(E+|X) P(E−|X) P(X) / P(E)
                = P(X|E+) P(E+) P(E−|X) P(X) / (P(X) P(E))
                = α P(X|E+) P(E−|X)
                = α π(X) λ(X)

     Chains: Updating: The messages are computed recursively along the chain, where U is the parent and Y the child of X:

         π(X) = P(X|E+_X) = Σ_U P(X|U) π(U)
         λ(X) = P(E−_X|X) = Σ_Y P(Y|X) λ(Y)
         P(X|E) = α π(X) λ(X)

     Trees: For a node X with parent U and children Y and Z:

         λ(X) = λ_Y(X) λ_Z(X)
         λ_X(U) = Σ_X λ(X) P(X|U)
         π(X) = Σ_U P(X|U) π_X(U)
         π_Y(X) = α λ_Z(X) π(X)
         P(X|E) = α π(X) λ(X)
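Here is a minimal sketch of π/λ propagation on a chain, written as one forward (π-like) pass and one backward (λ-like) pass. All numeric CPTs are hypothetical; only the message-passing scheme itself follows the slides:

```python
# Pi/lambda message passing on a chain X0 -> X1 -> ... -> X_{n-1}.
import numpy as np

def chain_posteriors(prior, cpts, evidence):
    """prior: P(X0); cpts[t][i,j] = P(X_{t+1}=j | X_t=i); evidence: {index: value}.
    Returns P(X_t | E) for every node t."""
    n, k = len(cpts) + 1, len(prior)
    ind = [np.ones(k) for _ in range(n)]       # evidence indicators
    for t, v in evidence.items():
        ind[t] = np.zeros(k); ind[t][v] = 1.0

    # Forward pass: alpha_t plays the role of pi, carrying E+ (evidence at/above X_t)
    alpha = [prior * ind[0]]
    for t in range(1, n):
        alpha.append((alpha[-1] @ cpts[t - 1]) * ind[t])

    # Backward pass: beta_t = lambda(X_t) = P(E- below X_t | X_t)
    beta = [np.ones(k) for _ in range(n)]
    for t in range(n - 2, -1, -1):
        beta[t] = cpts[t] @ (beta[t + 1] * ind[t + 1])

    # P(X_t|E) = alpha * pi(X) lambda(X), normalized (the slides' alpha constant)
    return [a * b / np.sum(a * b) for a, b in zip(alpha, beta)]

# Hypothetical 3-node binary chain with evidence at both ends
prior = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])  # P(X_{t+1} | X_t)
print(chain_posteriors(prior, [A, A], {0: 1, 2: 0}))
```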

  7. Polytrees: A node X may have several parents U_1, ..., U_k and several children Y_1, ..., Y_m:

         π(X) = P(X|E+_X) = Σ_{U_1} ... Σ_{U_k} P(X|U_1, U_2, ..., U_k) Π_i π_X(U_i)
         λ_X(U_i) = β Σ_X λ(X) Σ_{U_r : r≠i} P(X|U_1, U_2, ..., U_k) Π_{r≠i} π_X(U_r)
         λ(X) = Π_{j=1}^{m} λ_{Y_j}(X)
         π_{Y_j}(X) = α Π_{s≠j} λ_{Y_s}(X) π(X)

     Junction Trees: If X does not separate E+ and E− (e.g., there are loops in the dependencies):
     - Moralize the graph by joining all nodes that have common children (a sketch of this step follows below).
     - Identify cliques.
     - Embed the cliques into single nodes to form a junction tree.
     - Each compressed node is a separately solvable subproblem.
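A small sketch of the moralization step: "marry" all parents that share a child, then drop edge directions. The example graph is hypothetical:

```python
# Moralize a DAG given as a parent map, as the first junction-tree step.
import itertools

def moralize(parents):
    """parents: {node: set of its parents}. Returns the undirected moral-graph edges."""
    edges = set()
    for child, pars in parents.items():
        for p in pars:
            edges.add(frozenset((p, child)))  # keep original arcs, now undirected
        for a, b in itertools.combinations(sorted(pars), 2):
            edges.add(frozenset((a, b)))      # marry co-parents of the same child
    return edges

# Hypothetical sprinkler network: C -> R, C -> S, R -> W, S -> W
parents = {'R': {'C'}, 'S': {'C'}, 'W': {'R', 'S'}}
for e in sorted(map(sorted, moralize(parents))):
    print(e)  # note the added R-S "marriage" edge
```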
