Lifted Inference in Statistical Relational Models
Guy Van den Broeck
BUDA Invited Tutorial, June 22nd 2014

Overview: 1. What are statistical relational models? 2. What is lifted inference? 3. How does lifted inference work? 4. Theoretical insights 5. Practical applications


  1. Lifted Algorithms (in the AI community)
  ● Exact Probabilistic Inference
    – First-Order Variable Elimination [Poole-IJCAI03, Braz-IJCAI05, Milch-AAAI08, Taghipour-JAIR13]
    – First-Order Knowledge Compilation [VdB-IJCAI11, VdB-NIPS11, VdB-AAAI12, VdB-Thesis13]
    – Probabilistic Theorem Proving [Gogate-UAI11]
  ● Approximate Probabilistic Inference
    – Lifted Belief Propagation [Jaimovich-UAI07, Singla-AAAI08, Kersting-UAI09]
    – Lifted Bisimulation/Mini-buckets [Sen-VLDB08, Sen-UAI09]
    – Lifted Importance Sampling [Gogate-UAI11, Gogate-AAAI12]
    – Lifted Relax, Compensate & Recover (Generalized BP) [VdB-UAI12]
    – Lifted MCMC [Niepert-UAI12, Niepert-AAAI13, Venugopal-NIPS12]
    – Lifted Variational Inference [Choi-UAI12, Bui-StarAI12]
    – Lifted MAP-LP [Mladenov-AISTATS14, Apsel-AAAI14]
  ● Special-Purpose Inference
    – Lifted Kalman Filter [Ahmadi-IJCAI11, Choi-IJCAI11]
    – Lifted Linear Programming [Mladenov-AISTATS12]

  2. Assembly Language for Lifted Probabilistic Inference
  Computing conditional probabilities with:
    – Parfactor graphs
    – Markov logic networks
    – Probabilistic datalog/logic programs
    – Probabilistic databases
    – Relational Bayesian networks
  All reduce to weighted (first-order) model counting [VdB-IJCAI11, Gogate-UAI11, VdB-KR14, Gribkoff-UAI14]

  3. Weighted First-Order Model Counting. A vocabulary: Smokes(Alice), Smokes(Bob), Friends(Alice,Bob), Friends(Bob,Alice). Possible worlds = logical interpretations of these atoms.

  4. Weighted First-Order Model Counting. A logical theory: ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Models = the interpretations that satisfy the theory.

  5. Weighted First-Order Model Counting. A logical theory: ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Summing over its models gives the first-order model count, the first-order analogue of #SAT.

  6. Weighted First-Order Model Counting. A logical theory and a weight function for predicates: Smokes → 1, ¬Smokes → 2, Friends → 4, ¬Friends → 1.

  7. Weighted First-Order Model Counting. A logical theory and a weight function for predicates (Smokes → 1, ¬Smokes → 2, Friends → 4, ¬Friends → 1). Summing the weights of the models gives the weighted first-order model count, analogous to the partition function.
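
To make the definition on slides 3–7 concrete, here is a minimal brute-force sketch in Python (my own illustrative code, not part of the tutorial; all names are invented). It enumerates the 16 interpretations of the two-person vocabulary, keeps those that satisfy ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), and sums their weights under the weight function Smokes → 1, ¬Smokes → 2, Friends → 4, ¬Friends → 1. A lifted algorithm would avoid this exponential enumeration; the point here is only to pin down what is being computed.

from itertools import product

people = ["Alice", "Bob"]
# Weight function from slides 6-7.
w_pos = {"Smokes": 1, "Friends": 4}
w_neg = {"Smokes": 2, "Friends": 1}

# The vocabulary from slide 3 (no reflexive Friends atoms).
atoms = [("Smokes", (p,)) for p in people] + \
        [("Friends", (a, b)) for a in people for b in people if a != b]

def is_model(world):
    # forall x,y: Smokes(x) & Friends(x,y) => Smokes(y); x = y is trivially satisfied.
    return all(not (world[("Smokes", (x,))] and world[("Friends", (x, y))])
               or world[("Smokes", (y,))]
               for x in people for y in people if x != y)

wfomc = 0
for values in product([False, True], repeat=len(atoms)):
    world = dict(zip(atoms, values))
    if is_model(world):
        weight = 1
        for (pred, _args), value in world.items():
            weight *= w_pos[pred] if value else w_neg[pred]
        wfomc += weight

print("weighted first-order model count:", wfomc)  # 145 for this two-person example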

  8. Example: First-Order Model Counting. 1. Logical sentence: Stress(Alice) ⇒ Smokes(Alice). Domain: {Alice}.

  9. Example: First-Order Model Counting. 1. Logical sentence: Stress(Alice) ⇒ Smokes(Alice). Domain: {Alice}.
     Stress(Alice) | Smokes(Alice) | Formula
           0       |       0       |    1
           0       |       1       |    1
           1       |       0       |    0
           1       |       1       |    1

  10. Example: First-Order Model Counting. 1. Logical sentence: Stress(Alice) ⇒ Smokes(Alice). Domain: {Alice}. → 3 models

  11. Example: First-Order Model Counting. 1. Stress(Alice) ⇒ Smokes(Alice), domain {Alice} → 3 models. 2. Logical sentence: ∀x, Stress(x) ⇒ Smokes(x). Domain: {Alice}.

  12. Example: First-Order Model Counting. 1. Stress(Alice) ⇒ Smokes(Alice), domain {Alice} → 3 models. 2. ∀x, Stress(x) ⇒ Smokes(x), domain {Alice} → 3 models

  13. Example: First-Order Model Counting. 2. ∀x, Stress(x) ⇒ Smokes(x), domain {Alice} → 3 models

  14. Example: First-Order Model Counting. 2. ∀x, Stress(x) ⇒ Smokes(x), domain {Alice} → 3 models. 3. Logical sentence: ∀x, Stress(x) ⇒ Smokes(x). Domain: n people.

  15. Example: First-Order Model Counting. 2. ∀x, Stress(x) ⇒ Smokes(x), domain {Alice} → 3 models. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models

  16. Example: First-Order Model Counting. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models
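
The 3^n count for sentence 3 is easy to cross-check by brute force for small n. A throwaway Python sketch (illustrative code, not from the tutorial):

from itertools import product

def count_models(n):
    # Models of: forall x, Stress(x) => Smokes(x), over a domain of n people.
    count = 0
    for stress in product([False, True], repeat=n):
        for smokes in product([False, True], repeat=n):
            if all((not st) or sm for st, sm in zip(stress, smokes)):
                count += 1
    return count

for n in range(1, 5):
    # 3 admissible (Stress, Smokes) pairs per person: (0,0), (0,1), (1,1).
    assert count_models(n) == 3 ** n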

  17. Example: First-Order Model Counting. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models. 4. Logical sentence: ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y). Domain: n people.

  18. Example: First-Order Model Counting. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people. If Female: the sentence becomes ∀y, ParentOf(y) ⇒ MotherOf(y).

  19. Example: First-Order Model Counting. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people. If not Female: the sentence becomes True.

  20. Example: First-Order Model Counting. 3. ∀x, Stress(x) ⇒ Smokes(x), domain of n people → 3^n models. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people → (3^n + 4^n) models (3 admissible (ParentOf(y), MotherOf(y)) pairs per person when Female holds, 4 when it does not).

  21. Example: First-Order Model Counting. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people → (3^n + 4^n) models

  22. Example: First-Order Model Counting. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people → (3^n + 4^n) models. 5. Logical sentence: ∀x,y, ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y). Domain: n people.

  23. Example: First-Order Model Counting. 4. ∀y, ParentOf(y) ∧ Female ⇒ MotherOf(y), domain of n people → (3^n + 4^n) models. 5. ∀x,y, ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y), domain of n people → (3^n + 4^n)^n models (one independent copy of sentence 4 per person x).
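
Both closed forms, (3^n + 4^n) for sentence 4 and (3^n + 4^n)^n for sentence 5, can be cross-checked by enumeration for tiny n. Again an illustrative Python sketch (my names, not the tutorial's):

from itertools import product

def count_sentence4(n):
    # Models of: forall y, ParentOf(y) & Female => MotherOf(y),
    # over one Female atom plus n ParentOf and n MotherOf atoms.
    count = 0
    for female in (False, True):
        for parent in product([False, True], repeat=n):
            for mother in product([False, True], repeat=n):
                if all(not (p and female) or m for p, m in zip(parent, mother)):
                    count += 1
    return count

def count_sentence5(n):
    # Models of: forall x,y, ParentOf(x,y) & Female(x) => MotherOf(x,y).
    count = 0
    for female in product([False, True], repeat=n):
        for parent in product([False, True], repeat=n * n):
            for mother in product([False, True], repeat=n * n):
                if all(not (parent[x * n + y] and female[x]) or mother[x * n + y]
                       for x in range(n) for y in range(n)):
                    count += 1
    return count

for n in range(1, 5):
    assert count_sentence4(n) == 3 ** n + 4 ** n
for n in range(1, 3):  # keep the domain tiny: sentence 5 has n + 2n^2 atoms
    assert count_sentence5(n) == (3 ** n + 4 ** n) ** n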

  24. Example: First-Order Model Counting. 6. Logical sentence: ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Domain: n people.

  25. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers

  26–29. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers. Database: Smokes(Alice) = 1, Smokes(Bob) = 0, Smokes(Charlie) = 0, Smokes(Dave) = 1, Smokes(Eve) = 0, ... The n × n grid of Friends(x,y) atoms splits into a k × k smoker-to-smoker block, a k × (n−k) smoker-to-non-smoker block, an (n−k) × k block, and an (n−k) × (n−k) block; only the k × (n−k) smoker-to-non-smoker block is constrained by the sentence.

  30. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers → 2^(n² − k(n−k)) models (the k(n−k) smoker-to-non-smoker friendships are forced false; all other Friends atoms are free).

  31. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers → 2^(n² − k(n−k)) models. ● If we know that there are k smokers

  32. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers → 2^(n² − k(n−k)) models. ● If we know that there are k smokers → C(n,k) · 2^(n² − k(n−k)) models

  33. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers → 2^(n² − k(n−k)) models. ● If we know that there are k smokers → C(n,k) · 2^(n² − k(n−k)) models. ● In total

  34. Example: First-Order Model Counting. 6. ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y), domain of n people. ● If we know precisely who smokes, and there are k smokers → 2^(n² − k(n−k)) models. ● If we know that there are k smokers → C(n,k) · 2^(n² − k(n−k)) models. ● In total → Σ_{k=0..n} C(n,k) · 2^(n² − k(n−k)) models
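
The counts filled in above follow from the block structure of the Friends grid: given k smokers, the k(n−k) smoker-to-non-smoker friendships are forced false and the remaining n² − k(n−k) Friends atoms are free. A brute-force cross-check of the total, Σ_{k=0..n} C(n,k) · 2^(n² − k(n−k)), for small n (illustrative Python, not tutorial code):

from itertools import product
from math import comb

def brute_force_count(n):
    # Models of: forall x,y, Smokes(x) & Friends(x,y) => Smokes(y),
    # counting over all Smokes and Friends atoms (including Friends(x,x)).
    count = 0
    for smokes in product([False, True], repeat=n):
        for friends in product([False, True], repeat=n * n):
            if all(not (smokes[x] and friends[x * n + y]) or smokes[y]
                   for x in range(n) for y in range(n)):
                count += 1
    return count

def lifted_count(n):
    # Choose who the k smokers are, force the k(n-k) smoker-to-non-smoker
    # friendships to false, and leave the remaining Friends atoms free.
    return sum(comb(n, k) * 2 ** (n * n - k * (n - k)) for k in range(n + 1))

for n in range(1, 4):
    assert brute_force_count(n) == lifted_count(n)
print([lifted_count(n) for n in range(1, 6)])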

  35. The Full Pipeline. MLN: 3.14  Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)

  36. The Full Pipeline. MLN: 3.14  Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Relational Logic: ∀x,y, F(x,y) ⇔ [ Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y) ]

  37. The Full Pipeline. MLN: 3.14  Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Relational Logic: ∀x,y, F(x,y) ⇔ [ Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y) ]. Weight Function: F → exp(3.14), ¬F → 1, Smokes → 1, ¬Smokes → 1, Friends → 1, ¬Friends → 1.
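
Slides 35–37 turn the weighted MLN formula into a hard sentence over an auxiliary predicate F plus a weight function, so that the MLN partition function becomes a weighted model count. Here is a hedged brute-force sketch of that equivalence on a two-person domain (the tutorial itself uses three people and a compiled circuit; this code and its names are mine):

from itertools import product
from math import exp, isclose

people = range(2)          # tiny domain for enumeration; the slides use {Alice, Bob, Charlie}
w = 3.14                   # MLN weight of: Smokes(x) & Friends(x,y) => Smokes(y)
pairs = [(x, y) for x in people for y in people]

def clause(smokes, friends, x, y):
    return not (smokes[x] and friends[(x, y)]) or smokes[y]

def mln_partition():
    # Z = sum over worlds of exp(w * number of true groundings).
    z = 0.0
    for s in product([False, True], repeat=len(people)):
        for f in product([False, True], repeat=len(pairs)):
            friends = dict(zip(pairs, f))
            z += exp(w * sum(clause(s, friends, x, y) for x, y in pairs))
    return z

def wfomc_of_encoding():
    # Hard sentence: forall x,y, F(x,y) <=> [Smokes(x) & Friends(x,y) => Smokes(y)],
    # with weights F -> exp(3.14), ~F -> 1, and 1 for every other literal.
    total = 0.0
    for s in product([False, True], repeat=len(people)):
        for f in product([False, True], repeat=len(pairs)):
            friends = dict(zip(pairs, f))
            for aux in product([False, True], repeat=len(pairs)):
                f_atom = dict(zip(pairs, aux))
                if all(f_atom[(x, y)] == clause(s, friends, x, y) for x, y in pairs):
                    weight = 1.0
                    for x, y in pairs:
                        weight *= exp(w) if f_atom[(x, y)] else 1.0
                    total += weight
    return total

assert isclose(mln_partition(), wfomc_of_encoding())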

  38. The Full Pipeline. Relational Logic: ∀x,y, F(x,y) ⇔ [ Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y) ], compiled into a First-Order d-DNNF Circuit.

  39. The Full Pipeline. Weight Function: Smokes → 1, ¬Smokes → 1, Friends → 1, ¬Friends → 1, F → exp(3.14), ¬F → 1. Domain: {Alice, Bob, Charlie}. Evaluating the First-Order d-DNNF Circuit gives a Weighted First-Order Model Count of 1479.85.

  40. The Full Pipeline. Weight Function: Smokes → 1, ¬Smokes → 1, Friends → 1, ¬Friends → 1, F → exp(3.14), ¬F → 1. Domain: {Alice, Bob, Charlie}. Weighted First-Order Model Count: 1479.85. Circuit evaluation is polynomial in domain size!
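
The compiled circuit itself is not reproduced in this transcript, so the 1479.85 value cannot be recomputed here, but the claim that evaluation is polynomial in the domain size can be illustrated with the same counting argument as in slides 24–34: for this one-rule MLN the partition function has a closed form whose evaluation cost grows polynomially in n. A hedged Python sketch (my own derivation and names, not the tutorial's circuit), cross-checked against brute force for tiny n:

from itertools import product
from math import comb, exp, isclose

w = 3.14  # weight of: Smokes(x) & Friends(x,y) => Smokes(y)

def lifted_partition(n):
    # Polynomial in n. Pick the number of smokers k; each smoker-to-non-smoker pair
    # contributes (1 + e^w) (friendship present: grounding false, factor 1;
    # friendship absent: grounding true, factor e^w); every other pair contributes 2 * e^w.
    return sum(comb(n, k)
               * (1 + exp(w)) ** (k * (n - k))
               * (2 * exp(w)) ** (n * n - k * (n - k))
               for k in range(n + 1))

def brute_partition(n):
    # Exponential in n: enumerate all Smokes and Friends assignments.
    z = 0.0
    for s in product([False, True], repeat=n):
        for f in product([False, True], repeat=n * n):
            true_groundings = sum(not (s[x] and f[x * n + y]) or s[y]
                                  for x in range(n) for y in range(n))
            z += exp(w * true_groundings)
    return z

for n in range(1, 4):
    assert isclose(lifted_partition(n), brute_partition(n), rel_tol=1e-9)

print(lifted_partition(10))  # instant; brute force would need 2**(10 + 100) worlds
# (for much larger n these floats overflow, so a real implementation would work in log space)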

  41. Assembly Language for Lifted Probabilistic Inference (recap). Computing conditional probabilities with parfactor graphs, Markov logic networks, probabilistic datalog/logic programs, probabilistic databases, and relational Bayesian networks all reduces to weighted (first-order) model counting [VdB-IJCAI11, Gogate-UAI11, VdB-KR14, Gribkoff-UAI14].

  42. Overview: 1. What are statistical relational models? 2. What is lifted inference? 3. How does lifted inference work? 4. Theoretical insights 5. Practical applications

  43. Liftability Framework
  ● Domain-lifted algorithms run in time polynomial in the domain size (~ data complexity).
  ● A class of inference tasks C is liftable iff there exists an algorithm that
    – is domain-lifted and
    – solves all problems in C.
  ● Such an algorithm is complete for C.
  ● Liftability depends on the type of task. [VdB-NIPS11, Jaeger-StarAI12]

  44. Liftable Classes (of model counting problems)

  45–51. Liftable Classes. A Venn diagram of liftable classes, built up one region at a time: monadic [VdB-NIPS11], FO² CNF [VdB-NIPS11], FO² [VdB-KR14], safe monotone CNF [Dalvi-JACM12], and safe type-1 or monotone CNF [Gribkoff-UAI14]; see also [Jaeger-StarAI12, Jaeger-TPLP12].

  52–54. Positive Liftability Result. A figure of two domain objects X and Y: each object has unary properties (Smokes(x), Gender(x), Young(x), Tall(x), and likewise for y), and the pair is linked by binary relations (Friends(x,y), Colleagues(x,y), Family(x,y), Classmates(x,y)).
