Tensor-Based Abduction in Horn Propositional Programs


  1. Tensor-Based Abduction in Horn Propositional Programs
Yaniv Aspis, Krysia Broda, Alessandra Russo
Department of Computing, Imperial College London
{yaniv.aspis17,k.broda,a.russo}@imperial.ac.uk
September 2018

  2. Why Linear Algebra? • AI software is increasingly moving to GPU-based solutions • GPUs are optimised for matrix/tensor multiplication • Highly parallel computation • Goal: develop logical inference algorithms that run on GPUs

  3. Abduction • Logical inference through explanations, given a triple ⟨Q, h, Bc⟩ • Q is a Horn propositional logic program • h is an observation to be explained • Bc is a set of abducible atoms • Find Δ ⊆ Bc such that h ∈ MIN(Q ∪ Δ), the minimal model of Q ∪ Δ • Q ∪ Δ must be consistent (a brute-force sketch of this specification follows)
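To ground the definition before the linear-algebraic machinery, here is a minimal brute-force sketch in Python (all names are mine, not the authors'): it computes MIN(Q ∪ Δ) by naive forward chaining and searches subsets of Bc for consistent explanations of h, using the example program from later slides.

```python
from itertools import chain, combinations

# A Horn program: each rule is (head, frozenset(body)); 'FALSE' stands for bottom,
# so a constraint "<- u" is encoded as the rule FALSE <- u.
Q = [('h', frozenset({'q', 'r'})), ('h', frozenset({'r'})),
     ('h', frozenset({'u'})), ('FALSE', frozenset({'u'}))]

def minimal_model(rules, facts):
    """MIN(Q u Delta): naive forward chaining from the facts in Delta."""
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def abduce(rules, observation, abducibles):
    """All Delta subseteq Bc with observation in MIN(Q u Delta) and Q u Delta consistent."""
    candidates = chain.from_iterable(
        combinations(sorted(abducibles), k) for k in range(len(abducibles) + 1))
    for delta in map(set, candidates):
        model = minimal_model(rules, delta)
        if observation in model and 'FALSE' not in model:
            yield delta

print(list(abduce(Q, 'h', {'h', 'q', 'r', 'u'})))  # {'r'}, {'h'}, {'q','r'}, ... but never {'u'}
```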

  4. Background: Embedding Atoms [1]
Example program Q: h ← q ∧ r, q ← r, r ←, ← h
Atoms (including ⊥ and ⊤) are embedded as one-hot vectors, index order (⊥, ⊤, h, q, r):
w^⊥ = (1,0,0,0,0)^T, w^⊤ = (0,1,0,0,0)^T, w^h = (0,0,1,0,0)^T, w^q = (0,0,0,1,0)^T, w^r = (0,0,0,0,1)^T
An interpretation is embedded as the sum of its atoms' vectors, e.g. w^{h,r} = (0,0,1,0,1)^T
[1] Sakama, C., Inoue, K., Sato, T.: Linear Algebraic Characterization of Logic Programs. In: KSEM 2017. LNCS, vol. 10412, pp. 520–533. Springer (2017)
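A small numpy sketch of this embedding (the index order follows the slide; the helper name `embed` is my own):

```python
import numpy as np

atoms = ['bot', 'top', 'h', 'q', 'r']            # index order (bot, top, h, q, r)
idx = {a: i for i, a in enumerate(atoms)}

def embed(interpretation):
    """Indicator vector w^J of a set of atoms J."""
    w = np.zeros(len(atoms))
    for a in interpretation:
        w[idx[a]] = 1.0
    return w

print(embed({'h', 'r'}))                         # [0. 0. 1. 0. 1.]  =  w^{h,r}
```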

  5. Background: Embedding Programs [1]
The program Q is embedded as a matrix E_Q with one row and one column per atom (order ⊥, ⊤, h, q, r). A rule p ← q_1 ∧ … ∧ q_k contributes 1/k to E_Q[p, q_i]; a fact r ← is treated as r ← ⊤; the constraint ← h becomes ⊥ ← h; each row also keeps its own atom:

          ⊥  ⊤  h   q    r
      ⊥ [ 1  0  1   0    0  ]
      ⊤ [ 1  1  1   1    1  ]
E_Q = h [ 0  0  1  1/2  1/2 ]
      q [ 0  0  0   1    1  ]
      r [ 0  1  0   0    1  ]

Immediate consequence via matrix multiplication:
H_1(E_Q · w^J) = w^K ⇔ K = T_Q(J) ∪ J, where H_1 thresholds elementwise: H_1(y) = 1 if y ≥ 1, 0 if y < 1
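Continuing the sketch, the matrix above and the thresholded product can be written directly (this is a reconstruction of [1]'s scheme, not the authors' code):

```python
import numpy as np

# Rows/columns ordered (bot, top, h, q, r); the reconstruction of E_Q above.
E_Q = np.array([
    [1, 0, 1, 0,   0  ],   # bot: keep bot; constraint <- h
    [1, 1, 1, 1,   1  ],   # top: top holds in any nonempty interpretation
    [0, 0, 1, 0.5, 0.5],   # h:   keep h; h <- q & r contributes 1/2 per body atom
    [0, 0, 0, 1,   1  ],   # q:   keep q; q <- r
    [0, 1, 0, 0,   1  ],   # r:   keep r; the fact r <- is treated as r <- top
])

def H1(y):
    """Elementwise threshold: 1 where y >= 1, else 0."""
    return (y >= 1).astype(float)

w_J = np.array([0, 1, 0, 0, 1.0])   # J = {top, r}
print(H1(E_Q @ w_J))                # [0. 1. 0. 1. 1.]  ->  K = {top, q, r}
```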

  6. Abduction - Main Idea • Invert implications and step backwards from the observation through every rule with a matching head • Program: h ← q ∧ r, h ← r, h ← u • One backward step from h yields the potential solutions: {q, r}, {r}, {u}, {h} (the last option keeps h itself; see the sketch below)
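As a plain-Python illustration of the inversion (before the tensor machinery), one backward step from an observation just collects each rule body whose head matches, plus the option of keeping the atom itself:

```python
rules = {'h': [{'q', 'r'}, {'r'}, {'u'}]}        # h <- q & r,  h <- r,  h <- u

def backward_step(atom):
    """Potential explanations of an atom: each rule body for it, or the atom itself."""
    return rules.get(atom, []) + [{atom}]

print(backward_step('h'))                        # [{'q','r'}, {'r'}, {'u'}, {'h'}]
```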

  7. Tensor Embedding
The inverted rule h → q ∧ r is embedded as a matrix (atom order ⊥, ⊤, h, q, r, u) that removes h, adds q and r in its place, and keeps every other atom:

    ⊥  ⊤  h  q  r  u
⊥ [ 1  0  0  0  0  0 ]
⊤ [ 0  1  0  0  0  0 ]
h [ 0  0  0  0  0  0 ]
q [ 0  0  1  1  0  0 ]
r [ 0  0  1  0  1  0 ]
u [ 0  0  0  0  0  1 ]

  8. One matrix per option for h (atom order ⊥, ⊤, h, q, r, u; a construction sketch follows):

B_::1 (h → q ∧ r):        B_::2 (h → r):
[ 1 0 0 0 0 0 ]           [ 1 0 0 0 0 0 ]
[ 0 1 0 0 0 0 ]           [ 0 1 0 0 0 0 ]
[ 0 0 0 0 0 0 ]           [ 0 0 0 0 0 0 ]
[ 0 0 1 1 0 0 ]           [ 0 0 0 1 0 0 ]
[ 0 0 1 0 1 0 ]           [ 0 0 1 0 1 0 ]
[ 0 0 0 0 0 1 ]           [ 0 0 0 0 0 1 ]

B_::3 (h → u):            B_::4 (keep h):
[ 1 0 0 0 0 0 ]           [ 1 0 0 0 0 0 ]
[ 0 1 0 0 0 0 ]           [ 0 1 0 0 0 0 ]
[ 0 0 0 0 0 0 ]           [ 0 0 1 0 0 0 ]
[ 0 0 0 1 0 0 ]           [ 0 0 0 1 0 0 ]
[ 0 0 0 0 1 0 ]           [ 0 0 0 0 1 0 ]
[ 0 0 1 0 0 1 ]           [ 0 0 0 0 0 1 ]
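A sketch of how these four slices could be assembled into a numpy array (atom order as above; the helper `slice_for` is hypothetical, not the authors' API):

```python
import numpy as np

atoms = ['bot', 'top', 'h', 'q', 'r', 'u']       # index order (bot, top, h, q, r, u)
ix = {a: i for i, a in enumerate(atoms)}

def slice_for(head, body):
    """Frontal slice inverting one rule: consume `head`, add its `body`, keep the rest."""
    S = np.eye(len(atoms))
    if head is not None:
        S[ix[head], ix[head]] = 0                # the head atom is removed...
        for b in body:
            S[ix[b], ix[head]] = 1               # ...and its body atoms take its place
    return S

B = np.stack([slice_for('h', {'q', 'r'}),        # B_::1  h -> q & r
              slice_for('h', {'r'}),             # B_::2  h -> r
              slice_for('h', {'u'}),             # B_::3  h -> u
              slice_for(None, set())],           # B_::4  keep h (identity)
             axis=2)                             # slices stacked along the third mode
print(B.shape)                                   # (6, 6, 4)
```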

  9. Frontal Slices
[Figure: the matrices B_::l are stacked as the frontal slices of a third-order tensor B]

  10. Abductive Step
Multiplying B along its second mode by w^{⊤,h} and thresholding gives one candidate explanation per column:

                      ⊥ [ 0 0 0 0 ]
                      ⊤ [ 1 1 1 1 ]
H_1(B ×₂ w^{⊤,h}) =   h [ 0 0 0 1 ]
                      q [ 1 0 0 0 ]
                      r [ 1 1 0 0 ]
                      u [ 0 0 1 0 ]

⇒ {q, r}, {r}, {u}, {h}
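Reusing B from the sketch after slide 8, one abductive step is a mode-2 product followed by thresholding; np.einsum makes the contraction explicit (function name mine):

```python
import numpy as np

def abductive_step(B, w):
    """H_1(B x_2 w): column l of the result is B[:, :, l] @ w, thresholded at 1."""
    Y = np.einsum('ijl,j->il', B, w)             # contract the second mode with w
    return (Y >= 1).astype(float)

w_obs = np.zeros(6)
w_obs[[1, 2]] = 1.0                              # w^{top, h}
print(abductive_step(B, w_obs))                  # columns: {top,q,r}, {top,r}, {top,u}, {top,h}
```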

  11. Inconsistencies
Program: h ← q ∧ r, h ← r, h ← u, ← u

So far (one candidate per column):
⊥ [ 0 0 0 0 ]
⊤ [ 1 1 1 1 ]
h [ 0 0 0 1 ]
q [ 1 0 0 0 ]
r [ 1 1 0 0 ]
u [ 0 0 1 0 ]

Idea: compute MIN(Q ∪ Δ) for each column

  12. Inconsistencies
Computing MIN(Q ∪ Δ) column by column:

⊥ [ 0 0 0 0 ]          ⊥ [ 0 0 1 0 ]
⊤ [ 1 1 1 1 ]          ⊤ [ 1 1 1 1 ]
h [ 0 0 0 1 ]  ⇒ MIN   h [ 1 1 1 1 ]
q [ 1 0 0 0 ]          q [ 1 0 1 0 ]
r [ 1 1 0 0 ]          r [ 1 1 1 0 ]
u [ 0 0 1 0 ]          u [ 0 0 1 0 ]

  13. Inconsistencies
The third column (Δ = {u}) now contains ⊥: the constraint ← u fires, so Q ∪ {u} is inconsistent!

  14. Inconsistencies
Remove inconsistencies and duplicates, and continue to the Abductive Step… (a filtering sketch follows)
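A sketch of that filter, assuming the columns of M are the candidate interpretations from slide 12 with ⊥ in row 0 (helper name mine):

```python
import numpy as np

def filter_columns(M):
    """Drop inconsistent columns (bot set in row 0) and duplicate columns."""
    consistent = M[:, M[0, :] < 1]               # keep columns where bot = 0
    return np.unique(consistent, axis=1)         # np.unique also removes duplicates

M = np.array([[0, 0, 1, 0],                      # bot: the third column is inconsistent
              [1, 1, 1, 1],                      # top
              [1, 1, 1, 1],                      # h
              [1, 0, 1, 0],                      # q
              [1, 1, 1, 0],                      # r
              [0, 0, 1, 0.]])                    # u
print(filter_columns(M))                         # 3 columns remain, ready for the next step
```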

  15. Abducibles • Δ ⊆ Bc if and only if w^Δ ∘ w^Bc = w^Δ, where ∘ is the elementwise product (Post-Filtering) • If the abducibles have no definitions in Q, then Δ ⊆ Bc implies H_1(B_Q ×₂ w^Δ) = w^Δ after duplicates are removed, so non-explanations can be filtered during the run (see the sketch below)
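The subset test is a single elementwise product; a self-contained sketch (w_Bc is the indicator vector of the abducibles, names mine):

```python
import numpy as np

def subset_of_abducibles(w_delta, w_Bc):
    """Delta subseteq Bc  iff  w^Delta * w^Bc == w^Delta elementwise."""
    return np.array_equal(w_delta * w_Bc, w_delta)

w_Bc = np.zeros(6); w_Bc[[3, 4, 5]] = 1.0        # Bc = {q, r, u}
w_delta = np.zeros(6); w_delta[4] = 1.0          # Delta = {r}
print(subset_of_abducibles(w_delta, w_Bc))       # True
```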

  16. Discussion and Future Work • A proof of correctness has been completed • An unoptimised implementation has been written Future Work: • Clauses with negation • First-order predicate logic • Optimisation and scalability testing

  17. Tensor Multiplication
[Figure: the mode-2 tensor product is computed as an ordinary matrix multiplication by flattening the tensor into a matrix]
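One common way to realise this flattening, verified numerically against the einsum version (the stacking/reshape convention is an assumption on my part, not taken from the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
B = (rng.random((6, 6, 4)) < 0.3).astype(float)  # any third-order tensor works here
w = np.zeros(6); w[[1, 2]] = 1.0                 # w^{top, h}

# Mode-2 product via flattening: stack the frontal slices vertically,
# multiply once, then fold the tall vector back into one column per slice.
flat = np.concatenate([B[:, :, l] for l in range(B.shape[2])], axis=0)  # (24, 6)
Y_flat = (flat @ w).reshape(B.shape[2], B.shape[0]).T                   # (6, 4)
Y_einsum = np.einsum('ijl,j->il', B, w)          # direct mode-2 contraction
print(np.allclose(Y_flat, Y_einsum))             # True
```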

  18. Multiple Definitions
If an atom has several rules, a single matrix row cannot represent them. With h ← q ∧ r and h ← q ∧ s sharing row h = (0, 0, 0, 1/2, 1/2, 1/2) over (⊥, ⊤, h, q, r, s), the product H_1(E_Q · w^J) would wrongly derive h from J = {⊤, r, s}, which satisfies neither body.
Introduce auxiliary variables so that every atom is the head of exactly one rule:
h_1 ← q ∧ r, h ← h_1
h_2 ← q ∧ s, h ← h_2
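A sketch of this standardisation pass, using the rule representation from the first sketch (the aux-naming scheme h_1, h_2, … is mine):

```python
from collections import defaultdict

def standardise(rules):
    """Give every multiply-defined head its own auxiliaries: h_i <- body_i, h <- h_i."""
    by_head = defaultdict(list)
    for head, body in rules:
        by_head[head].append(body)
    out = []
    for head, bodies in by_head.items():
        if len(bodies) == 1:
            out.append((head, bodies[0]))         # already a single definition
        else:
            for i, body in enumerate(bodies, 1):
                aux = f'{head}_{i}'
                out.append((aux, body))               # h_i <- body_i
                out.append((head, frozenset({aux})))  # h   <- h_i
    return out

print(standardise([('h', frozenset({'q', 'r'})), ('h', frozenset({'q', 's'}))]))
# -> h_1 <- q & r,  h <- h_1,  h_2 <- q & s,  h <- h_2
```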
