Chapter 9: Inference in First-Order Logic (20070510)


  1. Inference Rules Involving Quantifiers
     • Substitution: SUBST(θ, α) refers to applying the substitution (or binding list) θ to the sentence α.
       SUBST({x/Sam, y/Pam}, Likes(x, y)) = Likes(Sam, Pam)
     • Inference rules: Modus Ponens, And-Elimination, And-Introduction, Or-Introduction, Resolution,
       plus 3 new inference rules with quantifiers.
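The SUBST operation above can be sketched in Python. The tuple encoding of sentences and the lowercase-first-letter convention for variables are assumptions of this sketch, not notation from the slides:

```python
def is_variable(t):
    # Convention for this sketch: variables are lowercase strings
    return isinstance(t, str) and t[0].islower()

def subst(theta, sentence):
    """Apply the binding list theta to a term or atomic sentence."""
    if is_variable(sentence):
        return theta.get(sentence, sentence)
    if isinstance(sentence, tuple):
        return tuple(subst(theta, arg) for arg in sentence)
    return sentence  # constant symbol

print(subst({"x": "Sam", "y": "Pam"}, ("Likes", "x", "y")))
# ('Likes', 'Sam', 'Pam')
```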

  2. Three New Inference Rules with Quantifiers
     • Universal Instantiation (UI, or Universal Elimination): a universally quantified variable can be replaced with any real instance of that type.
       From ∀v α, infer SUBST({v/g}, α) for any ground term g.
       e.g. From ∀x Likes(x, IceCream), infer Likes(Ben, IceCream).
     • Existential Instantiation (EI, or Existential Elimination): an existentially quantified variable can be replaced with a new symbol (i.e. a Skolem constant: give it a name that has not yet been used).
       From ∃v α, infer SUBST({v/k}, α) for a fresh constant k.
       e.g. From ∃x Likes(x, IceCream), infer Likes(Person1, IceCream).

     Three New Inference Rules with Quantifiers (cont.)
     • Existential Introduction: a real object name can be replaced by an existentially quantified variable.
       From α, infer ∃v SUBST({g/v}, α).
       e.g. From Likes(Jerry, IceCream), infer ∃x Likes(x, IceCream).
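UI and EI can be sketched as follows. The tuple encoding and the `Person<n>` naming scheme for Skolem constants are assumptions of this sketch:

```python
import itertools

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, s):
    if is_variable(s):
        return theta.get(s, s)
    if isinstance(s, tuple):
        return tuple(subst(theta, a) for a in s)
    return s

_skolem = itertools.count(1)  # generator of names not yet used

def universal_instantiation(var, body, ground_term):
    # UI: the universal variable may be replaced by ANY ground term
    return subst({var: ground_term}, body)

def existential_instantiation(var, body):
    # EI: the existential variable is replaced by a FRESH Skolem constant
    return subst({var: f"Person{next(_skolem)}"}, body)

print(universal_instantiation("x", ("Likes", "x", "IceCream"), "Ben"))
# ('Likes', 'Ben', 'IceCream')
print(existential_instantiation("x", ("Likes", "x", "IceCream")))
```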

  3. Reduction to Propositional Inference
     • By instantiating the universal sentences in all ways, the KB can be propositionalized.
       ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
       becomes: King(John) ∧ Greedy(John) ⇒ Evil(John)
                King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
       and the ground facts King(John), Greedy(John), Brother(Richard, John) carry over unchanged.
     • A ground sentence is entailed by the new KB iff it is entailed by the original KB.
     • Every FOL KB can be propositionalized so as to preserve entailment.
     • Idea: propositionalize KB and query, apply resolution, return result.

     Reduction to Propositional Inference (cont.-1)
     • Problem: when the KB includes a function symbol, the set of possible ground-term substitutions is infinite, e.g. Father(Father(Father(John))).
     • Theorem (Herbrand, 1930): if a sentence is entailed by the original FOL KB, there is a proof involving just a finite subset of the propositionalized KB.
     • Idea: For n = 0 to ∞ do
         create a propositional KB by instantiating with depth-n terms;
         see if the sentence α is entailed by this KB.
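The depth-n term generation used by the Herbrand-style loop can be sketched as follows. The tuple encoding is an assumption of this sketch, and only unary function symbols are handled:

```python
def herbrand_terms(constants, functions, depth):
    """Ground terms built from constants and unary function symbols,
    nested up to the given depth."""
    terms = set(constants)
    for _ in range(depth):
        # each pass wraps every existing term in every function symbol
        terms |= {(f, t) for f in functions for t in terms}
    return terms

# depth 2 over {John} with {Father}:
# John, Father(John), Father(Father(John))
for t in sorted(herbrand_terms({"John"}, {"Father"}, 2), key=str):
    print(t)
```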

  4. Reduction to Propositional Inference (cont.-2)
     • Problem: this works if α is entailed, but loops if α is not entailed.
     • Theorem (Turing, 1936; Church, 1936): entailment in FOL is semidecidable, i.e. algorithms exist that say yes to every entailed sentence, but no algorithm exists that also says no to every nonentailed sentence.

     Problems with Propositionalization
     • The propositionalization approach is rather inefficient: it tends to generate lots of irrelevant sentences.
       Given ∀x King(x) ∧ Greedy(x) ⇒ Evil(x), King(John), ∀x Greedy(x), Brother(Richard, John),
       e.g. the following two irrelevant sentences are also generated:
       King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
       Greedy(Richard)
     • With p k-ary predicates and n constants, there are p·n^k instantiations.
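The p·n^k count can be confirmed by brute-force enumeration. The predicate names and constants below are illustrative, not from the slides:

```python
from itertools import product

def ground_instantiations(predicates, constants):
    """All ground atoms formed by filling each k-ary predicate
    with every length-k tuple of constants."""
    return [(name,) + args
            for name, k in predicates
            for args in product(constants, repeat=k)]

# p = 2 binary predicates, n = 3 constants -> p * n**k = 2 * 3**2 = 18
atoms = ground_instantiations([("Knows", 2), ("Sells", 2)],
                              ["John", "Nono", "West"])
print(len(atoms))  # 18
```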

  5. Unification
     • Unify takes two atomic sentences p and q and returns a substitution θ that would make p and q look the same:
       Unify(p, q) = θ where Subst(θ, p) = Subst(θ, q)
       θ is called the unifier of the two sentences.

     Unification (cont.)
     Let our knowledge base contain the following sentences:
       Knows(John, Jane)   Knows(y, Leonid)   Knows(y, Mom(y))   Knows(x, Elizabeth)

       p                 q                     θ
       Knows(John, x)    Knows(John, Jane)     {x/Jane}
       Knows(John, x)    Knows(y, Leonid)      {x/Leonid, y/John}
       Knows(John, x)    Knows(y, Mom(y))      {y/John, x/Mom(John)}
       Knows(John, x)    Knows(x, Elizabeth)   fail
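A recursive unifier covering the table above can be sketched as follows (the tuple encoding and lowercase-variable convention are assumptions of this sketch):

```python
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, s):
    if is_variable(s):
        return subst(theta, theta[s]) if s in theta else s
    if isinstance(s, tuple):
        return tuple(subst(theta, a) for a in s)
    return s

def occurs(var, t, theta):
    # occurs check: does var appear inside t under theta?
    t = subst(theta, t)
    if t == var:
        return True
    return isinstance(t, tuple) and any(occurs(var, a, theta) for a in t)

def unify(p, q, theta=None):
    """Return a most general unifier of p and q extending theta, or None."""
    if theta is None:
        theta = {}
    p, q = subst(theta, p), subst(theta, q)
    if p == q:
        return theta
    if is_variable(p):
        return None if occurs(p, q, theta) else {**theta, p: q}
    if is_variable(q):
        return unify(q, p, theta)
    if isinstance(p, tuple) and isinstance(q, tuple) and len(p) == len(q):
        for a, b in zip(p, q):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

print(unify(("Knows", "John", "x"), ("Knows", "y", ("Mom", "y"))))
# {'y': 'John', 'x': ('Mom', 'John')}
print(unify(("Knows", "John", "x"), ("Knows", "x", "Elizabeth")))
# None  (x cannot be both John and Elizabeth)
```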

  6. Standardize Apart
     • Rename the variables of one or both sentences to avoid name clashes, i.e. to eliminate overlap of variables.

       p                  q                      θ
       Knows(John, x)     Knows(x, Elizabeth)    fail
       Knows(John, x1)    Knows(x2, Elizabeth)   {x1/Elizabeth, x2/John}

     Most General Unifier (MGU)
     • To unify Knows(John, x) and Knows(y, z):
       θ = {y/John, x/z} or {y/John, x/z, w/Freda} or {y/John, x/John, z/John} or ...
       The first unifier is more general than the second.
     • The MGU is the substitution that makes the least commitment about the bindings of variables.
     • There is a single MGU that is unique up to renaming of variables.
       Here the MGU is {y/John, x/z}.
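Standardizing apart can be sketched as a renaming pass that gives every variable a fresh numbered name (the tuple encoding and the numbering scheme are assumptions of this sketch):

```python
import itertools

_counter = itertools.count(1)

def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def standardize_apart(sentence, mapping=None):
    """Rename every variable in the sentence to a name not used before;
    occurrences of the same variable get the same new name."""
    if mapping is None:
        mapping = {}
    if is_variable(sentence):
        if sentence not in mapping:
            mapping[sentence] = f"{sentence}{next(_counter)}"
        return mapping[sentence]
    if isinstance(sentence, tuple):
        return tuple(standardize_apart(a, mapping) for a in sentence)
    return sentence

print(standardize_apart(("Knows", "John", "x")))
```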

  7. Unification Algorithm
     • Occurs check: when matching a variable against a complex term, one must check whether the variable itself occurs inside the term; if it does, the match fails.

     Generalized Modus Ponens (GMP)
     For atomic sentences pi, pi′, and q, where there is a substitution θ such that Subst(θ, pi′) = Subst(θ, pi) for all i:
       p1′, ..., pn′, (p1 ∧ ... ∧ pn ⇒ q)  ⊢  Subst(θ, q)
     e.g. from Owns(Nono, M1), Missile(M1), and
       Owns(Nono, x) ∧ Missile(x) ⇒ Sells(West, x, Nono)
     GMP concludes Sells(West, M1, Nono) with θ = {x/M1}.
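The occurs check alone can be sketched as (tuple encoding of terms and the lowercase-variable convention are assumptions of this sketch):

```python
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def occurs(var, term):
    """True if the variable occurs anywhere inside the term."""
    if term == var:
        return True
    return isinstance(term, tuple) and any(occurs(var, a) for a in term)

# Matching x against F(x) must fail: the binding {x/F(x)} would be circular.
print(occurs("x", ("F", "x")))  # True  -> unification fails
print(occurs("x", ("F", "y")))  # False -> binding {x/F(y)} is fine
```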

  8. Generalized Modus Ponens (cont.)
     • Generalized Modus Ponens is a lifted version of Modus Ponens: it raises Modus Ponens from propositional to first-order logic.
     • Generalized Modus Ponens is sound.
     • Generalized Modus Ponens is complete for Horn databases.
     • Generalized Modus Ponens is efficient.

     Soundness of GMP
     • Need to show that p1′, ..., pn′, (p1 ∧ ... ∧ pn ⇒ q) |= qθ,
       provided that pi′θ = piθ for all i.
     • Lemma: for any sentence p, we have p |= pθ by UI.
       1. (p1 ∧ ... ∧ pn ⇒ q) |= (p1 ∧ ... ∧ pn ⇒ q)θ = (p1θ ∧ ... ∧ pnθ ⇒ qθ)
       2. p1′, ..., pn′ |= p1′ ∧ ... ∧ pn′ |= p1′θ ∧ ... ∧ pn′θ
       From 1 and 2, qθ follows by ordinary Modus Ponens.
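The rule and its soundness argument can also be written compactly in LaTeX notation (same symbols as above):

```latex
% GMP as an inference rule
\frac{p_1',\ \ldots,\ p_n',\quad (p_1 \land \cdots \land p_n \Rightarrow q)}
     {\mathrm{Subst}(\theta, q)}
\quad \text{where } \mathrm{Subst}(\theta, p_i') = \mathrm{Subst}(\theta, p_i)
\ \text{for all } i

% Soundness: by UI,
(p_1 \land \cdots \land p_n \Rightarrow q)
  \models (p_1\theta \land \cdots \land p_n\theta \Rightarrow q\theta)
% and
p_1', \ldots, p_n' \models p_1'\theta \land \cdots \land p_n'\theta
  = p_1\theta \land \cdots \land p_n\theta
% so q\theta follows by ordinary Modus Ponens.
```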

  9. First-Order Definite Clause
     To use Generalized Modus Ponens in inference, each sentence in the KB must be either
     • an atomic sentence, or
     • an implication whose antecedent is a conjunction of positive literals and whose consequent is a single positive literal,
       i.e. a1 ∧ ... ∧ am ⇒ b

     First-Order Definite Clause (cont.)
     • Not every knowledge base can be converted into a set of definite clauses, because of the single-positive-literal restriction.
     • Convert sentences into definite clauses when they are first entered into the knowledge base, using
       - Existential Elimination
       - And-Elimination
       e.g. ∃x Owns(Nono, x) ∧ Missile(x) is converted into two atomic definite clauses:
       Owns(Nono, M1)
       Missile(M1)
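The conversion step in the example can be sketched as EI followed by And-Elimination (tuple encoding and the choice of Skolem name "M1" are assumptions of this sketch):

```python
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, s):
    if is_variable(s):
        return theta.get(s, s)
    if isinstance(s, tuple):
        return tuple(subst(theta, a) for a in s)
    return s

def eliminate_existential(var, conjuncts, skolem):
    """EI replaces the existential variable with a fresh Skolem constant;
    And-Elimination then splits the conjunction into separate clauses."""
    return [subst({var: skolem}, c) for c in conjuncts]

clauses = eliminate_existential("x", [("Owns", "Nono", "x"), ("Missile", "x")], "M1")
print(clauses)  # [('Owns', 'Nono', 'M1'), ('Missile', 'M1')]
```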

  10. Example Knowledge Base
      • The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
      • Prove that Col. West is a criminal.

      Example Knowledge Base (cont.)
      ... it is a crime for an American to sell weapons to hostile nations:
        American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) ⇒ Criminal(x)
      Nono ... has some missiles, i.e. ∃x Owns(Nono, x) ∧ Missile(x):
        Owns(Nono, M1) ∧ Missile(M1)
      ... all of its missiles were sold to it by Colonel West:
        Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
      Missiles are weapons:
        Missile(x) ⇒ Weapon(x)
      An enemy of America counts as "hostile":
        Enemy(x, America) ⇒ Hostile(x)
      West, who is American ...:
        American(West)
      The country Nono, an enemy of America ...:
        Enemy(Nono, America)
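The KB above can be encoded as data, e.g. rules as (conclusion, premises) pairs and facts as ground atoms; this tuple encoding is an assumption of the sketch, not the slides' notation:

```python
# Definite-clause rules: (conclusion, [premises]); variables are lowercase.
RULES = [
    (("Criminal", "x"),
     [("American", "x"), ("Weapon", "y"),
      ("Sells", "x", "y", "z"), ("Hostile", "z")]),
    (("Sells", "West", "x", "Nono"),
     [("Missile", "x"), ("Owns", "Nono", "x")]),
    (("Weapon", "x"), [("Missile", "x")]),
    (("Hostile", "x"), [("Enemy", "x", "America")]),
]
# Ground facts (the existential has already been Skolemized to M1).
FACTS = [
    ("Owns", "Nono", "M1"),
    ("Missile", "M1"),
    ("American", "West"),
    ("Enemy", "Nono", "America"),
]
print(len(RULES), len(FACTS))  # 4 4
```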

  11. Revised Example Proof
      To solve the problem using Generalized Modus Ponens, we put the original knowledge into definite clause form:
      1. American(x) ∧ Weapon(y) ∧ Nation(z) ∧ Hostile(z) ∧ Sells(x, y, z) ⇒ Criminal(x)
      2. Owns(Nono, M1)
      3. Missile(M1)
      4. Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
      5. Missile(x) ⇒ Weapon(x)
      6. Enemy(x, America) ⇒ Hostile(x)
      7. American(West)
      8. Nation(Nono)
      9. Enemy(Nono, America)
      10. Nation(America)
      Query: Criminal(West)?
      MP(3, 5) = 12: Weapon(M1)
      MP(6, 9) = 13: Hostile(Nono)
      MP(2, 3, 4) = 14: Sells(West, M1, Nono)
      MP(7, 12, 8, 13, 14, 1): Criminal(West)

      Inference Chaining
      • Generalized Modus Ponens gives us a natural, intuitive, and reasonably powerful tool for inference.
      • There are two types of inference that differ simply in direction:
      • Forward Chaining starts with new premises and tries to generate all new conclusions.
      • Backward Chaining begins from a desired conclusion and attempts to find the implications and premises required to arrive at it.

  12. Inference Procedures
      • Forward Chaining
        - Data-driven
        - Triggered by adding a new fact
        - Premises ⇒ Consequent(s)
        - Renaming
        - Composition: Subst(Compose(θ1, θ2), p) = Subst(θ2, Subst(θ1, p))
      • Backward Chaining
        - Goal-driven
        - Triggered by a query, i.e. Ask
        - Premises ⇐ Consequent
        - Builds up the unifier as it goes

      Forward Chaining Algorithm
      (The algorithm pseudocode on this slide is a figure and is not reproduced here.)
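Data-driven forward chaining over the crime KB can be sketched as follows. The tuple encoding of atoms, the rule/fact lists, and the simplified unifier (no occurs check or standardize-apart, which is safe here because all facts are ground) are assumptions of this sketch, not the slides' algorithm:

```python
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, s):
    if is_variable(s):
        return subst(theta, theta[s]) if s in theta else s
    if isinstance(s, tuple):
        return tuple(subst(theta, a) for a in s)
    return s

def unify(p, q, theta):
    # Simplified unifier: occurs check omitted (facts here are ground).
    if theta is None:
        return None
    p, q = subst(theta, p), subst(theta, q)
    if p == q:
        return theta
    if is_variable(p):
        return {**theta, p: q}
    if is_variable(q):
        return {**theta, q: p}
    if isinstance(p, tuple) and isinstance(q, tuple) and len(p) == len(q):
        for a, b in zip(p, q):
            theta = unify(a, b, theta)
        return theta
    return None

def match_all(premises, facts, theta):
    """Yield every substitution satisfying all premises against known facts."""
    if not premises:
        yield theta
        return
    for fact in facts:
        t = unify(premises[0], fact, dict(theta))
        if t is not None:
            yield from match_all(premises[1:], facts, t)

def forward_chain(facts, rules, query):
    """Fire every rule whose premises match; repeat to a fixed point,
    then check whether the query matches a derived fact."""
    facts = set(facts)
    while True:
        new = {subst(theta, head)
               for head, body in rules
               for theta in match_all(body, facts, {})} - facts
        if not new:
            return any(unify(query, f, {}) is not None for f in facts)
        facts |= new

RULES = [
    (("Criminal", "x"), [("American", "x"), ("Weapon", "y"),
                         ("Sells", "x", "y", "z"), ("Hostile", "z")]),
    (("Sells", "West", "x", "Nono"), [("Missile", "x"), ("Owns", "Nono", "x")]),
    (("Weapon", "x"), [("Missile", "x")]),
    (("Hostile", "x"), [("Enemy", "x", "America")]),
]
FACTS = [("Owns", "Nono", "M1"), ("Missile", "M1"),
         ("American", "West"), ("Enemy", "Nono", "America")]

print(forward_chain(FACTS, RULES, ("Criminal", "West")))  # True
```

Each pass derives Sells(West, M1, Nono), Weapon(M1), and Hostile(Nono) before the Criminal rule can fire, mirroring the proof steps on the previous slide.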

  13. Forward Chaining Proof / Forward Chaining Proof (cont.-1)
      (These two slides show the step-by-step proof-tree diagrams for the Criminal(West) example; the figures are not reproduced in this extraction.)
