Alan Smaill, Fundamentals of Artificial Intelligence, Nov 13, 2008

Today

Reminder: see Russell and Norvig, chapter 7.
• Syntax: proposition symbols, joined with ¬, ∧, ∨, ⇒, ⇔.
• Semantics: truth values, logical consequence KB |= F
• Propositional Logic ctd
• special formulas:
    valid: true in all interpretations
    satisfiable: true in some interpretations
    contradictory: true in no interpretations
• Inference algorithms

Inference by enumeration

Depth-first enumeration of all models is sound and complete.

function TT-Entails?(KB, α) returns true or false
    symbols ← a list of the proposition symbols in KB and α
    return TT-Check-All(KB, α, symbols, [ ])

function TT-Check-All(KB, α, symbols, model) returns true or false
    if Empty?(symbols) then
        if PL-True?(KB, model) then return PL-True?(α, model)
        else return true
    else do
        P ← First(symbols); rest ← Rest(symbols)
        return TT-Check-All(KB, α, rest, Extend(P, true, model)) and
               TT-Check-All(KB, α, rest, Extend(P, false, model))

Logical equivalence

Two sentences are logically equivalent iff they are true in the same models:
α ≡ β if and only if α |= β and β |= α

(α ∧ β) ≡ (β ∧ α)                      commutativity of ∧
(α ∨ β) ≡ (β ∨ α)                      commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))          associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))          associativity of ∨
¬(¬α) ≡ α                              double-negation elimination
(α ⇒ β) ≡ (¬β ⇒ ¬α)                    contraposition
¬(α ∧ β) ≡ (¬α ∨ ¬β)                   de Morgan
¬(α ∨ β) ≡ (¬α ∧ ¬β)                   de Morgan
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))    distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))    distributivity of ∨ over ∧
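The enumeration algorithm above can be sketched in Python. This is a minimal illustration, not the slides' code: the formula representation (nested tuples such as ("and", p, q), with symbols as strings) is my own choice.

```python
# Sketch of TT-Entails?: depth-first enumeration of all models.
# Sentences are nested tuples: ("not", s), ("and", s1, s2), ("or", s1, s2),
# ("implies", s1, s2), ("iff", s1, s2); proposition symbols are strings.

def pl_true(sentence, model):
    """Evaluate a sentence in a model (dict: symbol -> bool)."""
    if isinstance(sentence, str):
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return all(pl_true(a, model) for a in args)
    if op == "or":
        return any(pl_true(a, model) for a in args)
    if op == "implies":
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator: {op}")

def symbols_in(sentence, acc=None):
    """Collect the proposition symbols occurring in a sentence."""
    acc = set() if acc is None else acc
    if isinstance(sentence, str):
        acc.add(sentence)
    else:
        for a in sentence[1:]:
            symbols_in(a, acc)
    return acc

def tt_entails(kb, alpha):
    """True iff KB |= alpha: alpha holds in every model of KB."""
    def check_all(syms, model):
        if not syms:
            # Only models that satisfy the KB need to satisfy alpha.
            return pl_true(alpha, model) if pl_true(kb, model) else True
        p, rest = syms[0], syms[1:]
        return (check_all(rest, {**model, p: True}) and
                check_all(rest, {**model, p: False}))
    return check_all(sorted(symbols_in(kb) | symbols_in(alpha)), {})

# KB = (P => Q) and P; Q follows, ¬Q does not.
kb = ("and", ("implies", "P", "Q"), "P")
print(tt_entails(kb, "Q"))               # True
print(tt_entails(kb, ("not", "Q")))      # False

# A logical equivalence is mutual entailment, e.g. de Morgan:
lhs = ("not", ("and", "P", "Q"))
rhs = ("or", ("not", "P"), ("not", "Q"))
print(tt_entails(lhs, rhs) and tt_entails(rhs, lhs))  # True
```

As the slides note, this is sound and complete but always enumerates all 2ⁿ models over the n symbols.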

Proof methods

Proof methods divide into (roughly) two kinds:

Application of inference rules
– Legitimate (sound) generation of new sentences from old
– Proof = a sequence of inference rule applications
Can use inference rules as operators in a standard search algorithm.

Model checking
– truth table enumeration (always exponential in n)
– improved backtracking, heuristic search in model space (sound but
  incomplete), e.g., min-conflicts-like hill-climbing algorithms

Forward and backward chaining

Horn Form (restricted): KB = conjunction of Horn clauses (also called
definite clauses).
Horn clause =
♦ proposition symbol; or
♦ (conjunction of symbols) ⇒ symbol
E.g., C ∧ (B ⇒ A) ∧ (C ∧ D ⇒ B)

Modus Ponens (for Horn Form): complete for Horn KBs

    α1, . . . , αn,   α1 ∧ · · · ∧ αn ⇒ β
    ─────────────────────────────────────
                      β

Can be used with forward chaining or backward chaining.
These algorithms are very natural and run in linear time.

Forward chaining

Idea: fire any rule whose premises are satisfied in the KB,
add its conclusion to the KB, until the query is found.

Example KB (drawn as an AND-OR graph on the slides):
    P ⇒ Q
    L ∧ M ⇒ P
    B ∧ L ⇒ M
    A ∧ P ⇒ L
    A ∧ B ⇒ L
    A
    B

Forward chaining algorithm

function PL-FC-Entails?(KB, q) returns true or false
    local variables: count, a table, indexed by clause, initially the number of premises
                     inferred, a table, indexed by symbol, each entry initially false
                     agenda, a list of symbols, initially the symbols known to be true
    while agenda is not empty do
        p ← Pop(agenda)
        unless inferred[p] do
            inferred[p] ← true
            for each Horn clause c in whose premise p appears do
                decrement count[c]
                if count[c] = 0 then do
                    if Head[c] = q then return true
                    Push(Head[c], agenda)
    return false
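A minimal Python sketch of the forward-chaining algorithm, run on the example KB above. The clause representation — a list of (premises, head) pairs, with facts as ([], head) — is my own choice for illustration; the slides' pseudocode checks the query when a head is pushed, while this sketch checks it when a symbol is popped, which also covers the case where q is a known fact.

```python
from collections import deque

def pl_fc_entails(clauses, q):
    """Forward chaining over definite clauses; True iff q is derivable.
    clauses: list of (premises, head); facts have an empty premise list."""
    count = {i: len(prem) for i, (prem, _) in enumerate(clauses)}
    inferred = set()
    # Agenda starts with the symbols known to be true (the facts).
    agenda = deque(head for prem, head in clauses if not prem)
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p not in inferred:
            inferred.add(p)
            for i, (prem, head) in enumerate(clauses):
                if p in prem:
                    count[i] -= 1          # one fewer unsatisfied premise
                    if count[i] == 0:
                        agenda.append(head)  # all premises satisfied: fire
    return False

# The example KB from the slides.
kb = [(["P"], "Q"), (["L", "M"], "P"), (["B", "L"], "M"),
      (["A", "P"], "L"), (["A", "B"], "L"), ([], "A"), ([], "B")]
print(pl_fc_entails(kb, "Q"))  # True
```

Each clause's count is decremented at most once per premise symbol, which is the source of the linear running time the slides mention.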

[Eight "Forward chaining example" slides: the AND-OR graph over A, B, L, M, P, Q, with the premise count on each clause decrementing step by step as A, B, L, M, P, and finally Q are inferred.]

Proof of completeness

FC derives every atomic sentence that is entailed by KB:
1. FC reaches a fixed point where no new atomic sentences are derived.
2. Consider the final state as a model m, assigning true/false to symbols.
3. Every clause in the original KB is true in m.
   Proof: Suppose a clause a1 ∧ . . . ∧ ak ⇒ b is false in m.
   Then a1 ∧ . . . ∧ ak is true in m and b is false in m.
   Therefore the algorithm has not reached a fixed point!
4. Hence m is a model of KB.
5. If KB |= q, then q is true in every model of KB, including m.

Backward chaining

Idea: work backwards from the query q: to prove q by BC,
    check if q is known already, or
    prove by BC all premises of some rule concluding q.
Avoid loops: check if a new subgoal is already on the goal stack.
Avoid repeated work: check if a new subgoal
    1) has already been proved true, or
    2) has already failed.

["Backward chaining example" slides: the same AND-OR graph over A, B, L, M, P, Q, with subgoals expanded top-down starting from Q.]
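The backward-chaining idea can be sketched over the same (premises, head) clause representation used above. This sketch implements only the loop check from the slides (the goal stack); it omits the memoization of already-proved and already-failed subgoals, so it may repeat work.

```python
def pl_bc_entails(clauses, q, stack=frozenset()):
    """Backward chaining: prove q by proving all premises of some
    clause whose head is q. Facts are clauses with no premises."""
    if q in stack:
        return False  # avoid loops: subgoal already on the goal stack
    for prem, head in clauses:
        if head == q and all(pl_bc_entails(clauses, p, stack | {q})
                             for p in prem):
            return True
    return False

kb = [(["P"], "Q"), (["L", "M"], "P"), (["B", "L"], "M"),
      (["A", "P"], "L"), (["A", "B"], "L"), ([], "A"), ([], "B")]
print(pl_bc_entails(kb, "Q"))  # True
```

Note that only clauses relevant to the goal are ever touched, which is why, as the slides say, BC's cost can be much less than linear in the size of the KB.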


Forward vs. backward chaining

FC is data-driven, cf. automatic, unconscious processing,
e.g., object recognition, routine decisions.
May do lots of work that is irrelevant to the goal.

BC is goal-driven, appropriate for problem-solving,
e.g., Where are my keys? How do I get into a PhD programme?
Complexity of BC can be much less than linear in the size of the KB.

Pros and cons of propositional logic

Propositional logic is declarative:
pieces of syntax correspond to facts.
Propositional logic allows partial/disjunctive/negated information
(unlike most data structures and databases).
Propositional logic is compositional:
the meaning of B1,1 ∧ P1,2 is derived from the meaning of B1,1 and of P1,2.
Meaning in propositional logic is context-independent
(unlike natural language, where meaning depends on context).
Propositional logic has very limited expressive power
(unlike natural language).
E.g., we cannot say "pits cause breezes in adjacent squares"
except by writing one sentence for each square.

First-order logic

Whereas propositional logic assumes the world contains facts,
first-order logic (like natural language) assumes the world contains
• Objects: people, houses, numbers, theories, Jacques Chirac, colours, sudoku games, wars, centuries, . . .
• Relations: red, round, bogus, prime, multistoried, brother of, bigger than, inside, part of, has colour, occurred after, owns, comes between, . . .
• Functions: father of, best friend, second innings of, one more than, beginning of, . . .
