RN, Chapter 7.4 – 7.8: Propositional Logic
Logical Agents
- Reasoning [Ch 7 – 7.3]
- Propositional Logic [Ch 7.4 – 7.8]
  - Syntax
  - Semantics
    - Models
    - Entailment
  - Proof Process
    - Forward / Backward chaining
    - Resolution
- Predicate Calculus
  - Representation [Ch 8]
  - Inference [Ch 9]
  - Implemented Systems [Ch 10]
  - Applications [Ch 8.4, 10]
- Planning [Ch 11]
Logic in General
Logics are formal languages for representing information such that conclusions can be drawn.
Components of a Logic
- Syntax defines the sentences in the language ... what does it look like?
- Semantics defines the "meaning" of sentences, i.e., the truth of a sentence in a world. How is it linked to the world?
- Proof process derives "new facts from old" ... finds implicit information by "pushing symbols"

E.g., wrt arithmetic:
- x+2 ≥ y is a sentence; x2+y > is not a sentence
- x+2 ≥ y is true iff the number x+2 is no less than the number y
- x+2 ≥ y is true in a world where x = 7, y = 1
- x+2 ≥ y is false in a world where x = 0, y = 6
Propositional Logic: Syntax
- Atomic propositions ... "basic statements about the world"
  - W3,4 : Wumpus at location [3,4]
  - S1,1 : Stench at location [1,1]
  - ...
- Build sentences from atomic propositions using connectives (¬, &, v, ⇒, ⇔)
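One way to make this syntax concrete: a minimal Python sketch (not from the slides) in which atoms are strings and compound sentences are tuples tagged with their connective. The proposition names S11, W12, W21 are illustrative only.

```python
# Minimal sketch: propositional sentences as nested, tagged tuples.
# Atoms are plain strings; each constructor adds one connective.

def Not(p):        return ("not", p)
def And(p, q):     return ("and", p, q)
def Or(p, q):      return ("or", p, q)
def Implies(p, q): return ("implies", p, q)
def Iff(p, q):     return ("iff", p, q)

# E.g., a Wumpus-style rule "S11 => W12 v W21" (hypothetical names):
rule = Implies("S11", Or("W12", "W21"))
print(rule)   # ('implies', 'S11', ('or', 'W12', 'W21'))
```

Tuples keep the representation inert: a sentence is just data until a semantics (a model plus an evaluation rule) assigns it a truth value.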
Semantics ... Based on Models
- "Model" ≡ "completely specified possible world": every claim is either true or false
- Propositional case: a complete assignment
  E.g., m1 = { A = +, B = 0, C = 0, D = + }
  m1 ⊨ A ... "A is true in m1" ... "m1 is a model of A"
  Also m1 ⊨ D, but m1 ⊭ C
- What about ¬B? A v B? ... A & ¬C & D?
Propositional Logic: Semantics
- Each model specifies { true, false } for each proposition symbol
  E.g., m1 = { A = 0, B = +, C = 0, D = + }
- Rules for evaluating truth wrt model m:
  ¬S      is true iff S is false
  S1 & S2 is true iff S1 is true and S2 is true
  S1 v S2 is true iff S1 is true or S2 is true
  S1 ⇒ S2 is true iff S1 is false or S2 is true
  S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true
Propositional Logic: Semantics (Example)
m = { A = 0, B = 0, C = 0, D = + }
Does m ⊨ A v (¬B & C)?
- True if either m ⊨ A or m ⊨ ¬B & C
- But m ⊭ A, so we need m ⊨ ¬B & C
- True if m ⊨ ¬B and m ⊨ C
- m ⊨ ¬B holds iff m ⊭ B: True ...
- So we need only m ⊨ C: Fails ...
So m ⊭ A v (¬B & C).
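The recursive walkthrough above can be sketched as a small evaluator (my own sketch, assuming the tagged-tuple sentence encoding with atoms as strings; a model is a dict from symbols to True/False):

```python
# Recursive truth evaluation: holds(m, s) decides whether m |= s.

def holds(m, s):
    """Return True iff model m (a dict of symbol -> bool) satisfies s."""
    if isinstance(s, str):                  # atomic proposition: look it up
        return m[s]
    op = s[0]
    if op == "not":     return not holds(m, s[1])
    if op == "and":     return holds(m, s[1]) and holds(m, s[2])
    if op == "or":      return holds(m, s[1]) or holds(m, s[2])
    if op == "implies": return (not holds(m, s[1])) or holds(m, s[2])
    if op == "iff":     return holds(m, s[1]) == holds(m, s[2])
    raise ValueError("unknown connective: %r" % op)

m = {"A": False, "B": False, "C": False, "D": True}
# A v (~B & C): m |/= A; m |= ~B, but m |/= C, so the whole sentence fails.
print(holds(m, ("or", "A", ("and", ("not", "B"), "C"))))   # False
```

The evaluator mirrors the truth rules one-for-one, so each recursive call corresponds to one line of the walkthrough.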
Semantics of Connectives
P Q | ¬P  P&Q  PvQ  P⇒Q  P⇔Q
0 0 |  +   0    0    +    +
0 + |  +   0    +    +    0
+ 0 |  0   0    +    0    0
+ + |  0   +    +    +    +

- Just need & and ¬:
  P v Q  means  ¬(¬P & ¬Q)
  P ⇒ Q  means  ¬P v Q   ... counterintuitive: truth value of "5 is even ⇒ Sam is smart"?
  P ⇔ Q  means  (P ⇒ Q) & (Q ⇒ P)
- "&" is relatively easy, as it expresses complete knowledge
- "v", "¬" are more difficult, as they express partial information
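The reductions above can be confirmed by brute force over all four models of P and Q; a quick sketch (mine, not from the slides):

```python
# Check the "just need & and ~" reductions against the truth table.
from itertools import product

# Truth-table definition of P => Q, row by row:
IMPLIES = {(False, False): True,  (False, True): True,
           (True,  False): False, (True,  True): True}

def equivalences_hold():
    for P, Q in product([False, True], repeat=2):
        impl = IMPLIES[(P, Q)]
        if (P or Q) != (not ((not P) and (not Q))):  # P v Q means ~(~P & ~Q)
            return False
        if impl != ((not P) or Q):                   # P => Q means ~P v Q
            return False
        if (P == Q) != (impl and IMPLIES[(Q, P)]):   # P <=> Q means (P=>Q)&(Q=>P)
            return False
    return True

print(equivalences_hold())   # True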
Models of a Formula
- Initially all 2^n models are possible
- An assertion α ELIMINATES possible worlds
  E.g., ¬A eliminates the models m where m ⊨ A
- M(α) = { m | m ⊨ α } is the set of all models of α
- M(¬A) = { m | m ⊭ A }
Example of Entailment
- Initially, background knowledge: Tell(KB, "S1,2 ⇒ W1,1 v W1,3")
- Alive at start ... Tell(KB, "¬W1,1")
- Smell something ... Tell(KB, "S1,2")
- Is Wumpus @ [1,3]? YES!
- Is Gold @ [4,3]? Don't know!
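The "YES!" can be verified by enumerating all models of the three symbols involved; a sketch (my own encoding, not from the slides; S12, W11, W13 stand for S1,2, W1,1, W1,3):

```python
# Enumerate all 2^3 assignments; the Wumpus-at-[1,3] query holds in
# every model of the KB, so the KB entails it.
from itertools import product

def models():
    for S12, W11, W13 in product([False, True], repeat=3):
        yield {"S12": S12, "W11": W11, "W13": W13}

def kb_holds(m):
    return (((not m["S12"]) or m["W11"] or m["W13"])  # S12 => W11 v W13
            and (not m["W11"])                        # ~W11
            and m["S12"])                             # S12

# W13 is true in every model of the KB, so KB |= W13: Wumpus at [1,3]!
print(all(m["W13"] for m in models() if kb_holds(m)))   # True
```

Here only one of the eight assignments satisfies the KB, and W13 is true in it; "Gold @ [4,3]" is a different symbol the KB says nothing about, so neither it nor its negation is entailed.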
What to Believe?
- Suppose you believe KB, and KB ⊨ α. Then you should believe α!
- Why?
  1. "Believe KB" ⇒ the real world m_RW is in M(KB)
  2. KB ⊨ α means M(KB) ⊆ M(α)
  ⇒ m_RW ∈ M(α) ... m_RW ⊨ α
  I.e., α holds in the Real World, so you should believe it!
Translate Knowledge into Action
- Include LOTS of rules like
  A1,1 & EastA & W2,1 ⇒ ¬Forward
- Observations re the World
- Try to prove one of ... { Forward, TurnLeft, ..., Shoot }
- After a proof KB ⊢ Action, perform Action
Comments on Logic
1. Why reason?
2. Entailment ⊨ vs Inference ⊢
3. Relation to the world ...
4. Succinct representation?
Issue #2: Entailment vs Derivation
- Entailment: KB ⊨ α
  Semantic relation: α MUST hold whenever KB holds
- Derivation: KB ⊢i α
  Computational (syntactic) process: maps 〈KB, α〉 to { Yes, No }
- ⊢i can be arbitrary, but ... we want a ⊢i that corresponds to ⊨!
  E.g., ⊢N(KB, α) = No always
        ⊢A(KB, α) = Yes iff |α| = 1
        ⊢1S(KB, α) = Yes iff there is a 1-step derivation
- GOAL: a ⊢SC that returns all and only the entailments:
  for any KB, α: KB ⊢SC α if-and-only-if KB ⊨ α
Properties of a Derivation Process
- Only one ⊨, but many possible proof procedures ⊢i
- ⊢i is Sound iff ⊢i returns ONLY facts that must be true:
  ∀ KB, ρ: KB ⊢i ρ ⇒ KB ⊨ ρ
- ⊢i is Complete iff ⊢i returns EVERY fact that must be true:
  ∀ KB, ρ: KB ⊨ ρ ⇒ KB ⊢i ρ
- If ⊢ is SOUND + COMPLETE, then ⊢ ≡ ⊨
  ⇒ Computer can IGNORE SEMANTICS and just push symbols!
Tenuous Link to the Real World
- Challenge: the "world" is not in the computer ... only a "representation" of the world
- The computer only has sentences (hopefully about the world)
  ... sensors can provide some grounding
Proof Process
- KB = { φj } ... a SET of information "pieces" φj, called "propositions"
- Any representation will only explicitly include SOME of the true propositions
- A proof process specifies which other propositions to believe:
  an agent that believes KB will also believe the DERIVED propositions,
  written KB ⊢i ρ ... read "derives" (deduces, proves, ...)
- E.g.: { Socrates is a man, All men are mortal } ⊢i Socrates is mortal
Proof Methods
- Model checking ... "truth table enumeration" (sound and complete for propositional logic)
  - Compute the complete truth table over the k variables S1,1, S1,2, ..., W1,1, ..., B1,1, ...
  - Here, ≥ 12 variables ⇒ ≥ 2^12 = 4096 rows
  - Find the subset of rows where KB holds; see if α holds in all of them
- Application of inference rules
  - Generate "legitimate" (sound) new sentences from old
  - Proof = a sequence of inference rule applications
  - Can use inference rules as operators in a standard search algorithm
Example of Model Checking
- KB = (A v C) & (B v ¬C)
- α ≡ A v B
- Is KB ⊨ α?
- KB ⊨ α means α must be true wherever KB is true
- Check all possible models: α is true every time KB is true
- So conclude KB ⊨ α
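The check above can be sketched as a generic model-checking routine (my own sketch, assuming sentences are given as Python predicates over a model dict):

```python
# Model checking by truth-table enumeration: KB |= alpha iff alpha
# holds in every model where KB holds.
from itertools import product

def entails(kb, alpha, symbols):
    for values in product([False, True], repeat=len(symbols)):
        m = dict(zip(symbols, values))
        if kb(m) and not alpha(m):
            return False            # found a model of KB where alpha fails
    return True

kb    = lambda m: (m["A"] or m["C"]) and (m["B"] or not m["C"])
alpha = lambda m: m["A"] or m["B"]
print(entails(kb, alpha, ["A", "B", "C"]))   # True
```

Note the cost: the loop visits all 2^k assignments, which is exactly the expense the next slide complains about.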
Challenges
- Model checking is very expensive!
  ... needs to consider 2^k models ... or ∞!
- The decision about "no pit in [1,2]" does not depend on anything at [3,4],
  ... but ⊢MC still needs to consider a combinatorial set of complete models
- Other inference processes can be more "local"
#2: Applying Inference Rules
- The proof process is a mechanical process, implemented by ...
  applying a sequence of individual inference rules to the initial set of propositions, to find new propositions
- Each rule is sound ...
  (i.e., if you believe the "antecedent", you must believe the conclusion)
- Uses MONOTONICITY:
  If KB1 ⊨ α, then KB1 ∪ KB2 ⊨ α
  ⇒ can just deal with a subset of the propositions
- Search issues ...
  - which inference rule?
  - which propositions?
New Facts from Old: Using Inference Rules
Verify Soundness
- Modus Ponens: from α and α ⇒ β, infer β
- Truth table:
  α β | α ⇒ β
  0 0 |   +
  0 + |   +
  + 0 |   0
  + + |   +
- Consider all worlds where { α, α ⇒ β } holds (only the last row)
- Observe: β holds there as well!
  M(α, α ⇒ β) = M(α, α ⇒ β, β)
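This soundness check is small enough to run exhaustively; a quick sketch (mine, not from the slides):

```python
# Verify Modus Ponens is sound: every model of {alpha, alpha => beta}
# is also a model of beta, i.e. M(alpha, alpha => beta) ⊆ M(beta).
from itertools import product

def modus_ponens_sound():
    for alpha, beta in product([False, True], repeat=2):
        premises_hold = alpha and ((not alpha) or beta)
        if premises_hold and not beta:
            return False          # a counter-model would refute soundness
    return True

print(modus_ponens_sound())   # True
```

The same four-line loop works for any proposed propositional rule: enumerate the models, and the rule is sound iff no model satisfies the premises but falsifies the conclusion.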
(Sound) Inference Rules
Sequence of Inference Steps
1. α & β          (given)
2. α ⇒ γ          (given)
3. β & γ ⇒ δ      (given)
4. α              (&E 1)
5. β              (&E 1)
6. γ              (MP 4,2)
7. β & γ          (&I 5,6)
8. δ              (MP 7,3)
Sequence of Inference Steps (cont.)
- M({1, 2, 3}) = M({1, 2, ..., 8}): exactly the same worlds!!
- So if you believe the FIRST set, you must believe the SECOND!
Answering Queries
- Adding truths (Forward Chaining):
  Given KB0, find KBN
  (if the rules { rij }j are sound, then KB0 ⊨ KBN)
- Answering questions (Backward Chaining):
  Given KB0 and σ, determine if KB0 ⊨ σ
  Requires sound { rij }j s.t. σ ∈ KBN
Forward Chaining
Query: Animal?
KB1:
- Zebra
- Zebra ⇒ Medium
- Zebra ⇒ Striped
- Zebra ⇒ Mammal
- Medium ⇒ NonSmall
- Medium ⇒ NonLarge
- Striped ⇒ NonSolid
- Striped ⇒ NonSpot
- Mammal ⇒ Animal
- Mammal ⇒ Warm
- ...
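A minimal forward-chaining sketch for definite-clause KBs like KB1 (my own encoding, not from the slides: each rule is a (premise list, conclusion) pair, and facts is the set of atoms known to be true):

```python
# Forward chaining: repeatedly fire any rule whose premises are all
# known, adding its conclusion, until nothing new can be derived.

def forward_chain(facts, rules, query):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)    # derived a new truth
                changed = True
    return query in facts

rules = [(["Zebra"], "Medium"), (["Zebra"], "Striped"), (["Zebra"], "Mammal"),
         (["Medium"], "NonSmall"), (["Medium"], "NonLarge"),
         (["Striped"], "NonSolid"), (["Striped"], "NonSpot"),
         (["Mammal"], "Animal"), (["Mammal"], "Warm")]

print(forward_chain(["Zebra"], rules, "Animal"))   # True
```

Chaining runs Zebra ⇒ Mammal ⇒ Animal (along with every other derivable atom); backward chaining would instead start from the query Animal and work back toward known facts.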