

  1. Artificial Intelligence. Larry Holder, School of EECS, Washington State University

  2. } Knowledge base
     ◦ TELL agent about the environment
     } Knowledge representation
     ◦ First-order logic
     ◦ Many others…
     } Reasoning via inference
     ◦ ASK agent how to achieve goal based on current knowledge

  3. function KB-AGENT(percept) returns an action
       persistent: KB, a knowledge base
                   t, a counter, initially 0, indicating time
       TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
       action ← ASK(KB, MAKE-ACTION-QUERY(t))
       TELL(KB, MAKE-ACTION-SENTENCE(action, t))
       t ← t + 1
       return action
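
A minimal Python sketch of this knowledge-based agent loop, assuming a hypothetical KnowledgeBase class with tell/ask methods and string-valued sentence builders; none of these names come from the slides beyond the pseudocode above, and the ask method is a stub where real inference would go.

    # Sketch of the KB-AGENT control loop (assumed helper names).
    class KnowledgeBase:
        def __init__(self):
            self.sentences = []

        def tell(self, sentence):
            self.sentences.append(sentence)

        def ask(self, query):
            # Placeholder: a real KB would run inference over self.sentences.
            return "NoOp"

    def make_percept_sentence(percept, t):
        return f"Percept({percept}, {t})"

    def make_action_query(t):
        return f"∃a Action(a, {t})"

    def make_action_sentence(action, t):
        return f"Action({action}, {t})"

    class KBAgent:
        def __init__(self):
            self.kb = KnowledgeBase()
            self.t = 0                      # time counter, initially 0

        def __call__(self, percept):
            self.kb.tell(make_percept_sentence(percept, self.t))
            action = self.kb.ask(make_action_query(self.t))
            self.kb.tell(make_action_sentence(action, self.t))
            self.t += 1
            return action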

  4. } Goals
     ◦ Visit safe locations
     ◦ Grab gold if present
     ◦ If have gold or no more safe, unvisited locations, then move to [1,1] and Climb

  5. [Wumpus World: 4×4 grid figure]

  6. [Wumpus World: 4×4 grid figure]

  7. [Wumpus World: 4×4 grid figure]

  8. } Percept 1 = [None,None,None,None,None]
     ◦ [1,2] and [2,1] are safe
     } Action = GoForward
     } Percept 2 = [None,Breeze,None,None,None]
     ◦ Either [2,2] or [3,1] (or both) has a pit
     } Execute TurnLeft, TurnLeft, GoForward, TurnRight, GoForward

  9. } Percept 7 = [Stench,None,None,None,None]
     } Wumpus in [1,3]
     ◦ No pit in [2,2] (safe), so pit in [3,1]
     ◦ Could Shoot, but <TurnRight,GoForward> to [2,2]
     } Percept 9 = [None,None,None,None,None]
     } [3,2] and [2,3] are safe
     ◦ <TurnLeft,GoForward> to [2,3]
     } Percept 11 = [Stench,Breeze,Glitter,None,None]
     } Grab gold, head home, and Climb (score: 1000 – 17 = 983)

  10. } A knowledge base (KB) consists of “sentences”
      } Syntax specifies a well-formed sentence
      } Semantics specifies the meaning of a sentence
      } Example
      ◦ Syntax: Wumpus(2,2)
      ◦ Semantics: Wumpus is in location (2,2)

  11. } Logical inference is the process of inferring that one sentence is true from others
      } Inference should be sound, or truth-preserving
      ◦ Everything inferred is actually true
      } Inference should be complete
      ◦ Everything that is true can be inferred

  12. } Propositional logic assumes the world consists of facts that are either true, false, or unknown
      ◦ E.g., Wumpus(1,3) ⇒ Stench(1,2)
      } First-order logic assumes the world consists of facts, objects, and relations that are either true, false, or unknown
      ◦ E.g., Wumpus(x,y) ∧ Adjacent(x,y,w,z) ⇒ Stench(w,z)
      } Temporal logic = FOL where facts hold at particular times
      ◦ E.g., Before(Action(Shoot), Percept(Scream))
      } Higher-order logic assumes the world includes first-order relations as objects
      ◦ E.g., Know( [ Wumpus(x,y) ∧ Adjacent(x,y,w,z) ⇒ Stench(w,z) ] )
      } Probabilistic logic = propositional logic with a degree of belief for each fact
      ◦ E.g., P(Wumpus(1,3)) = 0.067

  13. } Or, First-Order Predicate Calculus (FOPC)
      } Borrowing from elements of natural language
      ◦ Objects: nouns, noun phrases (e.g., wumpus, pit)
      ◦ Relations: verbs, verb phrases (e.g., shoot)
      – Properties: adjectives (e.g., smelly)
      – Functions: map input to single output (e.g., location(wumpus))

  14. Sentence        → AtomicSentence | ComplexSentence
      AtomicSentence  → Predicate | Predicate(Term,...) | Term = Term
      ComplexSentence → (Sentence) | [Sentence]
                      | ¬ Sentence
                      | Sentence ∧ Sentence
                      | Sentence ∨ Sentence
                      | Sentence ⇒ Sentence
                      | Sentence ⇔ Sentence
                      | Quantifier Variable,... Sentence
      Term            → Function(Term,...) | Constant | Variable
      Quantifier      → ∀ | ∃
      Constant        → A | B | Wumpus | 1 | 2 | ...
      Variable        → a | x | s | ...
      Predicate       → True | False | Adjacent | At | Alive | ...
      Function        → Location | RightOf | ...
      Operator precedence: ¬, =, ∧, ∨, ⇒, ⇔
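
One way to make this grammar concrete: a sketch of FOL syntax trees as Python dataclasses. The class names (Constant, Variable, FunctionApp, Atom, Not, Connective, Quantified) are assumptions chosen to mirror the grammar rules above, not anything defined in the slides.

    # Illustrative FOL syntax trees; all names are assumptions for this sketch.
    from dataclasses import dataclass
    from typing import Tuple, Union

    Term = Union["Constant", "Variable", "FunctionApp"]

    @dataclass(frozen=True)
    class Constant:
        name: str                       # e.g., Wumpus, A, 1

    @dataclass(frozen=True)
    class Variable:
        name: str                       # e.g., x, s

    @dataclass(frozen=True)
    class FunctionApp:                  # Function(Term,...), e.g., RightOf(loc)
        function: str
        args: Tuple[Term, ...]

    @dataclass(frozen=True)
    class Atom:                         # Predicate(Term,...)
        predicate: str
        args: Tuple[Term, ...]

    @dataclass(frozen=True)
    class Not:                          # ¬ Sentence
        sentence: "Sentence"

    @dataclass(frozen=True)
    class Connective:                   # ∧, ∨, ⇒, ⇔
        op: str
        left: "Sentence"
        right: "Sentence"

    @dataclass(frozen=True)
    class Quantified:                   # ∀ or ∃ over one or more variables
        quantifier: str
        variables: Tuple[Variable, ...]
        body: "Sentence"

    Sentence = Union[Atom, Not, Connective, Quantified]

    # Structure of a sentence used on a later slide: ∃w,x,y At(w,x,y) ∧ Wumpus(w)
    w, x, y = Variable("w"), Variable("x"), Variable("y")
    example = Quantified("∃", (w, x, y),
        Connective("∧",
            Atom("At", (w, x, y)),
            Atom("Wumpus", (w,))))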

  15. } Not (¬) is a negation
      } Literal is either an atomic sentence (positive literal) or a negated atomic sentence (negative literal)
      ◦ ¬Breeze(1,1), Breeze(2,1)
      } And (∧) is a conjunction; its parts are conjuncts
      } Or (∨) is a disjunction; its parts are disjuncts
      } Implies (⇒) is an implication
      ◦ Pit(2,2) ⇒ Breeze(1,2)
      ◦ Left-hand side is the antecedent or premise
      ◦ Right-hand side is the consequent or conclusion
      } If and only if (⇔) is a biconditional

  16. } How to determine the truth value (true or false) of every sentence
      } True is always true
      } False is always false
      } Truth values of every other sentence must be specified directly or inferred
      ◦ E.g., Wumpus(2,2) is true, Wumpus(3,3) is false

  17. } Semantics for complex sentences
      ◦ ¬P is true iff P is false
      ◦ P ∧ Q is true iff both P and Q are true
      ◦ P ∨ Q is true iff either P or Q (or both) is true
      ◦ P ⇒ Q is true unless P is true and Q is false
      ◦ P ⇔ Q is true iff P and Q are both true or both false

      Truth table:
        P      Q      ¬P     P ∧ Q   P ∨ Q   P ⇒ Q   P ⇔ Q
        false  false  true   false   false   true    true
        false  true   true   false   true    true    false
        true   false  false  false   true    false   false
        true   true   false  true    true    true    true
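
A short Python sketch that evaluates the five connectives and regenerates the truth table above; the helper names implies and iff are assumptions for this sketch.

    # Print the truth table for ¬, ∧, ∨, ⇒, ⇔ over all assignments of P and Q.
    from itertools import product

    def implies(p, q):
        return (not p) or q          # P ⇒ Q is false only when P is true and Q is false

    def iff(p, q):
        return p == q                # P ⇔ Q: both true or both false

    print(f"{'P':<7}{'Q':<7}{'¬P':<7}{'P∧Q':<7}{'P∨Q':<7}{'P⇒Q':<7}{'P⇔Q':<7}")
    for p, q in product([False, True], repeat=2):
        values = [p, q, not p, p and q, p or q, implies(p, q), iff(p, q)]
        print("".join(f"{str(v):<7}" for v in values))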

  18. } Constant symbols stand for objects
      } Predicate symbols stand for relations
      } Function symbols stand for functions
      } R&N convention: above symbols begin with uppercase letters
      ◦ E.g., Wumpus, Adjacent, RightOf
      } Arity is the number of arguments to a predicate or function
      ◦ E.g., Adjacent(loc1, loc2), RightOf(location)

  19. } Terms represent objects with constants, variables, or functions
      } Note: Functions do not return an object, but represent that object
      ◦ E.g., Action(GoForward,t) ∧ Orientation(Agent, Right, t) ∧ At(Agent, loc, t) ⇒ At(Agent, RightOf(loc), t+1)
      } R&N convention: variables begin with lowercase letters

  20. } Express properties of collections of objects
      } Universal quantification (∀)
      ◦ A statement is true for all objects represented by the quantified variables
      ◦ E.g., ∀x,y At(Wumpus,x,y) ⇒ Stench(x+1,y)
      ◦ Same as ∀x,y At(Wumpus,x,y) ∧ Stench(x+1,y) ?
      ◦ Same as ∀x,y ¬At(Wumpus,x,y) ∨ Stench(x+1,y) ?
      } ∀x P(x) ≡ P(A) ∧ P(B) ∧ P(Wumpus) ∧ ...
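
A small Python sketch of the last point, assuming a finite 4×4 domain: ∀ behaves like a conjunction over the domain (Python's all()), and the ⇒ form, not the ∧ form, captures the intended meaning. The toy wumpus and stench facts below are made-up assumptions for illustration only.

    # ∀x,y At(Wumpus,x,y) ⇒ Stench(x+1,y), expanded as a big conjunction.
    wumpus_at = {(1, 3)}                    # assumed At(Wumpus,x,y) facts
    stench_at = {(2, 3)}                    # assumed Stench(x,y) facts

    def implies(p, q):
        return (not p) or q

    domain = [(x, y) for x in range(1, 5) for y in range(1, 5)]

    # With ⇒ : true, because the implication holds vacuously wherever
    # the wumpus is not.
    forall_implies = all(implies((x, y) in wumpus_at, (x + 1, y) in stench_at)
                         for (x, y) in domain)

    # With ∧ : false, because it wrongly demands a wumpus (and a stench)
    # at every location.
    forall_and = all(((x, y) in wumpus_at) and ((x + 1, y) in stench_at)
                     for (x, y) in domain)

    print(forall_implies, forall_and)       # True False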

  21. } Existential quantification (∃)
      ◦ There exists at least one set of objects, represented by the quantified variables, for which a statement is true
      ◦ E.g., ∃w,x,y At(w,x,y) ∧ Wumpus(w)
      ◦ Same as ∃w,x,y At(w,x,y) ⇒ Wumpus(w) ?
      } ∃x P(x) ≡ P(A) ∨ P(B) ∨ P(Wumpus) ∨ ...
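
The counterpart sketch for ∃ over a finite domain: ∃ behaves like a disjunction (Python's any()), and using ⇒ instead of ∧ makes the sentence almost trivially true. The toy facts are assumptions for illustration.

    # ∃w,x,y At(w,x,y) ∧ Wumpus(w), expanded as a big disjunction.
    def implies(p, q):
        return (not p) or q

    objects = ["Wumpus", "Agent", "Gold"]
    at = {("Wumpus", 1, 3), ("Agent", 1, 1), ("Gold", 2, 3)}   # assumed facts
    is_wumpus = {"Wumpus"}

    domain = [(w, x, y) for w in objects
                         for x in range(1, 5) for y in range(1, 5)]

    # With ∧ : true exactly when some object really is a wumpus somewhere.
    exists_and = any((w, x, y) in at and w in is_wumpus for (w, x, y) in domain)

    # With ⇒ : true for the wrong reason, since any (w,x,y) not in `at`
    # satisfies the implication vacuously.
    exists_implies = any(implies((w, x, y) in at, w in is_wumpus)
                         for (w, x, y) in domain)

    print(exists_and, exists_implies)       # True True (the second is vacuous)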

  22. } Nested quantifiers
      } ∀x ∀y same as ∀y ∀x, same as ∀x,y
      } ∃x ∃y same as ∃y ∃x, same as ∃x,y
      } ∃x ∀y same as ∀y ∃x ?
      ◦ ∃x ∀y Likes(x,y) ?
      ◦ ∀y ∃x Likes(x,y) ?
      ◦ ∀x ∃y Likes(x,y) ?
      ◦ ∃y ∀x Likes(x,y) ?

  23. } Negation and quantifiers
      } ∃x P(x) ≡ ¬∀x ¬P(x)
      ◦ “If P is true for some x, then P can’t be false for all x.”
      } ∀x P(x) ≡ ¬∃x ¬P(x)
      ◦ “If P is true for all x, then there can’t be an x for which P is false.”
      } ∀x ¬P(x) ≡ ¬∃x P(x)
      ◦ “If P is false for all x, then there can’t be an x for which P is true.”
      } ¬∀x P(x) ≡ ∃x ¬P(x)
      ◦ “If P is not true for all x, then there must be an x for which P is false.”
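
A brute-force check of these four equivalences over a small finite domain, using all() for ∀ and any() for ∃; the domain and the enumeration of predicates are arbitrary assumptions for this sketch.

    # Verify the quantifier/negation equivalences for every unary predicate
    # over a 3-element domain.
    from itertools import product

    domain = [0, 1, 2]

    def equivalences_hold(P):
        exists = any(P(x) for x in domain)
        forall = all(P(x) for x in domain)
        return (exists == (not all(not P(x) for x in domain)) and    # ∃x P ≡ ¬∀x ¬P
                forall == (not any(not P(x) for x in domain)) and    # ∀x P ≡ ¬∃x ¬P
                all(not P(x) for x in domain) == (not exists) and    # ∀x ¬P ≡ ¬∃x P
                (not forall) == any(not P(x) for x in domain))       # ¬∀x P ≡ ∃x ¬P

    for bits in product([False, True], repeat=len(domain)):
        table = dict(zip(domain, bits))
        assert equivalences_hold(lambda x, table=table: table[x])
    print("all four equivalences hold for every predicate over the domain")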

  24. } Equality symbol (Term1 = Term2) means Term1 and Term2 refer to the same object
      ◦ E.g., RightOf(Location(1,1)) = Location(2,1)
      } Useful for constraining two terms to be different
      } E.g., Sibling
      ◦ Sibling(x,y) ⇔ Parent(p,x) ∧ Parent(p,y)
      ◦ Sibling(x,y) ⇔ Parent(p,x) ∧ Parent(p,y) ∧ ¬(x = y)
      ◦ ∀x,y Sibling(x,y) ⇔ ∃p Parent(p,x) ∧ Parent(p,y) ∧ ¬(x = y)
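
A small sketch of the final Sibling definition over toy Parent facts; the family data is a made-up assumption, and the point is the ¬(x = y) constraint.

    # ∀x,y Sibling(x,y) ⇔ ∃p Parent(p,x) ∧ Parent(p,y) ∧ ¬(x = y)
    parent = {("Pam", "Ann"), ("Pam", "Bob"), ("Tom", "Cal")}   # (parent, child)
    people = {name for pair in parent for name in pair}

    def sibling(x, y):
        return x != y and any((p, x) in parent and (p, y) in parent
                              for p in people)

    print(sibling("Ann", "Bob"))   # True: they share parent Pam
    print(sibling("Ann", "Ann"))   # False: blocked by the ¬(x = y) constraint
    print(sibling("Ann", "Cal"))   # False: no common parent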

  25. } Closed-world assumption
      ◦ Atomic sentences not known to be true are assumed false
      } Unique-names assumption
      ◦ Every constant symbol refers to a distinct object
      } Domain closure
      ◦ If an object is not named by a constant symbol, then it doesn’t exist

  26. } TELL(KB, α)
      ◦ TELL(KB, Percept([st,br,Glitter,bu,sc],5))
      } ASK(KB, β)
      ◦ ASK(KB, ∃a Action(a,5))
      ◦ I.e., does KB entail any particular action at time 5?
      ◦ Answer: Yes, {a/Grab} ← substitution (binding list)
      } ASKVARS(KB, α)
      ◦ Returns answers (variable bindings) that make α true
      ◦ Or, use an Answer literal (later)
      ◦ ASK(KB, ∃a Action(a,5) ∧ Answer(a))

  27. } Percepts
      ◦ Percept(p,t) = predicate that is true if percept p is observed at time t
      ◦ Percept is a list of five terms
      ◦ E.g., Percept([Stench,Breeze,Glitter,None,None],5)
      } Actions
      ◦ GoForward, TurnLeft, TurnRight, Grab, Shoot, Climb
      } ASKVARS(∃a BestAction(a,5)) → {a/Grab}

  28. } “Perception”
      ◦ ∀t,s,g,m,c Percept([s,Breeze,g,m,c],t) ⇒ Breeze(t)
      ◦ ∀t,s,b,m,c Percept([s,b,Glitter,m,c],t) ⇒ Glitter(t)
      } Reflex agent
      ◦ ∀t Glitter(t) ⇒ BestAction(Grab,t)
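
A sketch of these perception and reflex rules applied directly to a percept list [stench, breeze, glitter, bump, scream]; the function and variable names are assumptions for this sketch, not from the slides.

    # Perception rules add Breeze(t)/Glitter(t) facts; the reflex rule
    # maps Glitter(t) to BestAction(Grab,t).
    def perceive(percept, t, facts):
        s, b, g, m, c = percept
        if b == "Breeze":
            facts.add(("Breeze", t))       # Percept([s,Breeze,g,m,c],t) ⇒ Breeze(t)
        if g == "Glitter":
            facts.add(("Glitter", t))      # Percept([s,b,Glitter,m,c],t) ⇒ Glitter(t)

    def best_action(t, facts):
        if ("Glitter", t) in facts:        # Glitter(t) ⇒ BestAction(Grab,t)
            return "Grab"
        return None

    facts = set()
    perceive(["Stench", "Breeze", "Glitter", None, None], 5, facts)
    print(best_action(5, facts))           # Grab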

  29. } Location list term [x,y] (e.g., [1,2])
      ◦ Pit(s) or Pit([x,y])
      ◦ At(Wumpus,[x,y],t)
      ◦ At(Agent,[1,1],1)
      } Definition of Breezy(s), where s is a location
      ◦ ∀s Breezy(s) ⇔ ∃r Adjacent(s,r) ∧ Pit(r)
      } Definition of Adjacent
      ◦ ∀x,y,a,b Adjacent([x,y],[a,b]) ⇔ (x=a ∧ (y=b-1 ∨ y=b+1)) ∨ (y=b ∧ (x=a-1 ∨ x=a+1))
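
A direct Python transcription of the Adjacent and Breezy definitions over a 4×4 grid; the pit locations below are a made-up assumption for illustration.

    # Adjacent([x,y],[a,b]) and Breezy(s) ⇔ ∃r Adjacent(s,r) ∧ Pit(r)
    def adjacent(s, r):
        (x, y), (a, b) = s, r
        return (x == a and (y == b - 1 or y == b + 1)) or \
               (y == b and (x == a - 1 or x == a + 1))

    pits = {(3, 1), (3, 3)}                 # assumed Pit(r) facts
    locations = [(x, y) for x in range(1, 5) for y in range(1, 5)]

    def breezy(s):
        return any(adjacent(s, r) and r in pits for r in locations)

    print(breezy((2, 1)))   # True: adjacent to the pit at (3,1)
    print(breezy((1, 1)))   # False: no adjacent pit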
