Toward proof-theoretic semantics for the deontic cognitive event calculus


  1. Logic and Cognition Workshop ICLA 2019, Delhi March 2, 2019 Toward proof-theoretic semantics for the deontic cognitive event calculus Naveen Sundar Govindarajulu, Selmer Bringsjord Rensselaer AI & Reasoning Lab Rensselaer Polytechnic Institute (RPI) Troy, New York, 12180, USA www.rpi.edu Sponsored by

  2. Overview • Goal: Handle automation of Arrow’s theorem and similar results when applied to aggregation over cognitive states. • Our logic/tool: deontic cognitive event calculus (DCEC) • This talk: proof-theoretic semantics for a fragment of DCEC

  3. Arrow’s Theorem (very briefly) • Without a dictator in sway, • it is impossible for a group of agents to have their individual preferences aggregated to yield preferences for the group as a whole • (with certain other desirable conditions). • First applied to voting over discrete finite choices
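The obstruction behind Arrow's theorem can be seen already in a three-agent Condorcet cycle. A minimal sketch in Python (the agents and rankings here are hypothetical, chosen only to exhibit the cycle; this is an illustration, not part of the talk's formal development):

```python
# Three agents' strict rankings over options a, b, c (hypothetical
# data, chosen only to exhibit the classic Condorcet cycle).
ballots = [
    ["a", "b", "c"],  # agent 1: a > b > c
    ["b", "c", "a"],  # agent 2: b > c > a
    ["c", "a", "b"],  # agent 3: c > a > b
]

def majority_prefers(x, y):
    """True iff a strict majority of agents ranks x above y."""
    wins = sum(1 for b in ballots if b.index(x) < b.index(y))
    return wins > len(ballots) / 2

# Pairwise majority is decisive on every pair, yet cyclic:
# a beats b, b beats c, and c beats a.
print(majority_prefers("a", "b"),
      majority_prefers("b", "c"),
      majority_prefers("c", "a"))  # True True True
```

Pairwise majority settles every comparison, yet the induced group relation is cyclic, so no transitive aggregate ranking exists without privileging one agent, which is the dictator of the theorem.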

  4. Arrow’s Theorem

  5.–8. Arrow’s Theorem (incremental build) • Also applies to judgements of propositions. • Agents speculating on the value of propositions. • In the general case, we have a set of agents supplying propositions that are quite complex and conflicting.

  9. Arrow’s Theorem

  10. Arrow’s Theorem • Goal: Build a benevolent AI dictator that can merge complex beliefs from different agents (beliefs can be about other beliefs, etc.) using DCEC.

  12. Need • A logic that can handle beliefs, knowledge, intentions, obligations, desires and other modalities

  13. Our Tool Deontic Cognitive Event Calculus

  14. DCEC [Diagram: the family of cognitive calculi — CC, µC, CEC, DCEC, DCEC∗, eDCEC∗] The deontic cognitive event calculus is one member of the cognitive calculi family.

  15. Cognitive Calculi: briefly • Cognitive calculi are quantified multi-sorted modal logics.

  16. Why quantified multi-sorted modal logic? [Diagram: reasoning crudely split into intensional (modal) reasoning, e.g. theory-of-mind reasoning, and extensional reasoning, e.g. math, physics, chemistry, …, groceries, driving a car, …]

  17. A Few Applications of Cognitive Calculi
• False-belief task: Arkoudas, K. and Bringsjord, S. “Toward Formalizing Common-sense Psychology: An Analysis of the False-belief Task.” PRICAI 2008: Trends in Artificial Intelligence (2008): 17–29. Expanded as “Propositional Attitudes and Causation,” Int. J. Software & Informatics 3.1 (2009): 47–65.
• Self-awareness/consciousness:
  • Mirror task: Bringsjord, S. and Govindarajulu, N. S. “Toward a Modern Geography of Minds, Machines, and Math.” In Philosophy and Theory of Artificial Intelligence, pp. 151–165. Springer Berlin Heidelberg, 2013.
  • Floridi’s KG4 (earlier: the Wise Man Puzzle, including the infinitized WMP): Bringsjord, S., Licato, J., Govindarajulu, N. S., Ghosh, R. and Sen, A. “Real Robots that Pass Human Tests of Self-consciousness.” RO-MAN 2015, pp. 498–504. IEEE, 2015.
• Moral cognition:
  • Akrasia: Bringsjord, S., Govindarajulu, N. S., Thero, D. and Si, M. “Akratic Robots and the Computational Logic Thereof.” IEEE 2014 International Symposium on Ethics in Engineering, Science, and Technology, p. 7. IEEE Press, 2014.
  • Doctrine of Double Effect: Govindarajulu, N. S. and Bringsjord, S. “On Automating the Doctrine of Double Effect.” IJCAI 2017; and “Beyond the Doctrine of Double Effect: A Formal Model of True Self-Sacrifice.” ICRESS 2017.
  • Virtue ethics: Govindarajulu, N. S., Bringsjord, S., Ghosh, R. and Sarathy, V. “Toward the Engineering of Virtuous Machines.” AAAI/ACM Conference on AI, Ethics, and Society (AIES 2019).

  18. The Doctrine of Double Effect

  19. IJCAI 2017; Autonomy Track

  20. AAAI/ACM Conference on AI, Ethics and Society 2019. Toward the Engineering of Virtuous Machines. Naveen Sundar Govindarajulu, Vasanth Sarathy, Selmer Bringsjord and Rikhiya Ghosh. Rensselaer AI & Reasoning Lab, Rensselaer Polytechnic Institute (RPI), Troy, New York, 12180, USA (www.rpi.edu); Human Robot Interaction Laboratory, Tufts University, Medford, MA, 02155, USA (www.tufts.edu).

  21. The Formalization (Overview)
(Q1) Virtuous Person: Vₙ(s) ↔ ∃≥n a : Exemplar(s, a)
(Q2) Virtue: Gₙ(τ) ↔ ∃≥n a : Trait(τ, a)
(R1) Admiration in DCEC: holds(admires(a, b, α), t), with a ≠ b
(R2) An inference schema [I_Trait] deriving Trait(τ, a) from the agent’s believed, positively valued actions (ν(action(b, α), t′) > 0)
(R3) Learning a trait: LearnTrait(l, ⟨σ, α⟩, t) → (σ → happens(action(l, α), t))
Exemplar definition: Exemplar(e, l) ↔ ∃!n t . ∃α . holds(admires(l, e, α), t)
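Read extensionally over a finite record, (Q1) and the Exemplar definition amount to counting. A small sketch, assuming admiration events are given as a table and approximating the counting quantifiers with explicit thresholds (all names, data, and thresholds here are hypothetical, not from the paper):

```python
# Hypothetical record: (learner, exemplar) -> set of times at which
# the learner admires some action of the exemplar.
admires = {
    ("l1", "sage"): {1, 2, 3},
    ("l2", "sage"): {2, 5},
    ("l3", "rogue"): {4},
}

def exemplar(e, l, min_times=2):
    """Exemplar(e, l): l admires an action of e at >= min_times times."""
    return len(admires.get((l, e), set())) >= min_times

def virtuous(s, n=2):
    """V_n(s): at least n agents have s as an exemplar."""
    learners = {l for (l, e) in admires if e == s}
    return sum(1 for l in learners if exemplar(s, l)) >= n

print(virtuous("sage"))   # True: both l1 and l2 have sage as exemplar
print(virtuous("rogue"))  # False: only l3 admires rogue, and only once
```

The actual formalization is intensional (the counting happens inside belief contexts via DCEC operators); this sketch only shows the arithmetic skeleton of the definitions.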

  22. DCEC briefly: Syntax

Sorts:
S ::= Agent | ActionType | Action ⊑ Event | Moment | Fluent

Function signatures:
f ::= action : Agent × ActionType → Action
    | initially : Fluent → Formula
    | holds : Fluent × Moment → Formula
    | happens : Event × Moment → Formula
    | clipped : Moment × Fluent × Moment → Formula
    | initiates : Event × Fluent × Moment → Formula
    | terminates : Event × Fluent × Moment → Formula
    | prior : Moment × Moment → Formula

Terms:
t ::= x : S | c : S | f(t₁, …, tₙ)

Formulas:
φ ::= q : Formula | ¬φ | φ ∧ ψ | φ ∨ ψ | ∀x : φ(x)
    | P(a, t, φ) | K(a, t, φ) | C(t, φ) | S(a, b, t, φ) | S(a, t, φ) | B(a, t, φ)
    | D(a, t, φ) | I(a, t, φ)
    | O(a, t, φ, (¬)happens(action(a∗, α), t′))
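The grammar above is straightforward to represent as an abstract syntax tree. A minimal sketch, assuming a simplified untyped encoding (the dataclass names and the string rendering are our own choices, not part of DCEC):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Term:
    """A DCEC term: a variable/constant or a function application."""
    name: str
    args: Tuple["Term", ...] = ()
    def __str__(self):
        return self.name if not self.args else \
            f"{self.name}({','.join(map(str, self.args))})"

@dataclass(frozen=True)
class Atom:
    """An atomic formula, e.g. holds(f, t)."""
    pred: str
    args: Tuple[Term, ...]
    def __str__(self):
        return f"{self.pred}({','.join(map(str, self.args))})"

@dataclass(frozen=True)
class Modal:
    """A modal formula: op is one of P, K, C, S, B, D, I, O."""
    op: str
    args: Tuple[object, ...]   # e.g. (agent, time, formula)
    def __str__(self):
        return f"{self.op}({','.join(map(str, self.args))})"

# B(robert, t1, holds(wealthy(host), t0)), as on the inference slide:
host, robert = Term("host"), Term("robert")
phi = Atom("holds", (Term("wealthy", (host,)), Term("t0")))
b = Modal("B", (robert, Term("t1"), phi))
print(b)  # B(robert,t1,holds(wealthy(host),t0))
```

Because the modal operators take formulas as arguments, nesting belief inside belief is just tree nesting; no possible-worlds machinery is needed to write the syntax down.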

  23. DCEC briefly: Sorts
Agent: human and non-human actors.
Time: the Time type stands for time in the domain; can be simple, such as tᵢ, or complex, such as birthday(son(jack)).
Event: used for events in the domain.
ActionType: action types are abstract actions; they are instantiated at particular times by actors. Example: eating.
Action: a subtype of Event, for events that occur as actions by agents.
Fluent: used for representing states of the world in the event calculus.
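The sort table doubles as a signature for sort-checking terms. A short sketch, assuming sorts are plain strings and only the Action ⊑ Event subsort relation (the function signatures follow the syntax slide; the checker itself is our own simplification):

```python
# Function signatures from the DCEC syntax slide: name -> (params, result).
SIGNATURES = {
    "action":    (("Agent", "ActionType"), "Action"),
    "holds":     (("Fluent", "Moment"), "Formula"),
    "happens":   (("Event", "Moment"), "Formula"),
    "initiates": (("Event", "Fluent", "Moment"), "Formula"),
}
SUBSORT = {"Action": "Event"}  # Action is a subtype of Event

def has_sort(s, expected):
    """A sort matches the expected sort directly or via the subsort map."""
    return s == expected or SUBSORT.get(s) == expected

def result_sort(fn, arg_sorts):
    """Return the result sort of applying fn, or raise if ill-sorted."""
    params, out = SIGNATURES[fn]
    if len(arg_sorts) != len(params) or \
       not all(has_sort(s, p) for s, p in zip(arg_sorts, params)):
        raise TypeError(f"ill-sorted application of {fn}")
    return out

# action(jack, eating) : Action, usable where an Event is expected:
a = result_sort("action", ("Agent", "ActionType"))
print(result_sort("happens", (a, "Moment")))  # Formula
```

The subsort lookup is what lets happens accept an action term, mirroring Action ⊑ Event in the grammar.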

  24. • Jones intends to convince Smith to believe that Jones believes that were the cat, lying in the foyer now, to be let out, it would settle, dozing, on the mat.

  25.–28. The same sentence in DCEC (incremental build):
I(j, C(s, B(s, B(j, ι[c : in(c, ι(f : Foyer(f))), m : mat(m)] (out(c) →subj doze(c, m))))))
Here I, C, and B are intensional operators; ι[…] is a scoped term; →subj is a subjunctive conditional.

  29. Example automated inference
assume Premise 1: C(t₀, ∀a, t. happens(display(wealth, a), t) ⇒ holds(wealthy(a), t))
assume Premise 2: C(t₀, P(robert, t₀, happens(display(wealth, host), t₀)))
G1: B(robert, t₁, holds(wealthy(host), t₀))  ⊢CC from {Premise 1, Premise 2}
G2: B(host, t₂, B(robert, t₁, holds(wealthy(host), t₀)))  ⊢CC from {Premise 1, Premise 2}
G3: B(robert, t₃, B(host, t₂, B(robert, t₁, holds(wealthy(host), t₀))))  ⊢CC from {Premise 1, Premise 2}
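The derivation pattern on this slide alternates belief operators between the two agents, one nesting level per step. A minimal sketch that only builds the nested formulas as strings (it is not the actual DCEC prover; the timestamps are assigned mechanically for illustration):

```python
def believes(agent, t, phi):
    """Render B(agent, t_k, phi) as a string."""
    return f"B({agent},t{t},{phi})"

def iterate(agents, base, depth):
    """Alternate B-operators over `agents` around a base formula,
    innermost first, as in the G1..G3 goals on the slide."""
    phi = base
    for k in range(1, depth + 1):
        phi = believes(agents[(k - 1) % len(agents)], k, phi)
    return phi

base = "holds(wealthy(host),t0)"
print(iterate(["robert", "host"], base, 1))
# B(robert,t1,holds(wealthy(host),t0))
print(iterate(["robert", "host"], base, 3))
# B(robert,t3,B(host,t2,B(robert,t1,holds(wealthy(host),t0))))
```

Common knowledge of the premises is what licenses each further wrapping step, which is why the derivation can in principle continue to any depth.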

  30. Semantics?

  31. Semantics? • Possible-worlds semantics is not attractive for us for a number of reasons (we are okay with possible worlds being used for necessity/possibility)
