  1. A Cognitive Framework for Delegation to an Assistive User Agent
     Karen Myers and Neil Yorke-Smith
     Artificial Intelligence Center, SRI International

  2. Overview
     - CALO: a learning cognitive assistant
     - User delegation of tasks to CALO
     - Delegative BDI agent framework
     - Goal adoption and commitments
     - Summary and research issues

  3. CALO: Cognitive Assistant that Learns and Organizes
     [Diagram: track execution of project tasks; help manage time and commitments; perform tasks in collaboration with the user]
     CALO supports a high-level knowledge worker:
     - Understands the "office world", your projects and schedule
     - Performs delegated tasks on your behalf
     - Works with you to complete tasks
     - Stays with you (and learns) over long periods of time
     - Learns to anticipate and fulfill your needs
     - Learns your preferred way of working

  4. CALO Year 2

  5. Overview
     - CALO: a learning cognitive assistant
     - User delegation of tasks to CALO
     - Delegative BDI agent framework
     - Goal adoption and commitments
     - Summary and research issues

  6. Delegation May Lead to Conflicts
     - Focus on delegation of tasks from the user to CALO
       - Not on tasks to be performed in collaboration
       - One aspect of CALO's role as intelligent assistant
     - CALO cannot act if there are conflicts over its actions
     - Conflicts in tasks
       - "Purchase this computer on my behalf"
       - "Register me for the Fall Symposium"
     - Conflicts in guidance
       - "Always ask for permissions by email"
       - "Never use email for sensitive purchases"

  7. Conflicts in User's Desires
     - "I wish to be thin"
     - "I wish to eat chocolate"
     - But Richard Waldinger's scotch mocha brownies are full of calories
       ⇒ conflict between incompatible desires
     - The user's desires conflict with each other
     - Humans seem to have no problem with such conflicts
     - CALO must recognize and respond appropriately

  8. Other Types of Conflicts
     - Current and new commitments
       - Currently CALO is undertaking tasks to:
         - Purchase an item of computer equipment
         - Register the user for a conference
       - Now the user tasks CALO to register for a second conference
       - The set of new goals is logically consistent and coherent
       - But it is infeasible because of insufficient discretionary funds
     - Commitments and advice
       - The user tasks CALO to schedule a visitor's seminar in the best conference room
       - Existing advice: "Never change a booking in the auditorium without consulting me"
       - The new goal and the existing advice are inconsistent

  9. The BDI Framework
     - CALO's ability to act is based on the BDI framework
       - Beliefs = informational attitudes about the world
       - Desires = motivational attitudes on what to do
       - Intentions = deliberative commitments to act
     - Realized in the SPARK agent system
       - Hierarchical, procedural reasoning framework
     - BDI components in SPARK are represented as:
       - Facts (beliefs)
       - Intentions (goals/intentions)
       - Desires are not represented
       - Procedures are plans to achieve intentions
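
To make the representation above concrete, here is a minimal Python sketch (not SPARK's actual API; every class and field name is ours) of an agent state that holds facts, intentions, and procedures, with desires deliberately absent, mirroring the slide:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass(frozen=True)
    class Fact:
        """A belief: an informational attitude about the world."""
        predicate: str
        args: tuple = ()

    @dataclass
    class Intention:
        """A deliberative commitment to act toward a goal."""
        goal: str
        plan: Optional[str] = None   # name of the procedure chosen to achieve the goal

    @dataclass
    class AgentState:
        beliefs: set = field(default_factory=set)       # Facts
        intentions: list = field(default_factory=list)  # Intentions (goals/intentions)
        procedures: dict = field(default_factory=dict)  # goal name -> list of plan steps
        # Desires are not represented, mirroring SPARK

    state = AgentState()
    state.beliefs.add(Fact("discretionary_funds", (1200,)))
    state.procedures["purchase_laptop"] = ["get_quote", "obtain_approval", "place_order"]
    state.intentions.append(Intention(goal="purchase_laptop", plan="purchase_laptop"))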

  10. Desires vs. Goals
      - Both are motivational attitudes
      - Desires may be neither coherent (with beliefs) nor consistent (with each other)
      - Goals must be both
      - Desires are 'wishes'; goals are 'wants'
        - "I wish to be thin and I wish to eat chocolate"
        - "I want to have another of Richard's brownies"
      - Desires lead to goals
        - CALO's primary desire: satisfy its user
        - Secondary desires → goals to do what the user asks

  11. 'BDI' Agents are Really 'BGI'
      - Decision theory emphasizes B and D
      - AI agent theory emphasizes B and I
      - In most BDI literature, 'Desires' and 'Goals' are conflated
      - In practice, the focus is on:
        - goal and then intention selection
        - option generation, and plan execution and scheduling
      - The focus has been much less on:
        - deliberating over desires
        - goal generation (vital for CALO)
        - advisability

  12. The Problem with BGI
      - When Desires and Goals are unified into a single motivational attitude:
        - Can't support conflicting D/G (and D/B)
        - Hard to express goal generation
        - Hard to diagnose and resolve conflicts
          - between D/G and I, and between G, I, and plans
        - Hard to handle conflicts in advice
      - How can CALO make sense of the user's taskings in order to act upon them?
      - How can CALO recognize and respond to (potential) conflicts?

  13. Overview
      - CALO: a learning cognitive assistant
      - User delegation of tasks to CALO
      - Delegative BDI agent framework
      - Goal adoption and commitments
      - Summary and research issues

  14. Cognitive Models for Delegation
      [Diagram: user and agent mental states side by side. The agent's beliefs (B_agent) are kept in alignment with the user's beliefs (B_user); the agent's desire (D_agent) is to satisfy the user by doing the assigned tasks (D_user). Delegated tasks plus decision making yield the candidate goals G_C, and goal adoption refines these into the adopted goals G_A.]

  15. Delegative BDI Agent Architecture
      [Diagram: beliefs B and desires D give rise to candidate goals G_C; under the user's goal advice A_G, and subject to conflicts, candidates are adopted as goals G_A, which become intentions I; intentions are executed under the user's execution advice A_E, with sub-goaling, and execution failures feed back into revision of the mental state.]
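
A hedged sketch of the data flow this architecture suggests, in Python: desires generate candidate goals, goal advice filters which candidates are adopted, adopted goals become intentions, and execution advice gates execution, with failures fed back for revision. The function names and dict shapes are illustrative assumptions, not part of CALO or SPARK.

    def delegative_cycle(desires, beliefs, goal_advice, execution_advice, plan_for, run):
        """One pass through the delegative pipeline: D -> G_C -> G_A -> I -> execution."""
        candidate_goals = [g for d in desires for g in d["generates"]]           # D to G_C
        adopted_goals = [g for g in candidate_goals if goal_advice(g, beliefs)]  # G_C to G_A
        intentions = [plan_for(g) for g in adopted_goals]                        # G_A to I
        failures = []
        for intention in intentions:
            if not execution_advice(intention):
                failures.append(intention)   # advice blocks execution: flag for revision
            elif not run(intention):         # execute (sub-goaling happens inside run)
                failures.append(intention)   # execution failure: feeds back into revision
        return adopted_goals, failures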

  16. Overview
      - CALO: a learning cognitive assistant
      - User delegation of tasks to CALO
      - Delegative BDI agent framework
      - Goal adoption and commitments
      - Summary and research issues

  17. Requirements on Goal Adoption
      - Self-consistency: G_A must be mutually consistent
      - Coherence: G_A must be mutually consistent relative to the current beliefs B
      - Feasibility: G_A must be mutually satisfiable relative to current intentions I and available plans
        - Includes resource feasibility
      - Reasonableness: G_A should be mutually 'reasonable' with respect to current B and I
        - Common-sense check: did you really mean to purchase a second laptop computer today?
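
A hedged sketch of how these four checks could be operationalized in Python; the Goal representation (a cost plus an explicit exclusion set) and all function names are our assumptions, not the paper's formalism.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Goal:
        name: str
        cost: float = 0.0                  # estimated resource cost
        excludes: frozenset = frozenset()  # names of goals/facts it is inconsistent with

    def self_consistent(adopted):
        # Self-consistency: no adopted goal excludes another adopted goal
        return not any(h.name in g.excludes for g in adopted for h in adopted)

    def coherent(adopted, beliefs):
        # Coherence: no adopted goal is excluded by a current belief (beliefs given as names)
        return not any(b in g.excludes for g in adopted for b in beliefs)

    def feasible(adopted, plans, budget):
        # Feasibility: every goal has an available plan and the combined cost fits the budget
        return all(g.name in plans for g in adopted) and sum(g.cost for g in adopted) <= budget

    def reasonable(new_goal, adopted):
        # Reasonableness: a crude common-sense check against apparent duplicate taskings
        return all(g.name != new_goal.name for g in adopted)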

  18. Responding to Conflicting Desires
      - The goal adoption process should admit:
        - Adopting, suspending, or rejecting candidate goals
        - Modifying adopted goals and/or intentions
        - Modifying beliefs (by acting to change the world state)
      - Example: the user desires to attend a conference in Europe but lacks sufficient discretionary funds
        - Shorten a previously scheduled trip
        - Cancel the planned purchase of a new laptop
        - Or apply for a travel grant from the department

  19. Combined Commitment Deliberation
      - Goal adoption
        - Adopted Goals ≠ Candidate Goals (≠ Desires)
      - Intention reconsideration
      - Extended agent life-cycle
        - Non-adopted candidate goals
        - Execution problems with adopted goals
      - We propose a combined commitment deliberation mechanism
        - Based on the agent's deliberation over its mental states
        - Bounded rationality: as far as the agent believes and can compute

  20. BDI Control Cycle
      [Cycle diagram: observe world-state changes → identify changes to the mental state → decide on a response (commitment deliberation) → perform actions]
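
A minimal sketch of this cycle as a Python loop, assuming sense, deliberate, and execute callbacks and a dict-based mental state purely for illustration:

    def control_cycle(state, sense, deliberate, execute):
        """Abstract BDI control loop: observe, update the mental state, deliberate, act."""
        while True:
            events = sense()                     # world state changes
            state["beliefs"].update(events)      # identify changes to the mental state
            state, actions = deliberate(state)   # commitment deliberation: decide on a response
            for action in actions:               # perform actions
                execute(action)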

  21. Mental State Transitions
      (commitment deliberation sits at the "decide" step of the observe-decide-act cycle)
      - Current mental state S = (B, G_C, G_A, I)
        - D is omitted, assuming a single "satisfy the user" desire
      - The outcome of deliberation is a new state S'
      - Possible new transitions:
        - Expansion: adopt an additional goal; no modification to existing goals or intentions
        - Revocation: drop an adopted goal and its intention, to enable a different goal in the future
        - Proactive: create a new candidate goal and adopt it, to enable a current candidate goal in the future
      - Plus the standard BGI transitions, e.g. dropping an intention due to plan failure
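
The three new transition types could be captured as below; the mental-state representation (a dict of sets keyed B, G_C, G_A, I) and the one-intention-per-adopted-goal assumption are illustrative only.

    from enum import Enum, auto

    class Transition(Enum):
        EXPANSION = auto()    # adopt an additional candidate goal; nothing else changes
        REVOCATION = auto()   # drop an adopted goal and its intention, freeing resources
        PROACTIVE = auto()    # create a new candidate goal and adopt it immediately

    def apply_transition(state, kind, goal):
        """Return the new mental state S' after applying one transition for the given goal."""
        if kind is Transition.EXPANSION:
            state["G_A"].add(goal)       # the goal is assumed to already be in G_C
        elif kind is Transition.REVOCATION:
            state["G_A"].discard(goal)
            state["I"].discard(goal)     # assumes one intention per adopted goal
        elif kind is Transition.PROACTIVE:
            state["G_C"].add(goal)
            state["G_A"].add(goal)
        return state

    S = {"B": set(), "G_C": {"attend_AAAI"}, "G_A": set(), "I": set()}
    S_prime = apply_transition(S, Transition.EXPANSION, "attend_AAAI")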

  22. Goal and Intention Attributes
      Goals:
      - User-specified value/utility (can be time-varying)
      - User-specified priority
      - User-specified deadline
      - Estimated cost to achieve
      - Level of commitment, for adopted goals:
        - Estimated cost to complete
        - Estimated probability of success
      Intentions:
      - Implied value/utility
      - Cost of change:
        - Deliberative effort
        - Loss of utility
        - Delay
      - Level of effort so far (e.g. estimated % complete)
      - Level of commitment so far
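
Transcribing the two attribute lists into dataclasses makes the distinction concrete; the field names (and the idea of tracking them in Python at all) are ours, not the paper's.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GoalAttributes:
        value: float                      # user-specified value/utility (may vary over time)
        priority: int                     # user-specified priority
        deadline: Optional[float] = None  # user-specified deadline
        est_cost: float = 0.0             # estimated cost to achieve
        # level of commitment, tracked once the goal is adopted:
        est_cost_to_complete: Optional[float] = None
        est_prob_success: Optional[float] = None

    @dataclass
    class IntentionAttributes:
        implied_value: float              # utility implied by the goal it serves
        change_cost: float = 0.0          # cost of change: deliberative effort, lost utility, delay
        effort_so_far: float = 0.0        # e.g. estimated fraction complete
        commitment_so_far: float = 0.0    # level of commitment so far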

  23. Making the Best Decision
      - The S → S' transition as multi-criteria optimization
        - Maximize (minimize) some combination of criteria over S
        - Can be simple or complex
        - Bounded rationality
        - Simple default strategy, customizable by the user
      - Advice acts as constraints ⇒ a constrained (soft) multi-criteria optimization problem
        - "Don't drop any intention > 70% complete"
      - An assistive agent can consult the user if there is no clear best S'
        - "Should I give up on purchasing a laptop, in order to satisfy your decision to travel to both conferences?"
      - Learn and refine a model of the user's preferences
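
A hedged sketch of this decision step in Python: score each candidate successor state on a weighted combination of criteria, enforce advice as a hard constraint, and fall back to consulting the user when no clear winner emerges. The weights, the margin, and the dict fields are assumptions, not the paper's strategy.

    WEIGHTS = {"utility": 1.0, "change_cost": -0.5}   # assumed default strategy, user-customizable

    def respects_advice(option):
        # Advice as a constraint: "Don't drop any intention > 70% complete"
        return all(i["complete"] <= 0.70 for i in option["dropped_intentions"])

    def score(option):
        return sum(WEIGHTS[criterion] * option[criterion] for criterion in WEIGHTS)

    def choose(options, margin=0.5):
        admissible = [o for o in options if respects_advice(o)]
        ranked = sorted(admissible, key=score, reverse=True)
        if not ranked:
            return None   # nothing admissible: consult the user
        if len(ranked) > 1 and score(ranked[0]) - score(ranked[1]) < margin:
            return None   # no clear best S': consult the user
        return ranked[0]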

  24. Example
      - Candidate goals:
        - c1: "Purchase a laptop"
        - c2: "Attend AAAI"
      - Adopted goals and intentions:
        - g1 with intention i1: "Purchase a high-end laptop using general funds"
        - g2 with intention i2: "Attend AAAI and its workshops, staying in conference hotel"
      - New candidate goal from the user:
        - c3: "Attend AAMAS" (high priority)
      - Mental state S = (B, {c1, c2, c3}, {g1, g2}, {i1, i2})

  25. Example (cont.)
      - CALO finds it cannot adopt c3: {g1, g2, g3} would suffer resource contention, since general funds are insufficient
      - Options include:
        1. Do not adopt c3 (don't attend AAMAS)
        2. Drop c1 or c2 (laptop purchase or AAAI attendance)
        3. Modify g2 to attend only the main AAAI conference
           - But changing i2 incurs a financial penalty
        4. Adopt a new candidate goal c4: apply for a departmental travel grant
      - Advice prohibits option 2
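
To make the example concrete, here is the option set above pushed through the same constrained scoring as the slide-23 sketch, inlined so the snippet stands alone. Every number (utilities, change costs, the 80% completion level that makes option 2 fall to the advice) is invented for illustration; the slides do not say which option wins.

    # Illustrative numbers only; none of these utilities or costs come from the paper.
    options = [
        {"name": "1: do not adopt c3 (skip AAMAS)",      "utility": 4.0, "change_cost": 0.0,
         "dropped_intentions": []},
        {"name": "2: drop c1 or c2",                     "utility": 6.0, "change_cost": 2.0,
         "dropped_intentions": [{"complete": 0.80}]},    # assumed >70% complete, so advice-barred
        {"name": "3: modify g2 to main AAAI only",       "utility": 6.5, "change_cost": 1.5,
         "dropped_intentions": []},                      # i2 change penalty folded into change_cost
        {"name": "4: adopt c4, apply for travel grant",  "utility": 7.0, "change_cost": 1.0,
         "dropped_intentions": []},
    ]

    # Filter by the advice constraint, then maximize utility minus weighted change cost.
    admissible = [o for o in options
                  if all(i["complete"] <= 0.70 for i in o["dropped_intentions"])]
    best = max(admissible, key=lambda o: o["utility"] - 0.5 * o["change_cost"])
    print(best["name"])   # with these made-up numbers, option 4 wins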
