
A Cognitive Framework for Delegation to an Assistive User Agent
Karen Myers and Neil Yorke-Smith
Artificial Intelligence Center, SRI International

Overview

- CALO: a learning cognitive assistant
- User delegation of tasks to CALO
- Delegative BDI agent framework
- Goal adoption and commitments
- Summary and research issues

CALO: Cognitive Assistant that Learns and Organizes

[Figure: CALO helps manage time and commitments, tracks execution of project tasks, and performs tasks in collaboration with the user.]

CALO supports a high-level knowledge worker:

- Understands the “office world”, your projects and schedule
- Performs delegated tasks on your behalf
- Works with you to complete tasks
- Stays with you (and learns) over long periods of time
  - Learns to anticipate and fulfill your needs
  - Learns your preferred way of working

(CALO Year 2)

Delegation May Lead to Conflicts

- Focus on delegation of tasks from user to CALO
  - Not on tasks to be performed in collaboration
  - One aspect of CALO’s role as intelligent assistant
- CALO cannot act if there are conflicts over actions
- Conflicts in tasks:
  - “purchase this computer on my behalf”
  - “register me for the Fall Symposium”
- Conflicts in guidance:
  - “always ask for permissions by email”
  - “never use email for sensitive purchases”

Conflicts in User’s Desires

- “I wish to be thin”
- “I wish to eat chocolate”
  - But Richard Waldinger’s scotch mocha brownies are full of calories
  ⇒ conflict between incompatible desires
- User’s desires conflict with each other
- Humans seem to have no problem with such conflicts
- CALO must recognize and respond appropriately

Other Types of Conflicts

- Current and new commitments
  - Currently CALO is undertaking tasks to:
    - Purchase an item of computer equipment
    - Register the user for a conference
  - Now the user tasks CALO to register for a second conference
  - The set of new goals is logically consistent and coherent
  - But infeasible because of insufficient discretionary funds
- Commitments and advice
  - User tasks CALO to schedule a visitor’s seminar in the best conference room
  - Existing advice: “Never change a booking in the auditorium without consulting me”
  - The new goal and the existing advice are inconsistent (a minimal check is sketched below)
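
As a rough illustration of detecting that kind of goal/advice conflict, here is a minimal sketch; the Goal and Advice structures and the overlap test are hypothetical simplifications, not CALO's representation.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    description: str
    requires: set        # actions/resources the goal would need

@dataclass
class Advice:
    description: str
    forbids: set         # actions the advice prohibits without consultation

def conflicting_advice(goal: Goal, advice_list: list) -> list:
    """Return the advice items whose prohibitions overlap the goal's needs."""
    return [a for a in advice_list if goal.requires & a.forbids]

# Hypothetical encoding of the seminar-scheduling example
seminar = Goal("schedule the visitor's seminar in the best conference room",
               requires={"rebook_auditorium"})
standing = [Advice("never change a booking in the auditorium without consulting me",
                   forbids={"rebook_auditorium"})]

if conflicting_advice(seminar, standing):
    print("Conflict with standing advice: consult the user before adopting")
```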

The BDI Framework

- CALO’s ability to act is based on the BDI framework
  - Beliefs = informational attitudes about the world
  - Desires = motivational attitudes on what to do
  - Intentions = deliberative commitments to act
- Realized in the SPARK agent system
  - Hierarchical, procedural reasoning framework
  - BDI components in SPARK are represented as follows (sketch below):
    - Facts (beliefs)
    - Intentions (goals/intentions)
    - Desires are not represented
    - Procedures are plans to achieve intentions
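
A minimal sketch of how such mental-state components might be held in code; the class and field names are illustrative only, not SPARK's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Procedure:
    """A plan: how to achieve a goal, as a sequence of step names."""
    achieves: str
    steps: list

@dataclass
class MentalState:
    """Illustrative BDI-style state mirroring SPARK's split into facts,
    intentions, and procedures; desires are not stored."""
    facts: set = field(default_factory=set)          # beliefs
    intentions: list = field(default_factory=list)   # goals being pursued
    procedures: list = field(default_factory=list)   # plans available

state = MentalState(
    facts={("discretionary_funds", 1200)},
    intentions=["register_for_fall_symposium"],
    procedures=[Procedure("register_for_fall_symposium",
                          ["fill_form", "request_approval", "pay_fee"])],
)
print(state.intentions)
```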

Desires vs. Goals

- Both are motivational attitudes
- Desires may be neither coherent (with beliefs) nor consistent (with each other)
  - Goals must be both
- Desires are ‘wishes’; goals are ‘wants’
  - “I wish to be thin and I wish to eat chocolate”
  - “I want to have another of Richard’s brownies”
- Desires lead to goals
- CALO’s primary desire: satisfy its user
  - Secondary desires → goals to do what the user asks

‘BDI’ Agents are Really ‘BGI’

- Decision theory emphasizes B and D
- AI agent theory emphasizes B and I
- In most BDI literature, ‘Desires’ and ‘Goals’ are confounded
- In practice, focus is on:
  - goal and then intention selection
  - option generation, and plan execution and scheduling
- Focus has been much less on:
  - deliberating over desires
  - goal generation
  - advisability
  (all vital for CALO)

The Problem with BGI

- When Desires and Goals are unified into a single motivational attitude:
  - Can’t support conflicting D/G (and D/B)
  - Hard to express goal generation
  - Hard to diagnose and resolve conflicts
    - Between D/G and I, and between G, I, and plans
  - Hard to handle conflicts in advice
- How can CALO make sense of the user’s taskings in order to act upon them?
- How can CALO recognize and respond to (potential) conflicts?

Cognitive Models for Delegation

[Diagram: on the user side, beliefs (B_user) and desires (D_user) feed decision making that yields the user’s goals (G_user). Tasks are delegated to the agent, whose beliefs (B_agent) and single desire to satisfy all assigned tasks (D_agent) generate candidate goals (GC_agent); goal adoption and refinement produce the adopted goals (GA), kept in alignment with the user.]

Delegative BDI Agent Architecture

[Diagram: the user supplies goals plus goal advice (AG) and execution advice (AE). Inside the agent, beliefs B, the desire D, and the advice feed the candidate goals GC; goal adoption yields the adopted goals GA, which give rise to intentions I that are executed. Sub-goaling, execution failures, and conflicts feed back into revision of goals and intentions. A code-level sketch of this layering follows.]
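
To make the data flow concrete, here is a hedged sketch of the delegative layering (candidate goals, adopted goals, intentions, and the two kinds of advice); all names are assumptions for illustration, not SPARK's or CALO's API.

```python
from dataclasses import dataclass, field

@dataclass
class DelegativeState:
    """Illustrative delegative BDI state: beliefs plus two goal layers,
    intentions, and the user's standing advice."""
    beliefs: set = field(default_factory=set)             # B
    candidate_goals: list = field(default_factory=list)   # GC (from user taskings)
    adopted_goals: list = field(default_factory=list)     # GA (committed to)
    intentions: list = field(default_factory=list)        # I (plans in execution)
    goal_advice: list = field(default_factory=list)       # AG
    execution_advice: list = field(default_factory=list)  # AE

def delegate(state: DelegativeState, tasking: str) -> None:
    """A user tasking becomes a candidate goal; adoption is deliberated later."""
    state.candidate_goals.append(tasking)

def adopt(state: DelegativeState, accepts) -> None:
    """Promote candidate goals that pass the (caller-supplied) adoption test."""
    for g in state.candidate_goals:
        if g not in state.adopted_goals and accepts(g, state):
            state.adopted_goals.append(g)

state = DelegativeState(goal_advice=["never rebook the auditorium unasked"])
delegate(state, "schedule the visitor's seminar")
adopt(state, accepts=lambda g, s: True)   # placeholder adoption test
print(state.adopted_goals)
```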

Requirements on Goal Adoption

- Self-consistency: GA must be mutually consistent
- Coherence: GA must be consistent relative to the current beliefs B
- Feasibility: GA must be mutually satisfiable relative to current intentions I and available plans
  - Includes resource feasibility
- Reasonableness: GA should be mutually ‘reasonable’ with respect to current B and I
  - Common-sense check: did you really mean to purchase a second laptop computer today?

(The four requirements are sketched in code below.)
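
A hedged sketch of how the four adoption requirements could be checked; the predicate arguments (consistent, satisfiable, reasonable) stand in for domain reasoning that is not specified here, and the demo stubs are invented.

```python
def can_adopt(candidate, adopted_goals, beliefs, intentions, plans,
              consistent, satisfiable, reasonable):
    """Illustrative goal-adoption test over the set GA plus one candidate."""
    proposed = adopted_goals + [candidate]

    # Self-consistency: the adopted goals must be mutually consistent
    if not consistent(proposed):
        return False, "self-inconsistent"
    # Coherence: consistent relative to the current beliefs B
    if not consistent(proposed, beliefs):
        return False, "incoherent with beliefs"
    # Feasibility: mutually satisfiable given intentions I and available plans
    # (includes resource feasibility)
    if not satisfiable(proposed, intentions, plans):
        return False, "infeasible"
    # Reasonableness: common-sense check against current B and I
    if not reasonable(proposed, beliefs, intentions):
        return False, "unreasonable; confirm with the user"
    return True, "adopt"

# Trivial stand-in predicates, for demonstration only
demo = dict(consistent=lambda goals, beliefs=None: True,
            satisfiable=lambda goals, intentions, plans: len(goals) <= 2,
            reasonable=lambda goals, beliefs, intentions: True)

print(can_adopt("attend AAMAS", ["purchase laptop", "attend AAAI"],
                beliefs=set(), intentions=[], plans=[], **demo))
# -> (False, 'infeasible'): three goals exceed the toy resource limit
```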

Responding to Conflicting Desires

- Goal adoption process should admit:
  - Adopting, suspending, or rejecting candidate goals
  - Modifying adopted goals and/or intentions
  - Modifying beliefs (by acting to change world state)
- Example: the user desires to attend a conference in Europe but lacks sufficient discretionary funds
  - shorten a previously scheduled trip,
  - cancel the planned purchase of a new laptop,
  - or apply for a travel grant from the department

Combined Commitment Deliberation

- Goal adoption
  - Adopted Goals ⊆ Candidate Goals (⊆ Desires)
- Intention reconsideration
  - Extended agent life-cycle
  - Non-adopted Candidate Goals
  - Execution problems with Adopted Goals
- Propose a combined commitment deliberation mechanism
  - Based on the agent’s deliberation over its mental states
  - Bounded rationality: as far as the agent believes and can compute

BDI Control Cycle

[Diagram: the observe-decide-act control cycle extended with commitment deliberation. World state changes are observed; the agent identifies changes to its mental state, decides on a response (the commitment deliberation step), and performs actions. A minimal loop sketch follows.]
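
A bare-bones reading of that cycle in code, for orientation only; observe, deliberate_commitments, and act are placeholders for the components discussed in this talk, not real APIs.

```python
def control_cycle(state, observe, deliberate_commitments, act, max_steps=100):
    """Illustrative observe-decide-act loop with a commitment-deliberation step.
    Every argument after `state` is a caller-supplied function."""
    for _ in range(max_steps):
        changes = observe()                              # world state changes
        state = deliberate_commitments(state, changes)   # decide on a response
        act(state)                                       # perform actions
    return state

# Trivial demonstration with no-op components
final = control_cycle(state={"beliefs": set()},
                      observe=lambda: [],
                      deliberate_commitments=lambda s, ch: s,
                      act=lambda s: None,
                      max_steps=3)
```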

Mental State Transitions

- Current mental state S = (B, GC, GA, I)
  - Omit D, since we suppose a single “satisfy the user” desire
- Outcome of deliberation is a new state S'
- Possible new transitions (sketched below):
  - Expansion: adopt an additional goal
    - No modification to existing goals or intentions
  - Revocation: drop an adopted goal and its intention
    - To enable a different goal in the future
  - Proactive: create a new candidate goal and adopt it
    - To enable a current candidate goal in the future
- Plus standard BGI transitions
  - E.g. drop an intention due to plan failure
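
For concreteness, a hedged sketch of the three new transition types over a state tuple (B, GC, GA, I); the representation is invented for illustration and is not the paper's formalism.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class S:
    """Mental state (B, GC, GA, I), with D fixed to 'satisfy the user'."""
    B: frozenset = frozenset()
    GC: tuple = ()   # candidate goals
    GA: tuple = ()   # adopted goals
    I: tuple = ()    # intentions

def expansion(s: S, goal, intention) -> S:
    """Adopt an additional goal; existing goals and intentions are untouched."""
    return replace(s, GA=s.GA + (goal,), I=s.I + (intention,))

def revocation(s: S, goal, intention) -> S:
    """Drop an adopted goal and its intention, to enable a different goal later."""
    return replace(s,
                   GA=tuple(g for g in s.GA if g != goal),
                   I=tuple(i for i in s.I if i != intention))

def proactive(s: S, new_goal, intention) -> S:
    """Create a new candidate goal and adopt it, enabling a current candidate."""
    return replace(s, GC=s.GC + (new_goal,),
                   GA=s.GA + (new_goal,), I=s.I + (intention,))

s0 = S(GC=("c1", "c2", "c3"), GA=("g1", "g2"), I=("i1", "i2"))
s1 = proactive(s0, "c4: apply for a travel grant", "i4")
print(s1.GC, s1.GA)
```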

Goal and Intention Attributes

Goals:
- User-specified value/utility
  - Can be time-varying
- User-specified priority
- User-specified deadline
- Estimated cost to achieve
- Level of commitment (for adopted goals)

Intentions:
- Implied value/utility
- Cost of change
  - Deliberative effort
  - Loss of utility
  - Delay
- Level of commitment so far
- Level of effort so far
  - E.g. estimated % complete
- Estimated cost to complete
- Estimated probability of success

(An illustrative encoding of these attributes follows.)
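
One possible encoding of these attributes, purely for illustration; the field names are assumptions, not the paper's schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GoalAttrs:
    """User-facing attributes of a goal."""
    utility: float                   # user-specified value/utility (may vary over time)
    priority: int                    # user-specified priority
    deadline: Optional[str] = None   # user-specified deadline
    est_cost_to_achieve: float = 0.0
    commitment: float = 0.0          # level of commitment (for adopted goals)

@dataclass
class IntentionAttrs:
    """Execution-facing attributes of an intention."""
    implied_utility: float           # value inherited from the goal it serves
    change_cost: float               # deliberative effort + loss of utility + delay
    commitment_so_far: float = 0.0
    effort_so_far: float = 0.0       # e.g. estimated % complete
    est_cost_to_complete: float = 0.0
    est_prob_success: float = 1.0

attend_aamas = GoalAttrs(utility=0.8, priority=1, deadline="before registration closes")
```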

Making the Best Decision

- S → S' transition as multi-criteria optimization
  - Maximize (minimize) some combination of criteria over S
  - Can be simple or complex
- Advice acts as constraints → a constrained (soft) multi-criteria optimization problem (sketch below)
  - “Don’t drop any intention > 70% complete”
- Assistive agent can consult the user if there is no clear best S'
  - Bounded rationality
  - Simple default strategy, customizable by the user
  - “Should I give up on purchasing a laptop, in order to satisfy your decision to travel to both conferences?”
- Learn and refine a model of the user’s preferences
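
A toy version of that choice, to fix ideas: candidate S' options are scored on weighted criteria, hard advice constraints filter them, and a near-tie triggers a question to the user. The weights, criteria, and tie margin are all invented for this example.

```python
def choose_next_state(candidates, criteria, weights, advice_constraints,
                      tie_margin=1e-6):
    """Pick the best candidate S' under weighted criteria and advice constraints.
    `criteria` maps a name to a scoring function; `advice_constraints` are
    predicates every acceptable candidate must satisfy (hard advice)."""
    feasible = [c for c in candidates
                if all(ok(c) for ok in advice_constraints)]
    if not feasible:
        return None, "no candidate satisfies the advice; consult the user"

    def score(c):
        return sum(weights[name] * fn(c) for name, fn in criteria.items())

    ranked = sorted(feasible, key=score, reverse=True)
    if len(ranked) > 1 and abs(score(ranked[0]) - score(ranked[1])) <= tie_margin:
        return None, "tie between top candidates; consult the user"
    return ranked[0], "selected"

# Invented numbers loosely following the talk's example (options 3 and 4)
candidates = [
    {"name": "modify g2: main AAAI conference only", "utility": 0.6, "change_cost": 0.2},
    {"name": "adopt c4: apply for a travel grant",   "utility": 0.7, "change_cost": 0.3},
]
criteria = {"utility": lambda c: c["utility"],
            "change_cost": lambda c: -c["change_cost"]}
weights = {"utility": 1.0, "change_cost": 1.0}
advice = [lambda c: "drop" not in c["name"]]   # e.g. advice forbids dropping goals

print(choose_next_state(candidates, criteria, weights, advice))
# -> (None, 'tie between top candidates; consult the user')
```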

Example

- Candidate goals:
  - c1: “Purchase a laptop”
  - c2: “Attend AAAI”
- Adopted goals and intentions:
  - g1 with intention i1: “Purchase a high-end laptop using general funds”
  - g2 with intention i2: “Attend AAAI and its workshops, staying in the conference hotel”
- New candidate goal from the user:
  - c3: “Attend AAMAS” (high priority)
- Mental state S = (B, {c1, c2, c3}, {g1, g2}, {i1, i2})

Example (cont.)

- CALO finds it cannot adopt c3
  - {g1, g2, g3} resource contention: insufficient general funds
- Options include:
  1. Do not adopt c3 (don’t attend AAMAS)
  2. Drop c1 or c2 (laptop purchase or AAAI attendance)
  3. Modify g2 to attend only the main AAAI conference
     - But changing i2 incurs a financial penalty
  4. Adopt a new candidate goal c4 to apply for a departmental travel grant
- Advice prohibits option 2

Example (cont.)

- CALO builds an optimization problem and solves it
  - The problem constructed and the solution method employed both depend on the agent’s nature
    - E.g. ignore % of intention completed
    - No more than 10 ms to solve
- Finds the best is a tie between options 3 and 4
  - Agent’s strategy (based on user guidance) is to consult the user over which to do
  - User instructs CALO to do both options
- New mental state S' = (B', {c1, c2, c3, c4}, {g1, g'2, g3, g4}, {i1, i'2})

Summary

- CALO acts as the user’s intelligent assistant
  - Classical BDI framework inadequate
  - Implemented BDI systems lack formal grounding
- Proposed delegative BDI agent framework
  - Separate Desires and Goals
  - Separate Candidate and Adopted Goals
  - Incorporate user guidance and preferences
  - Combined commitment deliberation for goal adoption and intention reconsideration
  - Enables reasoning necessary for an agent such as CALO
- Implemented by extending the SPARK agent framework

Related Work

- BOID framework [Broersen et al]
  - Different types of agents based on B/D/G/I conflict resolution strategies
- BDGICTL logic [Dastani et al]
  - Merging desires into goals
- Intention reconsideration [Schut et al]
- Collaborative problem solving [Levesque and Cohen; Allen and Ferguson]
- Social norms and obligations [Dignum et al]

Future Work

- Extend goal reasoning to consider resource feasibility (in progress)
- Proactive goal anticipation and adoption
- Collaborative human-CALO problem solving
  - Beyond (merely) completing user-delegated tasks
- Multi-CALO coordination and teamwork
- Learning as part of CALO’s extended life-cycle

More information: http://calo.sri.com/