Robotics: Life-like Character



Ch 11. Shallow and Inner Forms of Emotional Intelligence in Advisory Dialog Simulation
Soft Computing Laboratory
황금성
ABSTRACT
• A conversational agent
  – Aspiring to be believable in an emotional domain
  – Should be able to combine its rational and emotional intelligence
• We claim
  – A cognitive model of emotion activation may contribute to this
  – By providing the knowledge to be employed in modeling emotions
• We show
  – An XML markup language
  – Ensuring independence between the agent's body and mind
  – Adapting the dialog to the user characteristics
• Example domain
  – The eating disorder domain
1. Introduction
• There is certainly a link between
  – The type of character
  – The application domain to which it applies
  – The category of users to which it is addressed
• Ex) Advisory dialogue (about eating habits)
  – In the case of children
    · A cartoon suggesting correct behavior in some domain
    · A fun illustration of the effects of healthy/unhealthy eating
  – In the case of adults
    · The psychological problems which go with eating disorders require a different form of believability
    · Give the impression of being expert and trustworthy, of understanding the reasons of the interlocutors, and of adapting to their needs
    · We will focus our analysis on how emotions arising during the dialog might influence its dynamics
Research Of Believable Conversation
• The problem of simulating dialogs with humans, studied by computational linguists [Wilks'99]
• The Turing test already envisioned a computer dialog in the scope of a game - computer science [Turing'50]
• The ability to show clear personality traits [Walker'97, Loyall'97, Castelfranchi'98]
• To recognize doubt [Carberry'02]
• To be polite [Ardissono'99]
• To introduce a bit of humor [Stock'02]
• To persuade with irrational arguments [Sillince'91]
• Showing consistency between inner and outward aspects of behavior [Marsella'03]
• A human-like character rather than a cartoon in medical applications [Marsella'03]
• Flexible behavior [Prendinger'03]
2. Advisory Dialogs
• Show an appropriate behavior
  – By providing relevant information and suggestions
  – By persuading the user to follow them
• Dialogs may be emotional
  – When the affective state of the user is influenced by the information received
  – When the expert reacts to the users' answers by showing an empathic attitude
• Asymmetry: the expert has to provide information while also asking questions
  – Can be reduced by enabling users to drive information provision toward their needs
• To behave believably
  – The agent should show some form of emotional intelligence
    · Recognizing and expressing emotions
    · Regulating them
    · Utilizing them to optimize the dialog
2. Advisory Dialogs (Cont.)
• We developed an emotion modeling method and tool
  – How the agent reacts to the user's moves
    · Emotion triggering and decay
    · Simulating a regulatory mechanism
  – Our system
    · Expresses emotions through face, gesture, and speech
    · The most shallow form of emotional intelligence
• How the dialog is affected by the emotional state
  – By some personality traits
  – By their relationship [Pelachaud'02]
  – Research: emotions influence learning, decision making, and memory [Picard'97]
2. Advisory Dialogs (Cont.)
• Problem issues in simulating affective dialogs
  – Which personality factors may affect the dialog
  – Which emotions the agent may feel
  – How an emotion influences the course of the dialog
  – How an emotion affects the way that a particular goal is rendered
• 3 types of relationships
  – Friendship
  – Animosity
  – Empathy
• Our advisory dialog simulator
  – A domain-independent simulator (architecture sketched in the figure below)
[Figure: architecture of the simulator. A graphical interface activates the dialog manager (TRINDI), the emotion modeling module (Executable-Mind), and the automatic tagging module (MIDAS), which together generate the agent's mind; the animation engine (Festival + Greta, or MS-Agent, or other) generates the agent's body.]
3. Two Example Application Domains
3.1 Travel Agency
• To prove the domain independence of our system
• An extension of the kind of dialogs that were simulated in GoDiS, with the introduction of some small talk in the style of REA [Bickmore'99], a real-estate agent
• The agent plays the role of a travel agent
  – Provides suggestions about a holiday
• Small talk
  – Is triggered when the agent is initially set in the empathic mode
  – And wants to establish a friendly relationship with the user
  – By adding some comments about climate, traffic, tourism, etc.
GoDiS
• An experimental dialogue system built using the TrindiKit toolkit
• Explores and implements issue-based dialogue management
  – Adapts Ginzburg's KOS to a dialogue system (GoDiS) and implements it
• Extends the theory to more flexible dialogue
  1. Menu-based dialogue
     – Action-oriented dialogue, VCR application
  2. Multiple tasks, information sharing between tasks
  3. Feedback and grounding
  4. Accommodation, re-raising, clarification
  5. Multi-linguality & multiple domains
  6. Conditional responses (Ivana Kruijff-Korbayova)
What is TrindiKit
• A toolkit for
  – Building and experimenting with dialogue move engines (DMEs) and dialogue systems
  – Based on the information state approach
• Not a dialogue system in itself
• What does TrindiKit provide?
  – A high-level formalism and interpreter for implementing dialogue systems
    · Promotes transparency, reusability, plug-and-play, etc.
    · Allows implementation and comparison of dialogue theories
    · Hides low-level software engineering issues
  – GUI, WWW-demo
  – Ready-made modules and resources
    · Speech
    · Interfaces to databases, devices, etc.
    · Reasoning, planning
3. Two Example Application Domains
• Fig. 2. An example dialog in the Travel Agency domain
3. Two Example Application Domains
3.2 Advice About Eating Disorders
• Eating disorders are common among young people
• Chat lines or web sites
  – Tend to be empathic, but are not tailored to the user
  – The user cannot ask questions, make objections, request clarifications, or take the initiative to drive information provision
  – They cannot recognize whether and how the emotional state of the user changes during the interaction
  – Emotions are static and rigid
• Aims of our agent
  – Enable young users to interact with our agent
  – Establish an atmosphere of empathic relationship between the two interlocutors
  – Make the advice provided more effective
About Eating Disorders
Example Dialog Produced By Our System
G0: Hi, my name is Greta! I’m here to help you find a healthier way of
feeding yourself. What’s your name?
U0: My name is Jana.
G1: What’s your problem, Jana?
U1: I’ve been eating too much for about one year and I realize that I’m not
myself, that I’m not thinking quite right!
G2: Oh I’m sorry. That’s a bad situation indeed. What is your age?
U2: I’m 17.
G3: Well, teenagers are notorious for being hungry frequently, because they
need to eat more to support the major growth of their bodies.
U3: Oh no, it’s not a question of increased appetite or overeating from time
to time: it’s a question of compulsive overeating, in my case. I feel out
of control and powerless to stop!
G4: How awful! Have you already tried to do something to solve this
problem?
U4: Yes, and I didn’t succeed. I now fear that stopping will not be possible
for me! So, I don’t know why I’m here!
G5: Oh no! You have to be confident in yourself! If you want to get over
this, I can help you.
About Eating Disorders
Example Dialog Produced By Our System (Cont.)
U5: What can I do?
G6: I have some ideas and experience on how to make a smooth, quick
transition from overeating back to healthy living. If you wish, I can
suggest what you might do.
U6: Yes, please, I want to make every possible effort to get out of this!
G7: Fine! This is the attitude I want to see from you! So, let’s start, first of
all, from cleansing your home of all “binge foods”. Whatever your
current food downfalls are, get rid of them!
U7: But I can’t stand to throw food away!
G8: Well, give it to someone, then! And if your relatives insist on keeping
such foods in the house, ask them to keep it away from you for a while.
U8: That’s a good suggestion, I’ll try! What else?
G9: I need to know something more about the way you live, to go on. Do
you take part in any kind of sport, for instance?
U9: Nothing, for a long time.
G10: So, plan to exercise! Choose something you enjoyed doing, even
though you may not feel like doing it right now.
...
4. Emotion Modeling
• Algorithmic aspects of the model and prototype implementation
• In this modeling method, particular attention is paid to
  – How emotions change in intensity with time
  – How they are mixed up
  – How each of them prevails in a given situation, according to the agent's personality and to the social context in which the dialog occurs
• We focused our attention on the event-driven emotions in Ortony et al.'s theory
  – Which include positive and negative emotions
  – Triggered by present or future desirable or undesirable events
• We adopted Oatley and Johnson-Laird's theory
  – Positive and negative emotions are activated by the belief that some goal is achieved or threatened as a consequence of some event
4. Emotion Modeling (Cont.)
• The cognitive model of emotions
  – Represents the system of beliefs and goals behind emotion activation
  – Gives the agent the ability to guess the reason why it feels a particular emotion
  – Gives the agent the ability to justify it if needed
  – Shows how the agent's system of goals is revised as a consequence of feeling an emotion
  – Shows how this revision influences the dialog dynamics
• We apply a Dynamic Belief Network (DBN)
  – As a goal monitoring method
  – It employs observational data in the time interval (Ti, Ti+1)
  – To generate a probabilistic model of the agent's mind
  – And to reason about the consequences of the observed event on the monitored goals
• We calculate the intensity of emotions as a function
  – Of the uncertainty of the agent's belief that its goal will be achieved
  – Of the utility assigned to achieving this goal
• We combine the two variables to measure the variation in the intensity of an emotion (see the sketch below)
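A minimal sketch of that combination, under the assumption that the variation in intensity is the product of the change in the probability that the monitored goal is achieved and the utility of that goal; function and variable names are illustrative, not the prototype's exact formula:

```python
# Minimal sketch: emotion intensity variation from belief change and goal utility.
# Assumption (not the prototype's exact formula): delta_I = delta_P(goal achieved) * utility.

def intensity_variation(p_before: float, p_after: float, utility: float) -> float:
    """Variation of emotion intensity between Ti and Ti+1.

    p_before: belief that the monitored goal is achieved, at Ti
    p_after:  the same belief after observing the event, at Ti+1
    utility:  utility the agent assigns to achieving the goal
    """
    return (p_after - p_before) * utility

# The user reports an undesirable event: the belief that the goal
# "preserve the good of the user" is achieved drops from 0.8 to 0.3.
delta = intensity_variation(0.8, 0.3, utility=0.9)
print(f"goal threatened, negative emotion grows by {abs(delta):.2f}")
```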
4. Emotion Modeling
A Portion Of The DBN: Triggering Of “Sorry-for”
[Figure: DBN fragment. The Event-BN in the interval (Ti, Ti+1) links the user's statements Say U (Occ E U) and Say U not(Desirable E) to the agent's beliefs BelG(Occ E U), BelG not(Desirable E), and BelG GoalU not(Occ E U); the Mind-BN at Ti+1 updates BelG(FriendOf G U), GoalG not(Occ E U), BelG(UnsatisfFor G U E), and BelG(Thr-GoodOf U) from their values at Ti; the Emotion-BN at Ti+1 then activates Feels G(SorryFor U). A toy numeric sketch follows.]
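A toy numeric sketch of the triggering step shown in the figure, assuming a single-node Bayesian update; the conditional probabilities and the intensity formula are illustrative, not the HUGIN network of the prototype:

```python
# Toy sketch of the "sorry-for" triggering fragment; all numbers are illustrative.
# In (Ti, Ti+1) the agent G observes that the user U says an undesirable event E occurred.

def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One-step Bayesian update of BelG(Thr-GoodOf U) given the observed evidence."""
    joint_h = p_e_given_h * prior
    return joint_h / (joint_h + p_e_given_not_h * (1 - prior))

prior_threat = 0.1                                      # BelG(Thr-GoodOf U) at Ti
post_threat = posterior(prior_threat, p_e_given_h=0.9, p_e_given_not_h=0.2)

# The emotion node fires because the monitored goal GoalG not(Occ E U) is believed
# to be threatened; its intensity combines the belief change with the goal's utility.
utility = 0.8
sorry_for = (post_threat - prior_threat) * utility
print(f"BelG(Thr-GoodOf U) at Ti+1 = {post_threat:.2f}, sorry-for intensity ~ {sorry_for:.2f}")
```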
4. Emotion Modeling
An Example: Slow And Fast Decay Of Emotions
4. Emotion Modeling
4.1 Two Versions Of Our Emotion Simulator
• The emotion simulation tool applies in two different versions (see the sketch below)
  1. Mind-Testbed
     – Create and test models
     – Supported files
       · The agent's mind: Mind-BN
       · The events that may occur in every considered domain: Event-BNs
       · The relationships between the goals in the Mind-BN and the emotions modeled: Emotion-BNs
       · The personalities the agent may take
       · The contexts in which the simulation may occur
  2. Executable-Mind
     – When the calling program inputs a user's move, it
       · Analyzes the acts
       · Activates emotions in the agent
       · Updates the emotion intensity table with the new entry
       · Sends it back to the calling program
     – A user's move: a combination of communicative acts
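A rough sketch of the Executable-Mind call cycle described above; the class, method, and act names are invented for illustration and are not the prototype's actual API:

```python
# Rough sketch of the Executable-Mind cycle; all names are illustrative.
from typing import Dict, List

class ExecutableMind:
    def __init__(self, personality: str, context: str, domain: str):
        # Set-up corresponds to selecting the personality, context, and domain files.
        self.personality, self.context, self.domain = personality, context, domain
        self.intensity_table: List[Dict[str, float]] = []   # one entry per user move

    def process_move(self, acts: List[str]) -> Dict[str, float]:
        """Analyze the communicative acts in a user move, activate emotions,
        update the intensity table, and return the new entry to the caller."""
        entry = {"sorry-for": 0.0, "hope": 0.0, "fear": 0.0, "joy": 0.0}
        if "report-undesirable-event" in acts:
            entry["sorry-for"] = 0.6          # illustrative activation level
        if "accept-suggestion" in acts:
            entry["hope"] = 0.5
        self.intensity_table.append(entry)
        return entry

mind = ExecutableMind(personality="empathic", context="adoptive", domain="eating-disorders")
print(mind.process_move(["report-undesirable-event"]))
```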
4. Emotion Modeling
The Graphical Interface Of Mind-testbed
4. Emotion Modeling
4.2 A Simulation Example
• Fig. 6. Emotions triggered in the example dialog, with four agent personalities
5. Regulation And Display Of Emotions
• The agent has to regulate and display its emotions
• An emotion E may be hidden or displayed
  – This "decision" may be influenced by
    · Personality factors
    · The interaction context
• The emotional behavior is modeled by means of rules that regulate the activation of display goals [Carolis'02] (see the sketch below)
• For example,
  – One rule activates the goal of hiding the fear felt at time T5, because the agent has an adoptive relationship with the user
  – Another rule activates the goal of showing, at move G7 (page 17), the hope felt at time T7
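A hedged sketch of such display-regulation rules; the condition and action fields are illustrative and this is not the rule syntax of [Carolis'02]:

```python
# Illustrative display-regulation rules: a felt emotion is hidden or shown
# depending on the relationship with the user and on personality factors.

def display_goal(emotion: str, relationship: str, personality: str) -> str:
    """Return 'hide' or 'show' for an emotion felt by the agent."""
    # Hide fear when the agent has an adoptive (care-giving) relationship,
    # so as not to alarm the user (cf. the T5 example above).
    if emotion == "fear" and relationship == "adoptive":
        return "hide"
    # Show hope to encourage the user (cf. the G7 example above).
    if emotion == "hope" and personality == "empathic":
        return "show"
    return "show"

print(display_goal("fear", relationship="adoptive", personality="empathic"))   # hide
print(display_goal("hope", relationship="adoptive", personality="empathic"))   # show
```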
Emotion Display
• When an emotion has to be displayed
  – An affective tag is automatically added to the agent's move (a hypothetical example follows), e.g. for:
    "Fine! This is the attitude I want to see from you! So, let's start, first of all, from cleansing your home of all 'binge foods'. Whatever your current food downfalls are, get rid of them!"
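A possible shape of the tagged move; the tag and attribute names are hypothetical and only illustrate the idea of wrapping the sentence in affective markup (the prototype's actual APML tags are not reproduced here):

```python
# Hypothetical affective tagging of move G7; the tag and attribute names are invented.
move_text = ("Fine! This is the attitude I want to see from you! So, let's start, "
             "first of all, from cleansing your home of all 'binge foods'.")
emotion, intensity = "hope", 0.7
tagged_move = f'<affective emotion="{emotion}" intensity="{intensity}">{move_text}</affective>'
print(tagged_move)
```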
Character’s Action Rules
• To support interfacing with cartoon-like characters (MS-Agents)
  – We define the meaning-signal correspondence of a character
  – In an XML Meaning-Signal translation file
    · Rules of the form: meaning → animation or speech feature
• Some examples were shown with the MS-Agent Ozzar (a lookup sketch follows)
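A minimal sketch of such a meaning-to-signal lookup; the table entries and animation names are illustrative, not those of the actual translation file:

```python
# Illustrative meaning-signal table for a cartoon-like character: each meaning
# is mapped to an animation and/or a speech feature. Animation names are invented.
MEANING_SIGNAL = {
    "affective:hope":       {"animation": "Smile",     "speech": {"pitch": "+10%"}},
    "affective:sorry-for":  {"animation": "Sad",       "speech": {"rate": "-15%"}},
    "performative:suggest": {"animation": "GestureUp", "speech": {}},
}

def render(meaning: str) -> dict:
    """Translate a meaning tag into the signals the animation engine can play."""
    return MEANING_SIGNAL.get(meaning, {"animation": "RestPose", "speech": {}})

print(render("affective:hope"))
```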
6. Dialog Simulation
• Emotions have to be utilized
  – To drive reasoning
  – To regulate it
• Simulating affective dialogs requires
  – Modeling how emotional states influence the course of the dialog
    · The priority of communicative goals
    · Dialog plans
    · The surface realization of communicative acts
• The dialog manager has to solve
  – How should the agent behave?
    · After discovering the emotional state of the user
    · After feeling an emotional state of its own
  – How should these emotions affect the dialog dynamics?
6. Dialog Simulation (Cont.)
• The idea is
  – The agent has an initial list of goals
    · That she aims to achieve during the dialog
    · Each with its own priority
    · Some of these goals are inactive
  – The agent knows
    · How every goal may be achieved in a given context
6.1 Agent And User Models
• An agent model and a user model are stored
  – Together with the interaction history
  – In the information state of the dialog manager
• These models include two categories of factors
  – Long-term settings
    · The agent's personality, its role, and its relationship with the user
    · Stable during the dialog
    · Influence the initial priorities of goals, plan and initiative handling, and behavior
  – Short-term settings
    · The beliefs and emotional state of the agent
    · Evolve during the dialog and influence goal priority changes and plan evolution
6.1 Agent And User Models (Cont.)
• The agent's goals gi can be linked by one of the following relations (a sketch follows)
• Priority
  – gi < gj: gi is more important than gj, and gi will be achieved before gj
• Hierarchy
  – H(gi, (gi1, gi2, …, gin))
  – The complex goal gi may be decomposed into simpler subgoals gi1, gi2, …, gin, which contribute to achieving it
• Causal relation
  – Cause(gi, gj): executing the plan achieving the source goal gi is a precondition for executing the plan achieving the destination goal gj
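A compact sketch of the three relations as plain data structures; the goal names and the representation are illustrative:

```python
# Illustrative representation of the relations between agent goals.
priority = [("assess-situation", "suggest-solution")]        # gi < gj: gi achieved before gj
hierarchy = {                                                 # H(gi, (gi1, ..., gin))
    "suggest-solution": ["clean-home-of-binge-foods", "plan-exercise"],
}
cause = [("assess-situation", "describe-eating-disorders")]   # Cause(gi, gj)

def subgoals(goal: str) -> list:
    """Return the simpler subgoals that contribute to achieving a complex goal."""
    return hierarchy.get(goal, [])

print(subgoals("suggest-solution"))
```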
6.2 Plans
• Our dialog manager does not include a planner
• Plans are represented as recipes that the agent can use to achieve its goals
• Our agent adopts the typical planning sequence of advisory systems
  – Situation-assessment
  – Describe-eating-disorders
  – Suggest-solution
  – Persuade-to-follow-suggestion
• The default plan is outlined on the next page (a recipe sketch follows)
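A sketch of the default recipe sequence, represented as a plain list rather than the prototype's actual recipe formalism; the step names are taken from the sequence above:

```python
# Illustrative recipe store: plans are pre-compiled recipes, not built by a planner.
from typing import List, Optional

DEFAULT_PLAN: List[str] = [
    "Situation-assessment",
    "Describe-eating-disorders",
    "Suggest-solution",
    "Persuade-to-follow-suggestion",
]

def next_step(done: List[str]) -> Optional[str]:
    """Return the next recipe to execute, following the default advisory sequence."""
    for step in DEFAULT_PLAN:
        if step not in done:
            return step
    return None

print(next_step(["Situation-assessment"]))   # Describe-eating-disorders
```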
The Discourse Plan In The Eating Disorders Domain
6.3 Reaction Rules
• In the case of urgent events (see the sketch after this list)
  – Reduce the detail of information
  – Upgrade the priority of the "most relevant" subgoals
  – Downgrade the priority of details
• When feeling altruistic social emotions
  – Display them by verbal and non-verbal means
  – Give them the highest priority
  – Downgrade the priority of other goals
  – Hide egoistic social emotions
• When feeling positive emotions
  – Express them with non-verbal means
  – Leave the priority of other goals unvaried
• When feeling negative emotions
  – Activate behavior-control goals
  – Avoid displaying any emotional reaction by activating repair goals
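A hedged sketch of the first reaction rule: on an urgent event the priorities of the "most relevant" subgoals are upgraded and those of detail goals are downgraded; the priority values, goal names, and the "detail:" prefix are illustrative:

```python
# Illustrative reaction rule for urgent events: upgrade the most relevant
# subgoals and downgrade detail goals; other priorities are left unchanged.
from typing import Dict, Set

def react_to_urgent_event(priorities: Dict[str, int], relevant: Set[str]) -> Dict[str, int]:
    updated = {}
    for goal, prio in priorities.items():
        if goal in relevant:
            updated[goal] = prio + 2       # upgrade the "most relevant" subgoals
        elif goal.startswith("detail:"):
            updated[goal] = prio - 1       # reduce the detail of information
        else:
            updated[goal] = prio
    return updated

priorities = {"suggest-solution": 5, "detail:food-examples": 3, "small-talk": 2}
print(react_to_urgent_event(priorities, relevant={"suggest-solution"}))
```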
Effects Of Reaction Rules
• Reaction rules may produce the following effects on the dynamics of plan activation
  – Add details
  – Reduce details
  – Abandon a plan temporarily and activate a new subplan
  – Abandon a subplan
  – Substitute a generic subplan with a more specific and situation-adapted one
  – Revise the sequencing of plans, to respond to the user's request to "take the initiative"
7. Module Integration
• Graphical interface
  – Interacts with the user and activates the modules
  – Enables the user to follow the dialog both in natural language and with the selected embodied agent
  – Shows the agent's emotional situation in graphical form
  – Several agents have been linked to the system
    · Various versions of Greta [Pelachaud'02]
    · Some MS-Agents
• Users may set the simulation conditions
  – The agent's personality
  – Its relationship with the user
  – The character's body
  – The application domain
• The dialog manager: TrindiKit
• The emotion triggering module: HUGIN API
The Graphical Interface Of Our Emotional
Dialog Simulator
• Two characters are displayed in the right frame
  – Greta [Pelachaud'02] and Ozzar (an MS-Agent)
The Information Exchanges Between Modules
• Executable-Mind
  – Receives information about the setting conditions
  – Selects the personality, context, and domain files
  – Receives interpreted user moves
  – Sends back a list of emotion intensities
• Trindi
  – Receives an interpreted user input and a list of activated emotions
  – Generates an agent's move, which is displayed in natural language in the left frame
• Midas
  – Produces an APML file
• Animation engine
  – Receives an APML file as input
  – Using the meaning-signal translation file, animates the selected character (a schematic sketch of one turn follows)
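A schematic sketch of the flow for one dialog turn; every function is an illustrative stub standing in for the real modules (the NLU front end, the HUGIN-based Executable-Mind, TrindiKit, MIDAS, and the animation engine):

```python
# Schematic information flow for one turn; every function is an illustrative stub.
def interpret(user_utterance: str) -> list:            # NLU front end
    return ["report-undesirable-event"]

def executable_mind(acts: list) -> dict:                # emotion triggering (HUGIN-based)
    return {"sorry-for": 0.6}

def trindi(acts: list, emotions: dict) -> str:          # dialog manager (TrindiKit)
    return "Oh I'm sorry. That's a bad situation indeed."

def midas(move: str, emotions: dict) -> str:            # automatic tagging, APML-like output
    return f'<affective emotion="sorry-for">{move}</affective>'   # hypothetical tag

def animation_engine(apml: str) -> None:                # drives Greta or an MS-Agent
    print("playing:", apml)

acts = interpret("I've been eating too much for about one year")
emotions = executable_mind(acts)
animation_engine(midas(trindi(acts, emotions), emotions))
```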
8. Conclusions And Future Works
• Our reaction rules are similar to the social rules in Jack and Steve
• Our personality traits enable the representation of a larger variety of situations than McCrae's
• The assumption behind our emotion activation method is the same as in Emile, although the formalism is not the same
• The main limit of our prototype is the asymmetry of the dialog modality
  – It is not natural
  – It needs a refined speech recognizer that detects the emotional states of users
• Open questions
  – Should multiple emotions be summed up into overall emotional states?
  – Should they be stored and treated separately?
  – Should an agent always behave like a human?
  – Should it be planned to dominate its emotions in a larger number of circumstances?
  – Are emotions always useful in producing appropriate decision making?