
Designing and Evaluating Life-like Agents as Social Actors
Helmut Prendinger
Dept. of Information and Communication Eng.
Graduate School of Information Science and Technology
University of Tokyo
[email protected]
http://www.miv.t.u-tokyo.ac.jp/~helmut/helmut.html
Short Bio
education, experience

Master's in Logic (1994)
- U. of Salzburg, Austria, Dept. of Logic and Philosophy of Science
- Dynamic modal logic (completeness, decidability)
- Non-degree studies in Psychology, Linguistics, Literature

Ph.D. in Artificial Intelligence (1998)
- U. of Salzburg, Dept. of Logic and Philosophy of Science and Dept. of Computer Science; U. of California, Irvine
- Incomplete reasoning (deduction, hypothetical reasoning, EBL)

Postdoctoral research
- U. of Tokyo, Ishizuka Lab
- JSPS Fellowship (4/1998-3/2000): Knowledge compilation, hypothetical reasoning
- "Mirai Kaitaku" project (since 4/2000): Life-like characters, affective communication with animated agents, markup languages for animated agents, emotion recognition
Social Computing
main objective and task

Social Computing aims to support the tendency of humans to interact with computers as social actors.

Task: Develop technology that reinforces the human bias towards social interaction by appropriate feedback, in order to improve the communication between humans and computational devices.
Social Computing
realization

Social computing is most naturally realized by using life-like characters.
Life-like Characters at Work
sample applications

- Tutoring, USC
- Knowledge Sharing, ATR
- Presentation, U. of Tokyo
- Sales, DFKI
- Entertainment, MIT
Life-like Characters
desiderata

Life-like characters should be
- empathic and engaging as tutors
- trustworthy as sales personas
- entertaining and consistent as actors
- stimulating as match-makers
- convincing as presenters
- (in short) ... social actors [... and competent]

Life-like characters should enable
- effective and natural communication with humans
Background
computers as social actors

Humans are biased to treat computers like real people
- Psychological studies show that people tend to treat computers as social actors (like other humans)
- Tendency to be nicer in "face-to-face" interactions, ...
- Animated agents may support this tendency if they are designed as social actors

Ref.: B. Reeves and C. Nass, 1998. The Media Equation. Cambridge University Press, Cambridge.
Animated Agents as Social Actors
requirements for life-likeness

Features of Life-like Characters

Embodiment
- Synthetic bodies
- Emotional facial display
- Communicative gestures
- Posture
- Affective voice

Artificial Emotional Mind
- Affect-based response
- Personality
- Response adjusted to social context (social role awareness)
- Adaptive behavior (social intelligence)
Outline
designing and evaluating life-like characters

The mind of life-like agents
- Emotion, social role awareness, attitude change
- Demo - Casino scenario
- Implementation and character behavior scripting

Evaluating life-like characters
- Using biosignals to detect user emotions
- Experimental study with character-based quiz game

Book project - character scripting languages and applications
SCREAM System Architecture
SCRipting Emotion-based Agent Minds

Appraisal Module
the cognitive structure of emotions

- Evaluates external events according to their emotional significance for the agent
- Outputs emotions
  - joy, distress
  - happy for, sorry for
  - angry at
  - resent, gloat
  - ... 22 in total

Ref.: A. Ortony, G. Clore, A. Collins, 1988. The Cognitive Structure of Emotions. Cambridge University Press, Cambridge.
Social Filter Module
emotion expression modulating factors

Ekman and Friesen's facial "Display Rules" ('69)
- Expression and intensity of emotions is governed by social and cultural norms

Brown and Levinson ('87) on linguistic style
- Linguistic style is determined by social variables: power, distance, imposition of speech acts
Agent Model
character profile, affect processing

Character Profile: static and dynamic features
- Static features: personality traits, standards
- Dynamic features: goals, beliefs, attitudes

Attitudes (liking/disliking) are an important source of emotions toward other agents
- an agent's attitude decides whether it has a positive or negative emotion (toward another agent)
  - "happy for" vs. resent; "sorry for" vs. gloat
- an agent's attitude changes as a result of communication
  - dependent on "affective interaction history"
Signed Summary Record
computing attitude from affective interaction history

[Diagram: the interaction history is a timeline of winning emotional states, given as <emotion, intensity> pairs - e.g., distress (1), joy (2), angry at (2), gloat (1), hope (2), distress (3), happy for (2), good mood (1). Positive emotions (joy, hope, good mood, happy for) and negative emotions (distress, angry at, gloat) are summed into a signed attitude summary value.]

- Liking if positive
- Disliking if negative

Ref.: A. Ortony, 1991. Value and emotion. In: W. Kessen, A. Ortony, and F. Craik (eds.), Memories, Thoughts, and Emotions: Essays in Honor of George Mandler. Hillsdale, NJ: Erlbaum, 337-353.
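The signed summary record above can be sketched as follows. This is an illustrative reconstruction, not the original SCREAM code; the function name and the example history are ours, and the sign assignment of emotion types follows the slide.

```python
# Signed summary record: sum <emotion, intensity> pairs from the
# affective interaction history into one signed attitude value.

POSITIVE = {"joy", "hope", "good mood", "happy for"}
NEGATIVE = {"distress", "angry at", "gloat", "resent"}

def attitude_summary(history):
    """history: list of (emotion, intensity) pairs of winning emotional states."""
    total = 0
    for emotion, intensity in history:
        if emotion in POSITIVE:
            total += intensity
        elif emotion in NEGATIVE:
            total -= intensity
    return total  # > 0: liking, < 0: disliking

# Illustrative history (numbers are ours, not the slide's example):
history = [("joy", 2), ("distress", 1), ("happy for", 2), ("angry at", 2)]
print(attitude_summary(history))  # 2 - 1 + 2 - 2 = 1 -> liking
```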
Updating Attitude
weighted update rule

If a high-intensity emotion of opposite sign occurs - e.g., a liked interlocutor makes the agent very angry - the agent either
- ignores the "inconsistent" new information, or
- updates the summary value by giving greater weight to the "inconsistent" information ("primacy of recency", Anderson '65):

  σ(Sit_n) = σ(Sit_{n-1}) · h + w(Sit_n) · r

  w: intensity of the (winning) emotion; h/r: weight of historical/recent information; σ ∈ {+, −}

  Example (liking → disliking): −3 = (3 · 0.25) + (−5 · 0.75)
  (previous liking value 3 with h-weight 0.25; "angry at" intensity 5, negative sign, with r-weight 0.75)

Consequence for future interaction with the interlocutor
- Momentary disliking: new value is active for the current situation only
- Essential disliking: new value replaces the summary record
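A minimal sketch of the recency-weighted update, using the weights from the slide's example (the function name is ours):

```python
# "Primacy of recency" update: blend the historical summary value with
# inconsistent new evidence, giving greater weight r to the recent emotion.

def update_attitude(prev_value, new_signed_intensity, h=0.25, r=0.75):
    return prev_value * h + new_signed_intensity * r

# Liked interlocutor (summary value +3) makes the agent very angry
# ("angry at" with intensity 5, i.e. signed intensity -5):
print(update_attitude(3, -5))  # 3*0.25 + (-5)*0.75 = -3.0 -> disliking
```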
Life-like Agents
making them act and speak

Realization of embodiment

Technology
- Microsoft Agent package (installed client-side)
- JavaScript based interface in Internet Explorer

Microsoft Agent package
- 2D animation sequences
- Synthetic affective speech
- Controls to trigger character actions
- Text-to-Speech (TTS) Engine
- Voice recognition

Multi-modal Presentation Markup Language (MPML)
- Easy-to-use XML-style authoring tool
- Interface with SCREAM system
Life-like Characters in Interaction
some demos

- Casino Scenario: Life-like characters that change their attitude during interaction
- Comics Scenario: Animated comics actors engaging in developing social relationships
- Business Scenario: Animated agents that storify tacit corporate knowledge
Casino Scenario
life-like characters with changing attitude

- Animated advisor ("Genie")
  - Emotion, personality
  - Changes attitude dependent on interaction history with user
- Dealer ("James"), player ("Al")
  - Pre-scripted behavior
- User in the role of player of Black Jack game
Genie's Character Profile
implemented with MPML and SCREAM

% Personality specification
personality_type(genie,agreeableness,3).
personality_type(genie,extraversion,2).
% Social variables specification
social_power(genie,user,0,_).
social_distance(genie,user,1,_).
% Goals
wants(genie,user_wins_game,1,_).
wants(genie,user_follows_advice,4,_).
% Attitude
attitude(genie,user,likes,1,init).
Emotional Arc
advisor's dominant emotions depending on attitude
(advisor has agreeable personality, is socially slightly distant to user)

                    Round 1         Round 2         Round 3         Round 4         Round 5
Attitude            positive        positive        negative        positive        positive
User behavior       ignores advice  ignores advice  ignores advice  follows advice  ignores advice
Outcome             user loses      user loses      user loses      user loses      user wins
Internal intensity  distress (4)    sorry for (4)   gloat (5)       sorry for (5)   good mood (5)
Expressed emotion   distress (1)    sorry for (5)   gloat (2)       sorry for (5)   good mood (5)
Implementation

Agent Scripting
simple MPML script

<!--Example MPML script -->
<mpml>
…
<scene id="introduction" agents="james,al,spaceboy">
 <seq>
  <speak agent="james">Do you guys want to play Black Jack?</speak>
  <speak agent="al">Sure.</speak>
  <speak agent="spaceboy">I will join too.</speak>
  <par>
   <speak agent="al">Ready? You got enough coupons?</speak>
   <act agent="spaceboy" act="applause"/>
  </par>
 </seq>
</scene>
…
</mpml>
Mind-Body Interface
interface between SCREAM and MPML

<!--MPML script showing interface with SCREAM -->
<mpml>
…
<consult target="[…].jamesApplet.askResponseComAct('james','al','5')">
 <test value="response25">
  <act agent="james" act="pleased"/>
  <speak agent="james">I am so happy to hear that.</speak>
 </test>
 <test value="response26">
  <act agent="james" act="decline"/>
  <speak agent="james">We can talk about that another time.</speak>
 </test>
 …
</consult>
…
</mpml>
Alternative View
smart characters vs. smart environments

[Diagram: in the "sense-think-act" view the agent "perceives" the game state, infers "I am happy", and expresses happiness. In the "annotated environment" view the environment "tells" the agent its available behaviors from a behavior repository and instructs it "be happy now"; the agent simply "acts".]

"Sense-think-act" cycle
- Classical AI approach
- Internet softbots search for information on the web, robots explore their environment
- All the intelligence is agent-side

"Annotated" environments
- Shift from agent intelligence to environment intelligence
- Semantic web, ubiquitous computing, affordance theory
- Agents and environments can be developed independently
Outline revisited
designing and evaluating life-like characters

The mind of life-like agents
- Emotion, social role awareness, attitude change
- Demo - Casino scenario
- Implementation and character behavior scripting

Evaluating life-like characters
- Using biosignals to detect user emotions
- Experimental study with character-based quiz game

Book project - character scripting languages and applications
Affective Computing
why should a computer recognize user emotions?

Human-human communication
- Based on efficient grounding mechanisms, including the ability to recognize interlocutors' emotions (frustration, confusion, ...)
- Humans may react appropriately upon detection of an interlocutor's emotion (clarification upon confusion)

Human-computer communication
- Computers typically lack the ability to recognize user emotions
- Ignoring users' emotions causes user frustration
- Recognizing and responding to users' (often negative) emotions may improve users' interaction experience

Ref.: R. Picard, 1997. Affective Computing. The MIT Press.
Emotion Recognition
how can computers recognize users' emotions?

Stereotypes
- A typical visitor of a casino wants ... (to win)

Communicative modalities
- Facial display (face recognition)
- Prosody (speech analysis)
- Linguistic style (NLU)
- Gestures (gesture recognition)
- Posture (posture recognition)

Physiological data
- Biosignals
Physiological Data Assessment
ProComp+ unit

[Photo: ProComp+ unit with attached sensors, BVP and GSR labeled.]

Sensors:
- EMG: Electromyography
- EEG: Electroencephalography
- EKG: Electrocardiography
- BVP: Blood Volume Pulse
- GSR: Galvanic Skin Response
- Respiration
- Temperature
Inferring Emotions from Biosignals
Lang's 2-dimensional emotion model

[Diagram: some named emotions in the arousal-valence space - e.g., enraged and excited at high arousal, joyful and relaxed at positive valence, sad and depressed at negative valence.]

Lang's two dimensions
- Valence: positive or negative dimension of feeling
- Arousal: degree of intensity of emotional response

Biometric measures
- Skin conductivity increases with arousal (Picard '97)
- Heart rate increases with negatively valenced emotions

Note
- introverts reach a higher level of emotional arousal than extroverts

Ref.: Lang, P. 1995. The emotion probe: Studies of motivation and attention. American Psychologist 50(5):372-385.
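The two biometric rules above can be turned into a naive quadrant mapping. This is only an illustrative sketch under our own simplifying assumptions (baseline-relative signal changes, hard thresholds at zero), not the recognition method used in the system:

```python
# Naive mapping from baseline-relative biosignal changes to a quadrant
# of Lang's arousal-valence space, using the two rules on the slide.

def classify(delta_skin_conductance, delta_heart_rate):
    """delta_* are changes relative to a relaxed baseline."""
    # Skin conductivity increases with arousal:
    arousal = "high" if delta_skin_conductance > 0 else "low"
    # Heart rate increases with negatively valenced emotions:
    valence = "negative" if delta_heart_rate > 0 else "positive"
    return arousal, valence

print(classify(0.8, 5.0))    # ('high', 'negative') - e.g., enraged
print(classify(0.6, -2.0))   # ('high', 'positive') - e.g., excited/joyful
print(classify(-0.1, -1.0))  # ('low', 'positive') - e.g., relaxed
```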
Experimental Study
effects of a character-based interface
(Experimenter/Analyser: Junichiro Mori)

Aim of study
- Show that a character with affective expression may improve users' experience (= reduce frustration) of a simple quiz game

Method
- Biosignals to measure skin conductance and blood volume pulse ('objective' assessment of user experience)
- Questionnaire (users' subjective assessment)

Instruction
- Addition/subtraction task (short-term memory load)
- Solve a series of 30 quizzes correctly and as fast as possible
- Frustration is deliberately caused by delay (in 6 out of 30 quizzes)

Subjects
- 20 university students (all male Japanese, approx. 24 years old)
- JPY 1000.- for participation, JPY 5000.- for best score
Experimental Setup
mathematical quiz game

[Screenshot: quiz interface with timer; the character says "It is correct." (polite language); the delay (6-14 sec.) sometimes occurs here.]

Instruction
- Add 5 numbers and subtract the i-th number (i < 5)
- E.g.: 1 + 3 + 8 + 5 + 4 = [21]; subtract the 2nd number; Result: 18
- Select the correct answer by clicking the radio button next to the number
- Then the character tells whether the answer is correct
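The quiz task above can be sketched as follows; this is our reconstruction for illustration, not the original game code, and the value ranges are assumptions:

```python
# Quiz task: add 5 numbers, then subtract the i-th one (i < 5, 1-based).

import random

def make_quiz(rng=random):
    numbers = [rng.randint(1, 9) for _ in range(5)]
    i = rng.randint(1, 4)
    answer = sum(numbers) - numbers[i - 1]
    return numbers, i, answer

numbers, i, answer = make_quiz()
print(numbers, f"-> subtract the {i}-th number:", answer)

# The slide's worked example: 1 + 3 + 8 + 5 + 4 = 21, subtract the 2nd number
assert sum([1, 3, 8, 5, 4]) - 3 == 18
```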
Two Versions of the Game
affective vs. non-affective (independent variables)

Affective Version
- Description
  - Character expresses happiness (sorriness) for correct (wrong) answer
  - Character shows empathy (when delay occurs)
  - Character expresses affect both verbally and nonverbally
- Hypothesis
  - Character may reduce user stress (SC) and decrease negative valence (heart rate)

Non-Affective Version
- Description
  - Character does not show affective response
  - Character ignores occurrence of delay
- Hypothesis
  - Character has no significant effect on user emotion (SC, heart rate)
Character Responses
examples of affective/non-affective feedback

Affective feedback
- "I am sorry. It is wrong." (hyper-polite language), with a hanging-shoulder gesture to express sorriness non-verbally
- Character apologizes for the delay: "I am sorry for the delay." (polite language)

Non-affective feedback
- "Wrong." No non-verbal emotion expression.
- Character ignores the occurrence of delay.
Analyzing Physiological User Data

[Screenshot: BioGraph software (Thought Technology) traces of GSR and BVP. The GSR trace is segmented into a DELAY segment (from delay start to delay end, ending with the user response) and a RESPONSE segment (around the agent response). BVP could not be taken reliably.]
Preliminary Findings
9 subjects in each version (data of 2 subjects discarded)

Hypothesis (design): delay induces frustration in subjects
- All 18 subjects showed a significant rise of SC in the DELAY segment
- Corresponds to a finding in behavioral psychology (if an individual is prohibited from attaining a goal, the individual experiences primary frustration)
Hypothesis (main): affective agent behavior reduces user frustration
- Mean values of SC in the DELAY and RESPONSE segments (BVP could not be taken reliably):
  - Affective version: mean = 0.05
  - Non-affective version: mean = 0.2
- t-test (assuming unequal variance): t(16) = 2.57; p = .01

Preliminary evaluation suggests that an animated character expressing emotions and empathy may undo some of the user's frustration.
Agents Adapting to User Emotion
assumes real-time recognition of user emotions

[Diagram: simplified Dynamic Decision Network over time slices ti and ti+1. In each slice, a user model links the user's traits, emotional state, learning, and bodily expressions; sensors feed evidence nodes. The user's action at ti and the agent's actions influence the slice at ti+1; U is the utility node.]

QUESTION: Given the user's state at ti, which agent action will maximize the agent's expected utility at ti+1, in terms of, e.g., the user's learning and emotion?
Dynamics of User Emotions

[Diagram: a two-slice network (ti, ti+1) in the style of Conati's model. User personality (agreeableness, extraversion) and user goals (succeed by myself, have fun) condition the user's emotional state (joy, shame, reproach; positive/negative valence, arousal) at ti and, together with the agent's action (provide help, do nothing), at ti+1. Bodily expressions serve as sensor evidence: eyebrow position via vision-based recognizer/EMG (down/frowning), skin conductivity via GSR (high), heart rate via BVP (high).]

Ref.: Conati, C. 2002. Probabilistic assessment of user's emotions in educational games. Applied Artificial Intelligence 16(7-8):555-575.
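The decision-theoretic question posed above can be illustrated with a toy expected-utility computation. This is our own simplified example, not Conati's model or the SCREAM system: the transition probabilities, utilities, and action names are assumed for illustration.

```python
# Toy expected-utility action selection over the user's next emotional
# state, as in the dynamic decision network sketched above.

# P(next emotional state | agent action), assumed numbers:
TRANSITIONS = {
    "provide help": {"joy": 0.6, "reproach": 0.1, "shame": 0.3},
    "do nothing":   {"joy": 0.2, "reproach": 0.3, "shame": 0.5},
}
# Utility of each user emotional state, assumed numbers:
UTILITY = {"joy": 1.0, "reproach": -0.5, "shame": -1.0}

def best_action(transitions=TRANSITIONS, utility=UTILITY):
    def expected_utility(action):
        return sum(p * utility[s] for s, p in transitions[action].items())
    return max(transitions, key=expected_utility)

print(best_action())  # 'provide help'
```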
Outline revisited
designing and evaluating life-like characters

The mind of life-like agents
- Emotion, social role awareness, attitude change
- Demo - Casino scenario
- Implementation and character behavior scripting

Evaluating life-like characters
- Using biosignals to detect user emotions
- Experimental study with character-based quiz game

Book project - character scripting languages and applications
Book Project
character scripting languages and applications

Wide dissemination of life-like character technology requires
- standardized ways to represent the behavior of agents
- scripting languages for face animation, body animation and gestures, emotion expression, synthetic speech, interaction with environment, ...

Characters are already used in a wide variety of applications

Book will offer state-of-the-art on XML-based markup languages and tools
- H. Prendinger, M. Ishizuka (Eds.), Life-like Characters. Tools, Affective Functions and Applications. Springer Hardcover (in preparation)
- Book contains some of the most successful character-based applications
- Synopsis chapters on character design

Useful as
- Standard/Reference Book: state-of-the-art in life-like agents
- Course Book: for HCI, HAI, multimedia, life-like agent applications, scripting languages, ...
Conclusion

Social Computing
- Human-computer interaction as social interaction

Designing life-like characters as social actors
- Believability-enhancing agent features: emotion, personality, social role awareness, attitude change, familiarity change
- Casino demo
- Future avenues: "smart" environments (characters & annotated environments)

Evaluating life-like characters as social actors
- Experimental study using users' biosignals
- Life-like characters' affective response may undo some of the user's negative feeling
- Future avenues: real-time adaptivity of agent behavior to the user's emotion, decision-theoretic approach to agent behavior