Intent Recognition as a Basis for Imitation Learning in Humanoid Robots
Andrew Fagg, Rod Grupen, Mike Rosenstein, and John Sweeney
UMass Amherst
NEMS 2005
LABORATORY FOR PERCEPTUAL ROBOTICS • UNIVERSITY OF MASSACHUSETTS AMHERST • DEPARTMENT OF COMPUTER SCIENCE
Programming a Humanoid is Hard
• Complex mechanisms: many DOF, many sensor streams.
• Programming by demonstration:
– Demonstrator performs task, robot extracts salient knowledge to reproduce across many instances.
• Notables: [Pook & Ballard 93], [Kuniyoshi et al. 94], [Voyles et al. 99], and [Ijspeert et al. 02]
• Imitation:
– Imitation learning augments stochastic exploration for acquiring control knowledge.
On Imitation
• Assume that the demonstrator is performing a goal-directed behavior.
• Kinematic properties of the demonstration are not important to us.
– Can refine using robot-specific objectives.
• Interested in the work conveyed by the demonstration.
– How the objects are manipulated, in what sequence, etc.
Why Intention?
• If we can infer the goal and recognize the scene, the behavior can be successfully reproduced.
– We have some domain-specific knowledge.
• Intention is compatible across morphologies.
• Recognize more from less.
– More abstract representations of actions.
Demonstration by Teleoperation
• Direct access to joint velocities and tactile information.
• No difficulty with correspondence.
• Difficulties of teleoperation:
– Minimal feedback, communication delays.
– Fatigue.
– Discrepancy between human and robot observational frames.
A Robot Teleoperation Interface
NASA/JSC Telepresence System
Video
How to infer intent?
Mirror Neurons
What this suggests:
Action generation and perception are intimately related [Rizzolatti et al. 01]: use the controller as a sensor!
Controller as Sensor
• Set of controllers defined by the objects in the scene.
• Compare what the demonstrator does to what each controller would do.
– “Control Projection” (see the sketch below)
• Use domain knowledge to inform when meaningful events occur.
– Pay attention to tactile events!
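As a rough illustration of control projection (not the authors' implementation), one could score how well each candidate controller's command matches the demonstrator's observed joint command; the function and variable names below are assumptions made for this sketch.

    import numpy as np

    def control_projection(qdot_observed, controller_commands):
        """Score each candidate controller against the demonstrator's joint command.

        qdot_observed       : joint command observed via teleoperation (n-vector)
        controller_commands : dict mapping controller name -> the joint command that
                              controller would issue from the current state
        Returns a dict of scores; values closer to zero mean closer agreement.
        """
        return {name: -np.linalg.norm(qdot_observed - qdot)
                for name, qdot in controller_commands.items()}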
Set of Primitive Controllers
• Each controller represents one identified affordance in the scene.
– Domain: objects on a table in front of the robot: cans, beanbags, and targets.
AFFORDANCE
A functional matching between object and actor; described by particular perceptual features. [Gibson 77]
The Robot’s View
• Recognized Affordances (a possible representation is sketched below):
– Type of grasp: top, side
– DOF constraints: don't care about rotation about Z
– Every object has a previous place
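A minimal sketch of how one recognized affordance might be represented, assuming fields for grasp type, unconstrained DOF, and the object's previous place; all field names are illustrative rather than taken from the slides.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class Affordance:
        """One recognized affordance attached to an object in the scene."""
        object_id: str                               # which object in the scene
        grasp_type: str                              # "top" or "side"
        free_dof: Tuple[str, ...]                    # DOF we don't care about, e.g. rotation about Z
        previous_place: Tuple[float, float, float]   # where the object last rested

    # Example: a can that affords a top grasp, ignoring rotation about Z.
    can = Affordance("can_1", "top", ("rot_z",), (0.40, 0.10, 0.05))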
From Scene to Controllers
[Diagram: object models identified in the scene are mapped to a set of controllers.]
Extracting a Sequence of Actions
• The set of controllers represents hypotheses of intention.
• Observable variables: controller errors, force magnitude at the fingertips.
• Controller i explains a sequence of observations if:
– Each step in the sequence reduces its error
– Its error at the end is small
– The sequence finishes with a tactile event
• Use Bayesian inference to infer the most likely controller (see the sketch below).
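A minimal sketch of the three criteria as a check over an observation window, assuming we have controller i's error magnitude at each step and a flag for a terminating tactile event; the threshold value is illustrative.

    def controller_explains(errors, ends_with_tactile_event, final_error_threshold=0.02):
        """Does controller i explain this segment of the demonstration?

        errors : controller i's error magnitude at each observed time step
        """
        error_decreases = all(e_next <= e_prev
                              for e_prev, e_next in zip(errors, errors[1:]))
        final_error_small = errors[-1] < final_error_threshold
        return error_decreases and final_error_small and ends_with_tactile_event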
Controller Primitives
• Define Cartesian controller i in terms of a reference, an error, and a joint command. [Equations not reproduced in this transcript.]
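The slide's equations are not reproduced here; the sketch below shows one plausible form (not necessarily the authors'), a resolved-rate controller that drives the hand toward a Cartesian reference through the Jacobian pseudoinverse.

    import numpy as np

    def cartesian_controller(x_ref, x, jacobian, gain=1.0):
        """One plausible Cartesian primitive: reference -> error -> joint command.

        x_ref    : reference hand position for this affordance (3-vector)
        x        : current hand position (3-vector)
        jacobian : manipulator Jacobian at the current configuration (3 x n)
        """
        error = x_ref - x                               # Cartesian error
        qdot = gain * np.linalg.pinv(jacobian) @ error  # joint-velocity command
        return error, qdot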
Determining Likelihood
• Error given by the distance between joint commands. [Equation not reproduced in this transcript.]
• Compute the likelihood at time t. [Equation not reproduced in this transcript.]
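One plausible reconstruction (an assumption, not the slide's actual formula): score controller i by a Gaussian likelihood of the distance between the demonstrator's observed joint command and the command controller i would issue, then update a posterior over the candidate controllers.

    import numpy as np

    def controller_likelihood(qdot_observed, qdot_controller, sigma=0.1):
        """Likelihood of one controller given the observed joint command at time t.

        Assumes a Gaussian model over the Euclidean distance between the observed
        command and the command the hypothesized controller would have issued.
        """
        d = np.linalg.norm(qdot_observed - qdot_controller)
        return np.exp(-d**2 / (2.0 * sigma**2))

    def update_posterior(prior, likelihoods):
        """Bayesian update over the set of candidate controllers (hypotheses)."""
        posterior = np.asarray(prior) * np.asarray(likelihoods)
        return posterior / posterior.sum()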
An Extracted Sequence
[Figure not reproduced in this transcript.]
Playing Back the Sequence
• The most likely controller at each tactile event is recorded.
• The extracted sequence makes reference to affordances in relation to specific objects.
– Can rearrange the scene: just find the correspondence between objects (see the sketch below).
– Simple visual models are used.
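A minimal sketch of playback, assuming the extracted sequence stores (controller, object type) pairs recorded at tactile events and that the current, possibly rearranged scene has been re-matched to object types by the visual models; every name below is illustrative.

    def run_controller(name, target):
        # Placeholder for activating the named primitive on the matched object.
        print(f"running {name} on object at {target}")

    def play_back(sequence, scene):
        """Replay an extracted sequence in a (possibly rearranged) scene.

        sequence : list of (controller_name, object_type) recorded at tactile events
        scene    : dict mapping object_type -> the corresponding object's pose in
                   the current scene, as found by the visual models
        """
        for controller_name, object_type in sequence:
            target = scene[object_type]          # correspondence between objects
            run_controller(controller_name, target)

    # Example: pick the can with a top grasp, then place it at the target.
    play_back([("reach_top_grasp", "can"), ("place", "target")],
              {"can": (0.40, 0.10, 0.00), "target": (0.60, -0.20, 0.00)})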
Further
• More elaborate model of activity
– Look at controller error and change in error.
• More elaborate representations of task
– Hierarchical
– Example: using a tool, building a structure.
• Identify affordances from interaction.
– Find visual features that predict affordances.
– Categorization
• Relational models describe object interaction
– How objects can interact depends on their identities.
Related Work
• Teleoperation activity recognition [Pook & Ballard 93]
• Block stacking imitation [Kuniyoshi et al. 94]
• Gesture-Based Programming [Voyles et al. 99]
• Movement Imitation [Ijspeert et al. 02]
• Mirror Neurons
– An area of ventral premotor cortex in primates, discovered by Rizzolatti et al. [1996], whose neurons fire both when the monkey performs a grasp and when it observes others performing a grasp.