Welcome [www.gametechconference.com]


Situated Tutors
Tutorial
Sae Schatz
MESH Solutions, LLC – a DSCI Company
Schedule
• Part 1: Background
• Part 2: Theory
• Part 3: Technical Details
• Part 4: Use Case
• Part 5: Recommendations
Part 1: Background
ITS Effectiveness
[Figure: Bloom's two-sigma finding. Distributions of learner outcomes (number of learners vs. instructional outcomes: knowledge, performance, etc.) for classroom learning and average one-to-one (human) tutoring; the average tutored student performs at the level of the top 2% of classroom learners. Bloom, B. S. (1984).]
The Challenge
[Diagram: quadrant chart contrasting training methods. Effective for higher-order cognitive skills: one-on-one (human) tutoring and apprentice learning. Efficient for training declarative/procedural skills: didactic lecture to a group and computer-based training.]
Currently, higher-order skills training is effective or efficient, but not both.
Situated Tutors
[Diagram: situated tutors sit at the intersection of intelligent tutors and instructional simulations.]
Static Computer-Based Learning: the same for everyone
Intelligent Tutors: different for different students
ITS Effectiveness
[Figure: Bloom's two-sigma chart annotated with example ITSs: SHERLOCK (1988), Andes (2005), and Ecolab (1999) fall between classroom learning and the average (human) tutor, approaching the top 2% on instructional outcomes (knowledge, performance, etc.).]
ITS Effectiveness & Efficiency
• Typical effectiveness gains of ITSs:
— 0.48–0.61σ (Dede, 2008)
— 1.0σ (Lane, 2006)
— LISP tutor = 48% improvement on posttest (Anderson, 1990)*
• Typical efficiency gains of ITSs:
— One third of the time vs. classroom (Lajoie & Lesgold, 1992)
— 4σ efficiency gain over traditional CBT (Romero et al., 2006)
— Air Force electronics tutor for 20 hr = 48 months of OJT (Lesgold et al., 1990)
— LISP tutor = 30% less time vs. classroom (Anderson, 1990; *same study as above)
Pros and Cons
Intelligent Tutors Pros:
• Adaptive
• Manpower Efficiency
• Embedded Pedagogy/Andragogy
Intelligent Tutors Cons:
• Lacks Intrinsic Feedback
• Usually Declarative/Procedural
• Usually More Defined Domains
• Usually Single-user
Situated Tutors
[Diagram: situated tutors sit at the intersection of intelligent tutors and instructional simulations.]
Simulation-Based Training
• Instructional simulations include those simulations that employ a systematic instructional methodology (scenario-based training, for our purposes), as well as accurately represent the problem-solving domain (Salas, Bowers, & Rhodenizer, 1998; Oser et al., 1997).
• Average performance gains of SBT vs. classroom: 72% fewer errors in practice with SBT (Haque & Srinivasan, 2006, meta-analysis)
• Average efficiency gains of SBT: 84% less time vs. traditional instruction (Haque & Srinivasan, 2006, meta-analysis)
But, in practice, SBT often falls short…
Low Efficiency: Heavy instructor workload. Instructors must be SMEs, instructional designers, and technologists, and deployed systems may have no instructional staff (e.g., McCarthy, 2008; Loftin et al., 2004; Smith-Jentsch et al., 1998).
Low Effectiveness: If instructors cannot meet all requirements or cope with the workload involved, suboptimal training may result. This may even produce negative training (e.g., Loftin et al., 2004; NRC, 1985; Houck & Thomas, 1991; Andradóttir et al., 1997; Air Force, 1991; Wray et al., 2004).
Pros and Cons
Simulation Pros:
• Good Transfer of Training
• Often Supports Team Training
• Supports Complex Contexts
Simulation Cons:
• One-Size-Fits(?)-All
• Relies on Instructor for Pedagogy
• Relies on Instructor for Sequencing
• Heavy Instructor Workload
• No Good Instructor = Poor Training
Situated Tutors
[Diagram: situated tutors sit at the intersection of intelligent tutors and instructional simulations.]
Situated Tutors
SIMULATION-BASED LEARNING + INTELLIGENT TUTOR = SITUATED TUTOR
Situated tutors are a special class of Intelligent Tutoring Systems that combine
the features of an intelligent tutor with the scenario-based situated learning
environment of instructional simulations.
Situated Tutors
SIMULATION-BASED LEARNING + INTELLIGENT TUTOR = SITUATED TUTOR
From the intelligent tutor: automation, adaptation, careful operationalization of the domain, embedded instructional support, and extrinsic feedback.
From simulation-based learning: a situated learning context, support for higher-order cognitive skills, facilitation of less determinate domains, intrinsic feedback, and team training.
Part 2: Theory
Situated Tutors
Situated tutors are computer-based instructional technologies
that, at a minimum, include a simulated learning or training
environment of Interactive Multimedia Instruction (IMI) Level 3 or
above and instruct with intelligent adaptation. Further, these
features are, at least, loosely federated with each other.
Simulation Depth
IMI LEVEL 1 (page turner): does not include any simulation-like features. Example: a basic website, like the Red Cross's Preparing for Events.
IMI LEVEL 2 (medium simulation): supports limited interactivity, such as asking and scoring a response to a question. Example: interactive courseware or website scripting, like this quiz from Discovery.com.
IMI LEVEL 3 (high simulation): surface simulation with 2–3 levels of complex branching. Example: a highly interactive simulation, such as Darfur is Dying, a robust serious game made in Flash.
IMI LEVEL 4 (full simulation): rich interactivity and branching, with extensive high-fidelity surface simulation capabilities. Example: a video game, such as America's Army.
Key question: Does the system offer sufficient psychological fidelity and freedom of action to support the training of higher-order cognitive skills?
Sophistication of Adaptation
ADAPTATION AS PREFERENCE (learner choice): allows the learner to control the nature of interactions; generally diminishes outcomes. Example: self-sought instruction, like this Flash game.
ROLE ADAPTATION (categorical): broad learner-selected categories, such as MOS, often distinguish the content presented. Example: many training websites, such as the GPRIME medical trainer.
MACRO ADAPTATION (tailored pre-training): individual learner KSAs and traits affect pre-task adaptation. Example: often found in CBT systems; supports sequencing and ATI; e.g., some LMSs.
MICRO ADAPTATION (tailored during-training): tailored intervention is triggered based on during-task actions. Example: found in conventional ITSs; supports immediate feedback; e.g., the PAT intelligent tutor.
ACTIVE ADAPTATION (overall): a combination of effective macro- and micro-adaptations. Example: facilitates immediate feedback and long-range sequencing; e.g., Rosetta Stone.
Key question: Does the system support tailored pre-task adaptation (e.g., instructional sequencing) and during-task adaptation (e.g., personalized hinting and feedback)?
Situated Tutors: Tasks, Conditions, and Standards
Degree of Component Integration
NO FEDERATION (separated): no data are passed between the simulation and ITS components. Generally, an ITS lesson is delivered and then the student is told to use the simulation. Example: the Microsoft Flight Simulator training web site.
LOOSELY FEDERATED (side-by-side): the ITS and simulation exchange only outcome data; the two systems are often physically separated. Example: many military situated tutors follow this model; protocols such as IPA, DTECS, and SITA facilitate this integration; e.g., FBCB2/Tactical Decision-Making ITS.
TIGHTLY FEDERATED (full integration): the ITS and simulation components can exchange data constantly; ITS features often "overlay" the simulation. Example: the most sophisticated military systems, such as PORTS TAO ITS; the I/SIS protocol can be used (but is rarely applied).
Key question: Does the system support simultaneous functioning and robust data interchange between the ITS and SBT components?
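The federation levels differ mainly in what data cross the ITS/simulation boundary and when. As a rough sketch (all class, function, and field names below are hypothetical illustrations, not drawn from IPA, DTECS, SITA, or I/SIS), the loose and tight patterns might look like:

```python
# Hypothetical sketch: loosely vs. tightly federated ITS/simulation coupling.
# All names are illustrative; none come from the protocols cited above.

class Tutor:
    """Stands in for the ITS component."""
    def __init__(self):
        self.events = []

    def receive_outcome(self, score):
        # Loosely federated: only end-of-scenario outcome data arrives.
        self.outcome = score

    def receive_event(self, event):
        # Tightly federated: a constant stream of in-scenario events arrives,
        # so the tutor can react (hint, adapt) while the scenario runs.
        self.events.append(event)
        if event.get("type") == "error":
            return {"action": "hint", "topic": event["topic"]}
        return None

def run_loosely_federated(sim_events, tutor):
    # The simulation runs to completion, then passes a single outcome score.
    errors = sum(1 for e in sim_events if e["type"] == "error")
    tutor.receive_outcome(score=1.0 - errors / len(sim_events))

def run_tightly_federated(sim_events, tutor):
    # Every event crosses the boundary; the tutor may inject interventions.
    interventions = []
    for event in sim_events:
        response = tutor.receive_event(event)
        if response:
            interventions.append(response)
    return interventions

events = [{"type": "ok", "topic": "navigation"},
          {"type": "error", "topic": "comms"}]
loose, tight = Tutor(), Tutor()
run_loosely_federated(events, loose)
hints = run_tightly_federated(events, tight)
print(loose.outcome)   # a single post-hoc score
print(hints)           # in-scenario interventions
```

The contrast mirrors the slide: in the loose pattern the ITS can only remediate after the fact, while in the tight pattern it can "overlay" hints on the running scenario.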
Find & Classify Situated Tutors
Detailed literature review of 86 situated tutors.
Definitions from: Schatz, S., Oakes, C., Folsom-Kovarik, J. T., & Dolletski-Lazar, R. (2012). ITS + SBT: A review of operational situated tutors. Military Psychology.
Situated Tutor Development Timeline
[Chart: number of new situated tutors introduced per period (y-axis 0–30): through 1990, 1991–1995, 1996–2000, 2001–2005, and 2006–present.]
Situated Tutors by Domain
[Chart: counts of situated tutors (0–25) by domain: USAF, US Army, US Navy/USMC, US Joint/Coalition, Education, Medical, Energy, Transportation (incl. space), Law Enforcement, Manufacturing, and Other (incl. basic research).]
Military Acquisition Timeline
[Chart: new situated tutors introduced per period, through 1990 to 2006–present (y-axis 0–10), broken out by USAF, US Army, USN, US joint military, and non-US military.]
Situated Tutors Effectiveness
ELECT BiLAT (aka VCAT; USC Institute for Creative Technologies): 59% better than video only. Ablative test: 56% of learners in the video-only condition, 67% in the no-coach condition, and 89% in the coach condition were successful (Lane et al., 2008).
ExpertCop (Furtado & Vasconcelos, 2006): 87% improved with simulation + ITS. 87% of police officers explained additional crimes with both the ITS and simulation vs. the simulation alone.
Situated Tutors Efficiency
TAO ITS (Stottler Henke): over 2000% more efficient per class. Previously, one instructor was needed for every two students in a class of 42; now one instructor manages the whole class (Stottler & Panichas, 2006).
IATS (Madni, 2010): 98% cost saving versus the F2F approach. AFRL's IATS reduced costs from $1,172 per seat/year to $28 per seat/year for shipboard maintenance training.
Part 3:
Technical Details
[Diagram: a traditional intelligent tutor comprises a Domain Model (domain content), a Pedagogical Model (instructional methods), and a Learner Model (learner data).]
[Diagram: a situated tutor couples a Game Engine to the same three components: Domain Model (domain content), Pedagogical Model (instructional methods), and Learner Model (learner data).]
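The three-model split can be sketched as a minimal control loop; every name below is illustrative rather than taken from any cited system. The domain model supplies content, the learner model accumulates learner data, and the pedagogical model uses both to choose the next instructional step:

```python
# Hypothetical sketch of the domain/pedagogical/learner model triad.
# Class names, thresholds, and problems are illustrative assumptions.

class DomainModel:
    """Domain content: the problems/skills to be taught."""
    def __init__(self):
        self.problems = {"easy": "identify the symbol", "hard": "plan the route"}

class LearnerModel:
    """Learner data: a running estimate of ability."""
    def __init__(self):
        self.ability = 0.0
    def update(self, correct):
        # Nudge the estimate up or down after each observed attempt.
        self.ability += 0.1 if correct else -0.1

class PedagogicalModel:
    """Instructional methods: pick the next action from the other two models."""
    def next_problem(self, domain, learner):
        return domain.problems["hard" if learner.ability > 0.2 else "easy"]

domain, learner, pedagogy = DomainModel(), LearnerModel(), PedagogicalModel()
for correct in [True, True, True]:   # the learner succeeds three times
    learner.update(correct)
print(pedagogy.next_problem(domain, learner))  # prints "plan the route"
```

In a situated tutor, the same loop runs with a game engine in place of the static problem bank.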
Historic Inputs:
• Prior Knowledge
• General Aptitude
• Constitutional Attributes
• Affective Attributes
• Learner Preferences
Immediate Inputs:
• Current Performance
• System Use/Abuse
• Affective State (e.g., boredom, confusion, delight, flow, and frustration)
Macro-Adaptation:
• Content Selection
• Preset Hints/Coaching
• Preset Teaching Approach
• Scaffold Challenge Level
• Preset Scenario Variables
Micro-Adaptation:
• Give Hints/Coaching
• Change Teaching Approach
• Change Challenge Level
• Adjust Scenario Story
• Give Intrinsic Feedback
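The macro/micro split above amounts to two decision points: a one-time, pre-task configuration driven by historic inputs, and a per-event, during-task reaction driven by immediate inputs. A minimal sketch, with hypothetical field names and thresholds:

```python
# Hypothetical sketch of the macro-/micro-adaptation split.
# Field names and thresholds are illustrative, not from any cited system.

def macro_adapt(learner):
    """Pre-task adaptation: use historic inputs (prior knowledge,
    aptitude, preferences) to select content and a starting challenge."""
    level = "advanced" if learner["prior_knowledge"] > 0.7 else "basic"
    return {"content": level, "challenge": learner["aptitude"]}

def micro_adapt(state):
    """During-task adaptation: react to immediate inputs (current
    performance, affective state) with hints or difficulty changes."""
    if state["performance"] < 0.4:
        return "give_hint"
    if state["affect"] == "boredom":
        return "raise_challenge"
    if state["affect"] == "frustration":
        return "lower_challenge"
    return "no_change"

learner = {"prior_knowledge": 0.9, "aptitude": 0.6}
plan = macro_adapt(learner)           # decided once, before the scenario
print(plan)
for state in [{"performance": 0.3, "affect": "flow"},
              {"performance": 0.8, "affect": "boredom"}]:
    print(micro_adapt(state))         # re-evaluated on every during-task event
```

An "active" adaptive system, in the slide's terms, is one that runs both functions rather than only one.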
ITS Learner Model Varieties
• Overlay Models
• Classifier Models: Bayesian networks, dynamic Bayesian networks, case libraries, decision trees, neural networks
• Constraint-Based Models
• Example-Tracing Models (or "pseudotutors")
• Perturbation Models (or "buggy models")
• Production Rule Models: model-tracing systems, ACT-R models or "cognitive tutors," behavior transition networks, finite-state automata
• Overlay models track what students have learned in a simple way, like a checklist. Highly efficient to develop, but offer less detail than some other model types.
• Bayesian networks and other classifiers ignore the details of how students learn (i.e., they are less concerned with cognitive theories) and instead track general patterns; they are effective for macro-adaptation.
• Constraint-based models monitor the immediate problem state: as long as a learner never reaches a state that the model identifies as wrong, he or she may perform any action. Slow to develop, but with moderately high returns.
• In example-tracing models, authors define incorrect responses for single questions; hence example-tracing systems were called pseudo-intelligent tutors, or "pseudotutors."
• Perturbation models, or "buggy models," try to describe all the incorrect knowledge the learner may have. They can require extensive investment and have mixed results.
• Model-tracing systems use rules drawn from a general model of human cognition, hence they are called "cognitive tutors." Slow to develop, but effective.
… and others. These varieties range from less detailed (overlay models) to more detailed (production rule models).
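Of these varieties, the overlay model is the simplest to illustrate: the learner model is literally a checklist overlaid on the domain's skill list. A minimal, hypothetical sketch (skill names and class names are invented for illustration):

```python
# Hypothetical overlay learner model: a checklist of domain skills,
# marked off as the learner demonstrates each one.

class OverlayModel:
    def __init__(self, domain_skills):
        # Start with every domain skill unmastered.
        self.mastered = {skill: False for skill in domain_skills}

    def observe(self, skill, success):
        # Overlay models track *what* was learned, not how or why.
        if success:
            self.mastered[skill] = True

    def gaps(self):
        # The tutor targets whatever remains unchecked.
        return [s for s, done in self.mastered.items() if not done]

model = OverlayModel(["read_map", "call_for_fire", "report_contact"])
model.observe("read_map", success=True)
model.observe("call_for_fire", success=False)
print(model.gaps())  # skills still to teach
```

The simplicity is the point: this is why overlay models are cheap to build but blind to misconceptions, which is exactly the gap perturbation/buggy models try to fill.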
Model type | Lowest reported development-to-learning time ratio | Highest reported effect on learning | Macro- and/or micro-adaptation
Overlay models | 24:1 | 1.02, compared to on-the-job training | Macro
Bayesian networks and other classifiers | 30:1 | 0.7, compared to work without feedback | Both
Constraint-based models | 220:1 | 1.3, compared to work without feedback | Micro
Example tracing | 18:1 | 0.75, compared to paper homework | Micro
Perturbation and buggy models | 133:1 | Not significant | Both
Production rules and model tracing | 200:1 | 1.2, compared to classroom learning | Micro
Folsom-Kovarik, J. T., & Schatz, S. (2011).
Part 4: Real Example
Process
— Review of literature
— Interviews with SMEs and stakeholders
— Concept designs for team
— Learning objectives
— Dynamic tailoring requirements
— GOTS/COTS trade-off analysis
— Hardware/software feasibility/cost analysis
— Iterative requirements authoring
— Iterative development
— Iterative testing
— Baselining
Concept Creation
[Diagram: PercepTS Virtual Ville concept. The Virtual Ville spatial layout includes multiple OPs (or combat outposts), an OT/control room, a TOC, and an AAR & vicarious learning room.]
Concept Creation: Conceptual Architecture
[Diagram: conceptual architecture.
Input/Output: optical interface devices, positional tracking system, radio interface device, optical system, visualization system.
Synthetic Training Environment: simulation environment (Torque), radio interface controllers, additional simulation plug-ins, SAF behavior authoring interface, virtual team speech generation, speech recognition, NPC controller (SAF behaviors).
Authoring and Management: dynamic tailoring toolkit, training content authoring interface, scenario and lesson toolkit, metrics authoring toolkit.
Modules: domain module, assessment module, AAR module (micro/macro), dynamic tailoring module, trainee module, observer trainer terminal.
Instructional and Expert Knowledge Databases: virtual environment database, scenario content database, patterns of life database, dynamic tailoring database, trainee records database.]
Detailed Learning Objectives
Requirements
Trade-Off Analysis
Part 5:
Recommendations
#1: Report Situated Tutor Development
• Report systems' (a) interactivity (including IMI Levels), (b) forms of adaptation, and (c) integration of features
#2: Report Situated Tutor Evaluation Results
• Empirically assess situated tutors’ effectiveness
and efficiency
• Use ablative conditions, not just versus classroom
• Remark on real-world impacts (e.g., reduced cost per seat,
increased readiness reports)
#3: Expand Intrinsic Adaptation
• Investigate novel situated tutor methods
• Carefully assess adaptations' impacts
• Document categorical types of intrinsic adaptation, their best
uses, and potential pitfalls to avoid
• Consider macro-adaptive approaches, such as dynamic
scenario generation, too
#4: Expand Higher-Order Instruction
• Emphasize sophisticated cognitive, affective, and
psychosocial competencies
• Examine instructional strategies—specifically for situated
tutors—that engender higher-order skills
#5: Embrace “Instructional Fidelity”
• Need to focus on the development of expert mental models
• Need to move beyond just operationally-situated practice
• Need to design instructional experiences for developing expertise
– Thus, scenarios may not always be “realistic”
– But they will include the necessary cues to provide appropriate instruction
Experts don’t just know more, they know differently…
#5: Embrace “Instructional Fidelity” (Cont)
• Instruction built into scenarios
• Apply a rich blend of instructional strategies to simulations
– Need to use scenarios in novel ways (beyond situated practice)
– Need to embed appropriate pedagogical strategies within systems
– Enables more effective training
• Embed educational experiences within the scenario
– Need to reconceptualize “Scenarios” → “Instructional Scenarios”
– Need to focus on the development of expert mental models
– Enables more efficient, situated training
Instructional scenarios are synthetic experiences designed to move a person
from one level of understanding to the next…
Thank you!
Sae Schatz
[email protected]