CAPTURING PLAYER PERFORMANCE
IN GAMES AND SIMULATIONS
Dr. Bob Pokorny
Intelligent Automation, Inc.
GameTech, 2012
Intelligent Automation, Inc.
15400 Calhoun Drive, Suite 400
Rockville, MD 20855
www.i-a-i.com
Topic
“Capturing and Evaluating Player Performance in Games and Simulations”
Abstract promised:
To adapt to a player's needs, assess performance
Assessing performance can be difficult: there are so many paths in complex domains
We'll walk through one method to define scoring rules in open-ended games
Assessment should answer the question "Did trainees learn?" from
– sponsors
– critics
– potential new customers
Assessment should help students learn
– Build better training based on assessment and adaptation
7/17/2015
Proprietary, Intelligent Automation, Inc.
2
Agenda
Know how to answer common questions about
assessment in games
– Develop a checklist
Know about an excellent method to assess performance
in complex games
– Knowing that answering questions about trainee performance in
open-ended simulations is cheap and easy
Ground Rules for the presentation
To learn, you'll need to seek answers to the questions that are important to you
Reasons you should ask yourself questions
Learning Shots
Study / Think / Engage to learn efficiently
Learning is Big
Agenda
Know how to answer common questions about
assessment in games
– Develop a checklist
Know about an excellent method to assess performance
in complex games
– Knowing that answering questions about trainee performance in
open-ended simulations is cheap and easy
Common Questions about training effectiveness
Questions you should be able to answer about assessing your game
I'll give you a clue about a dimension to consider
You answer how your game evaluation does on that dimension
Dimension 1
RE LE BE RE
Kirkpatrick's 4 levels of evaluation
REaction: Did trainees like the training?
LEarning: Did trainees learn?
BEhavior: Did trainees' behavior improve?
REsults: Did organizational results improve?
For any evaluation, you'll want to specify which Kirkpatrick level it's at.
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Dimension 2
Is learning here
applied here?
Does learning transfer from game to target environment?
Dimension 2
Does learning generalize from the context in which it's taught to other contexts?
Dimension 2
Is learning retained over time?
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one environment to another: Yes
– From present to the future: Yes
Dimension 3
Do you collect data from inside or outside the game?
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Dimension 4
Do you collect outcome or process data?
Dimension 4
Outcome measures or process measures
– Outcome measures are the end result
Example: What was your final score in the game? (97)
– Process measures are the moment-to-moment behaviors that lead to the outcome
Example: Things you did while playing the game
– Talk with others
– Consult supporting material
– Practice tasks repeatedly
– Select actions
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Collect outcome or process data: Outcome / Process
Dimension 5
Must be fast, or can be slow?
Dimension 5
Fast performance required, or slow processing allowed?
– Kahneman (2011) describes fast and slow processing
– Rasmussen et al. (1994) describes Automatic (under a few seconds), Rule-Based (seconds), and Conceptual (minutes to days) performance
– Schneider and Shiffrin (1977) distinguish "automatic" versus "controlled" performance
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Collect outcome or process data: Outcome / Process
Speed of required performance: Fast / Slow
Dimension 6
One good solution ("well defined") versus many good solutions ("ill defined")
Dimension 6
Well-defined or ill-defined processes
– One correct solution versus many correct solutions
Well-defined processes are easier to score
Well-defined processes frequently need to be executed quickly
Ill-defined processes have many correct paths
– Many equally good paths
– Some paths better than others
Still, there are many ways to perform poorly
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Collect outcome or process data: Outcome / Process
Speed of required performance: Fast / Slow
Problem complexity: Well defined / Ill defined
Dimension 7
Rules & Regulations
1. Swim at your own risk.
2. Alcoholic beverages are not allowed
3. Dogs are not allowed
4. Do not swim under docks or rails
5. Running on the docks is not allowed
6. Admittance to beach requires a permit
7. Adults must accompany children to toilets
8. No glass containers
Declarative Knowledge: Knowing What
Procedural Knowledge: Knowing How
Dimension 7
Declarative Knowledge and Facts
– Learned well with traditional drill and practice
– Not so well learned with most games and simulations
Procedural Knowledge and Processes
– Learned better in immersive environments and games
– Not so well learned from books
Checklist
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Collect outcome or process data: Outcome / Process
Speed of required performance: Fast / Slow
Problem complexity: Well defined / Ill defined
Declarative versus Procedural: Declarative / Procedural
Agenda
Know how to answer common questions about
assessment in games
– Develop a checklist
Know about an excellent method to assess performance
in complex games
– Knowing that answering questions about trainee performance in
open-ended simulations is cheap and easy
UrbanSim
A standalone PC game that teaches the "Art of Battle Command" in a COIN or Stability Operations environment
– Goals: provide simulated practice experience and expose students to challenges associated with:
Achieving and maintaining situational awareness and understanding in a complex COIN environment
Balancing a wide range of direct actions (lethal and non-lethal) in this type of environment
Anticipating the 2nd- and 3rd-order effects of future decisions and taking those into consideration during one's planning process
Introducing students to issues associated with capacity-building skills in a COIN environment
Assessing effects and progress over time
UrbanSim Development Background
UrbanSim is a carefully developed simulation for leaders to learn the Art of Battle Command; students are placed in a Full Spectrum Operations context
Built based on
– Schoolhouse Subject Matter Experts: School for Command Preparation (SCP), Ft. Leavenworth
– Counterinsurgency instruction, Ft. Riley
– Completed Cognitive Task Analysis (CTA): extensive interviews with seven former battalion commanders
– CTA reviewed by COL Hickey, LTC (R) Nagl, and LTC Potter
Doctrinal References:
– Full-Spectrum Operations (FM 3-0)
– Counterinsurgency (FM 3-24)
– Stability Operations (FM 3-07)
– Information Operations (FM 3-13)
– Tactics in Counterinsurgency (FM 3-24.2)
– Intel Prep of Battlefield (FM 34-130)
– Training the Force (FM 7-0)
IAI Proprietary
40
UrbanSim screenshot
Within a Game
Kirkpatrick Level: RE / LE / BE / RE
Transfer:
– From trainer to job: Yes
– From one job environment to another: Yes
– From present to the future: Yes
Collect data inside or outside game: Inside / Outside
Collect outcome or process data: Outcome / Process
Speed of required performance: Fast / Slow
Problem complexity: Well defined / Ill defined
Declarative versus Procedural: Declarative / Procedural
Agenda
Know how to answer common questions about assessment in games
– Develop a checklist
Know about an excellent method to assess performance in complex games
– First, review common needs and methods of assessing performance in complex games
– Second, focus on the recommended method to assess performance
General Approaches to Scoring within games
Different approaches to assessing performance in games:
1. Use levels
2. Relate performance (behaviors) to
student knowledge (Psychological constructs)
3. Experts review performance to inform scoring rules
Use Levels
Simple View: a ladder of levels (L1 to L6), with advancement criteria such as "First time," "Right order," and "Right ingredients"
Use Levels
Simple View: a ladder of levels (L1 to L6)
Levels are linked to task performance
Specify capabilities needed to move from level to level
Can assert improvement in capabilities
Use Levels
Focused on outcome measures
– Did I get shot?
– Did I win the population over to support the Host Nation Government?
But process measures are helpful for remediation:
– If I got shot, was I doing what I was supposed to do to accomplish the mission and be as safe as possible?
– How could I have won over the population?
Focus on outcomes makes remediation more difficult
General Approaches to Scoring within games
Different approaches to assessing performance in games:
Use levels
Relate performance to student knowledge
Experts review performance to create scoring rules
Relate Performance to Student Knowledge
Psychological constructs
Actions
Nodes of knowledge: Each action adjusts estimates of knowledge
Relate Performance to Student Knowledge
How to change student knowledge scores from actions:
Student Variables (Time 1) → Instruction → Student Variables (Time 2) → Instruction → Student Variables (Time 3), with Performance observed at Time 2 and Time 3
Relate Performance to Student Knowledge
Relate actions to knowledge
– What are the knowledge nodes?
– How do the actions lead to changes in estimated knowledge?
– Once you estimate student knowledge, how do you use it for scoring performance or assigning instruction?
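One common way to implement the action-to-knowledge link described above (the slides do not name a specific method) is Bayesian knowledge tracing: each observed action revises the estimated probability that a knowledge node is mastered. A minimal Python sketch, with all parameter values assumed for illustration:

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """Bayesian knowledge tracing: revise the estimate that a skill is
    known after observing one student action, then apply a learning step.
    All parameter values here are illustrative assumptions."""
    if correct:
        evidence = p_known * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_known) * p_guess)
    else:
        evidence = p_known * p_slip
        posterior = evidence / (evidence + (1 - p_known) * (1 - p_guess))
    # Chance the skill was learned during this practice opportunity
    return posterior + (1 - posterior) * p_learn

# The estimate drifts upward with correct actions, downward with errors
p = 0.3
for outcome in [True, True, False, True]:
    p = bkt_update(p, outcome)
```

The resulting per-node estimates can then drive scoring or instruction assignment, as the bullet above asks.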
General Approaches to Scoring within games
Different approaches to assessing performance in games:
Use levels
– Students are learning if they progress up levels
– This approach is better if levels are tied to standards of achievement
Relate performance to student knowledge
– Used sometimes in Intelligent Tutoring Systems
– Frequently uses complex statistics to link actions to presumed
knowledge.
Experts review performance to create scoring rules
Experts review performance to create scoring rules
Introduce the process
Why we recommend it
Ask you to think about it for your game
Walk through the process in more detail
Compile and address your objections to this process
Introduce Performance Evaluation by Expert Review
Performance Evaluation by Expert Review (PEER) captures expert knowledge applied to assessment
– Experts review and report their impressions of students' performance
– Analysts transform experts' reviews into scoring policies and rules
Within a Feature: Scoring Policies and Rules
Example policy and rule within the security feature:
Policy (Security): Follow a Clear and Hold strategy, per Higher's intent: begin with establishing security, based on current intel. As security improves, devote fewer resources.
Rules: Security risks are scored as
– Bad actors: 2 points per turn for each of three existing insurgent groups (Al Qassas, JAAS, or Shia death squads)
– Attacks: every attack (IED, gas station attack) is one point showing a risk
Security activities are scored as
– High points (1.5), e.g., Cordon and Search, Seize Cache
– Medium points (1.0), e.g., Cordon and Knock, Checkpoint
– Low points (0.5), e.g., Recruit
Within a Feature: Scoring Policies and Rules
In UrbanSim, we identified ~25 scoring policies from experts' comments
Each scoring rule identifies whether the trainee is
– Complying with the policy
– Violating the policy
If the trainee violates the policy:
– points are deducted
– the violated policy becomes a target for remediation
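The security rule above can be sketched as a pair of small scoring functions. This is an illustrative sketch, not the actual UrbanSim implementation; the event names and per-turn log structure are assumptions:

```python
# Point values from the rule above; action identifiers are hypothetical
ACTIVITY_POINTS = {
    "cordon_and_search": 1.5, "seize_cache": 1.5,  # high-value activities
    "cordon_and_knock": 1.0, "checkpoint": 1.0,    # medium
    "recruit": 0.5,                                # low
}

def security_risk(turn):
    """Risk for one turn: 2 points per surviving insurgent group,
    plus 1 point per attack (IED, gas station attack, ...)."""
    return 2 * len(turn["active_insurgent_groups"]) + len(turn["attacks"])

def security_activity(turn):
    """Credit for the trainee's security activities in one turn."""
    return sum(ACTIVITY_POINTS.get(a, 0.0) for a in turn["actions"])

turn = {
    "active_insurgent_groups": ["Al Qassas", "JAAS"],
    "attacks": ["IED"],
    "actions": ["cordon_and_search", "checkpoint"],
}
# risk: 2 groups * 2 + 1 attack = 5; activity credit: 1.5 + 1.0 = 2.5
```

Because each rule is a small explicit function like this, a violated policy can be reported directly as a remediation target.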
Experts review performance to create scoring rules
Introduce the process
Why we recommend it
Ask you to think about it for your game
Walk through the process in more detail
Compile and address your objections to this process
Why we recommend PEER
It produces accurate scores
– Scores correlate > .84 with expert scores
– In a complex equipment troubleshooting example, scores correlated .88 with experts
The scores are valid
– In the complex equipment troubleshooting example, scores on scenarios correlated with time on the job in the .70s
PEER has been used in 3 very different contexts:
– Complex equipment troubleshooting: jazillions of possible actions
– Counterinsurgency decision making: dynamic environment
– Air-to-air intercept: perceptually rich, speeded decisions
The scoring system can be used to guide remediation
The scoring system is inspectable
The scoring system is affordable
Experts review performance to create scoring rules
Introduce the process
Why we recommend it
Ask you to think about it for your game
Walk through the process in more detail
Compile and address your objections to this process
Your Slide
What game(s) do you need assessed?
Experts review performance to create scoring rules
Introduce the process
Why we recommend it
Ask you to think about it for your game
Walk through the process in more detail
Compile and address your objections to this process
Our approach to PEER
1. Collect student work samples; represent samples for expert review
2. Experts judge (rank and score) the overall quality of student work samples
3. Experts verbalize the reasons underlying their assessments
4. If experts' holistic scores are reliable, transform reasons into policies
5. Submit work samples to the rules to yield scores; refine as needed
6. Evaluate accuracy of PEER scores on new work samples
PEER applied to UrbanSim
Step 1: Collect logged student data; represent for expert review
<GAME_OBJECT
  DELETE="false"
  DESCRIPTION=""
  ETHNICITY="N/A"
  OBJECT_ID="G CO b"
  IMAGE="GuiIcon_GCOb"
  INSTANCE="false"
  NAME="G CO b"
Log files are >4 MB
UrbanSim instructors suggested the presentation format for student records
PEER applied to UrbanSim
Step 2: Experts judge (rank and score) overall quality of student work samples
Example expert scores for six work samples: 98, 90, 82, 75, 68, 60
PEER applied to UrbanSim
Step 3: Experts verbalize reasons underlying assessments
– Example: SME from the School of Command Preparation reviewing student record "BA"
Review of Reasons, Period 1:
Good: non-lethal engagements, establishing relationships, Bn Cdr meeting with mayor, good to meet with key leaders
Bad: commander's intent is civil security, but not doing security (only 1 unit), not in sync with higher headquarters
PEER applied to UrbanSim
Step 4: If expert scores are significantly correlated, transform reasons into a scoring key
Based on six factors identified by SMEs:
– Security
– Host Nation Support
– Assist with infrastructure development
– Information Operations
– Meetings with civilian leaders
– Consistency of effort
PEER applied to UrbanSim
Example policies and rules:
Policy (Security): Follow a Clear and Hold strategy, per Higher's intent: begin with establishing security, based on current intel. As security improves, devote fewer resources.
Rules: Security risks are scored as
– Bad actors: 2 points per turn for each of three existing insurgent groups (Al Qassas, JAAS, or Shia death squads)
– Attacks: every attack (IED, gas station attack) is one point showing a risk
Security activities are scored as
– High points (1.5), e.g., Cordon and Search, Seize Cache
– Medium points (1.0), e.g., Cordon and Knock, Checkpoint
– Low points (0.5), e.g., Recruit
PEER applied to UrbanSim
Step 5: Code rules, and submit work samples to the rules to yield scores
The coded scoring rules are applied to the raw game log (>4 MB per student) to produce per-feature scores, for example:
Overall: 82
Security: 92
Host Nation Support: 68
Infrastructure: 86
Info Ops: 79
Meetings: 74
Consistency: 90
PEER applied to UrbanSim
Step 5 (continued): Compare results from the scoring key and experts' scores
Correlation of .84 (p < .0001) between the scoring worksheet and the average of the three experts.
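The comparison in this step is a correlation between rule-based scores and the experts' average scores. A minimal Pearson-correlation sketch in Python; the score values below are made up for illustration, not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-student scores: rule-based versus average of expert raters
rule_scores   = [82, 75, 91, 60, 88, 70]
expert_scores = [80, 78, 95, 58, 85, 74]
r = pearson(rule_scores, expert_scores)  # close to 1 when rules track experts
```

A correlation near the reported .84 would indicate the coded rules reproduce the experts' holistic judgments well.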
PEER (to be) applied in UrbanSim
Step 6: Apply scoring rules to new data; test accuracy
[Figure: sample troubleshooting traces of action/result pairs (ohm checks, VDC readings at the RAG, swapping the UUT) from the complex equipment troubleshooting example]
Validation pipeline from that example:
– 11 work samples, scored by 2 SMEs, were used to create the scoring rules (Focus on UUT, Jump to TP, Jump to TS, Jump to data, Poor equipment, Off path, Poor interpretation, with penalties ranging from -5 to -15)
– The scoring rules were then applied to 24 other work samples, which the SMEs also scored
– Correlation between rule-based scores and SME scores = .88
Conclusions about PEER with UrbanSim
PEER is robust
– Sub-optimal data: it worked well with data in a log file intended for software testing
Some student information-collection answers were not captured
Data were collected at Fort Leonard Wood while experts were distracted by tornadoes
PEER is efficient
– Used an existing log file
– Experts could quickly assign scores and easily give rationales for scores
PEER results in explicit, understandable policies
– Scoring policies are explicit and clear
PEER results can be used to guide instruction
– Identifies violations of good policies, which can become targets of instruction
Why PEER works so well
Experts apply their years of experience with situations like those in the simulation to assess students' performance
Experts use a rich set of data
– They can look over a student's body of work on a problem and infer intentions and plans (or the lack of them)
– This is easier than making judgments based on less data
The natural language of experts' critiques specifies the "grain size" of the analysis
Experts review performance to create scoring rules
Introduce the process (high level)
Why we recommend it (high level)
Ask you to think about it for your game
What this process is (in more detail)
Consider your comments/objections to this process
Can you apply PEER to your game?
Why do you think this will work?
What would make this not work?
Comments:
Can you apply PEER to your game?
Previous questions:
But my game is sooooo complicated
Experts don’t know players’ intentions
But there’s no simulation yet
My game doesn’t collect data
It’s too simple to guide remediation
Agenda
Know how to answer common questions about
assessment in games
– Develop a checklist
Know about an excellent method to assess performance
in complex games
– Knowing that answering questions about trainee performance in
open-ended simulations is cheap and easy
Acknowledgements
Soldiers and experts at Ft. Leonard Wood and Ft. Leavenworth
ADL
– Kristy Murray, Elaine Raybourn, and Steve Hicks
AFRL
Next Steps
Apply PEER further
– To assess performance in more games
– For guiding construction of expert model
– For guiding instruction
CAPTURING PLAYER PERFORMANCE
IN GAMES AND SIMULATIONS
DR. BOB POKORNY
INTELLIGENT AUTOMATION, INC
301 294-4750
[email protected]
GAMETECH, 2012
Intelligent Automation, Inc.
15400 Calhoun Drive, Suite 400
Rockville, MD 20855
www.i-a-i.com