Setting Up Learning Objectives and Measurement for Game Design Girlie C. Delacruz and Ayesha L. Madni
Serious Play Conference
Los Angeles, CA – July 21, 2012
Overview
• Assessment Validity
• Components of Assessment Architecture
• Create assessment architecture (Your Example)
What is so hard?
What are some of your challenges?
[Diagram: linking "Passed the Game" to the learning domain via gameplay log data]
Challenges We Have
• Translating objectives into assessment outcomes
  – Purpose of assessment information
  – Communication between designers and educators
• Game is already developed and its effectiveness needs to be assessed
  – Cannot change the code; limited to wraparounds
How can we meet the challenge?
Front-end Efforts Support Effectiveness
• Instructional requirements
• Assessment requirements
• Technology requirements
Model-Based Engineering Design
• Communication
• Collaboration
Part One
ASSESSMENT VALIDITY
What Is Assessment?
Assessment (noun) = Test
Assessment As A Verb
ASSESSMENT = Process of drawing reasonable inferences about what a person knows by evaluating what they say or do in a given situation.
Games As Formative Assessment
Formative Assessment:
Use and interpretation of task performance information with intent to adapt learning, such as provide feedback. (Baker, 1974; Scriven, 1967).
Games As Formative Assessment
Games as Formative Assessment:
Use and interpretation of game performance information with intent to adapt learning, such as provide feedback.
What is Validity?
Assessment Validity as a Quality Judgment
• Critical Analysis
• Legal Judgment
• Scientific Process
Assessment Validity
ASSESSMENT VALIDITY = Bringing evidence and analysis to evaluate the propositions of the interpretive argument.
(Linn, 2010)
How Does This Relate to Design?
① Identification of the inferences to be made.
• What do you want to be able to say?
② Specificity about the expected uses and users of the learning system.
• Define boundaries of the training system
• Determine need for supplemental resources
③ Translation into game mechanics.
④ Empirical analysis of judgment of performance within the context of assumptions.
What do you want to be able to say about the gameplayer(s)?
• Player mastered the concepts.
How do you know?
• Because they did x, y, z (player history)
• Because they can do a, b, c (future events)
Identify Key Outcomes: Defining Success Metrics
• Quantitative Criteria (Generalizable)
  – % of successful levels/quests/actions
  – Progress into the game
  – Changes in performance
    • Errors
    • Time spent on similar levels
    • Correct moves
• Qualitative Criteria (Game-specific)
  – Patterns of gameplay
  – Specific actions
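To make the quantitative criteria above concrete, here is a minimal sketch of computing them from gameplay log data. The log schema (level, passed, errors, seconds) and the function name are illustrative assumptions, not a real game's telemetry format.

```python
# A minimal sketch: summarize quantitative success metrics from a gameplay log.
# The log format (one dict per attempted level) is an assumption for illustration.

def success_metrics(log):
    """Summarize a player's log: list of {"level", "passed", "errors", "seconds"}."""
    attempts = len(log)
    passed = [entry for entry in log if entry["passed"]]
    return {
        "pct_successful": len(passed) / attempts if attempts else 0.0,
        "progress": max((e["level"] for e in passed), default=0),  # furthest level cleared
        "total_errors": sum(e["errors"] for e in log),
        "avg_seconds_per_level": sum(e["seconds"] for e in log) / attempts if attempts else 0.0,
    }

log = [
    {"level": 1, "passed": True,  "errors": 0, "seconds": 42},
    {"level": 2, "passed": True,  "errors": 2, "seconds": 75},
    {"level": 3, "passed": False, "errors": 5, "seconds": 90},
]
print(success_metrics(log))
```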
[Ontology diagram: construct nodes (speed, motion direction, duration) connected by prerequisite links]
BACKGROUND LAYER
Prior knowledge; game experience; age, sex; language proficiency
CONSTRUCT LAYER
Construct, subordinate constructs, and interdependencies
INDICATOR LAYER
Behavioral evidence of construct
FUNCTION LAYER
f_n(e_1, e_2, e_3, ...; s_1, s_2, s_3, ...): computes an indicator value given raw events and game states
EVENT LAYER
Player behavior and game states: game events and states (e_1, e_2, e_3, ...; s_1, s_2, s_3, ...)
General Approach
• Derive structure of measurement model from ontology structure
• Define "layers"
  – Background: demographic and other variables that may moderate learning and game performance
  – Construct: structure of knowledge dependencies
  – Indicator: input data (evidence) of construct
  – Function: set of functions that operate over the raw event stream to compute indicator values
  – Event: atomic in-game player behaviors and game states
• Assumptions
  – The chain of reasoning among the layers is accurate
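A minimal sketch of the function layer under assumed event names: an indicator function f_n consumes the raw event stream and game states and returns an indicator value that serves as behavioral evidence for a construct. The event schema and the simple ratio scoring are hypothetical, not the authors' actual model.

```python
# A minimal sketch of one function-layer indicator: f_n operates over
# raw events (e1, e2, ...) and attached game states (s1, s2, ...) to
# compute an indicator value. Event names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Event:
    name: str    # atomic player behavior, e.g. "push_applied"
    state: dict  # game state at the time of the event

def f_correct_push_rate(events):
    """Indicator: share of pushes applied in the correct direction."""
    pushes = [e for e in events if e.name == "push_applied"]
    if not pushes:
        return None  # no evidence yet for this indicator
    correct = [e for e in pushes if e.state.get("direction_ok")]
    return len(correct) / len(pushes)

stream = [
    Event("push_applied", {"direction_ok": True}),
    Event("object_moved", {"speed": 1.2}),
    Event("push_applied", {"direction_ok": False}),
]
print(f_correct_push_rate(stream))  # 0.5 -> evidence for the "direction" construct
```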
Part Two
ASSESSMENT ARCHITECTURE
Components of Assessment Architecture
• DOMAIN REPRESENTATION
  – Instantiates domain-specific information and practices
  – Guides development
  – Allows for external review
• COGNITIVE DEMANDS
  – Defines targeted knowledge, skills, abilities, and practices
  – Domain-independent descriptions of learning
• TASK SPECIFICATIONS
  – Defines what the students do (tasks/scenarios, materials, actions)
  – Defines rules and constraints
  – Defines scoring
Cognitive Demands
What kind of thinking do you want to capture?
• Adaptive, complex problem solving
• Conceptual, procedural, and systemic learning of content
• Transfer
• Situation awareness and risk assessment
• Decision making
• Self-regulation
• Teamwork
• Communication
Domain Representation
• External representation(s) of domain-specific models
• Defines the universe (or boundaries) of what is to be learned and tested
Example: Math
• Ontologies
• Knowledge specifications
• Item specifications
Task Specifications
① Operational statement of content and behavior for the task
  – Content = stimulus/scenario (what will the users see?)
  – Behavior = what the student is expected to do / the response (what will the users do?)
② Content limits
  – Rules for generating the stimulus/scenario posed to the student
  – Permit systematic generation of scenarios with similar attributes
③ Response descriptions
  – Map user interactions to cognitive requirements
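One way to read item ② is as a generator: content limits define a space of allowable attribute values, and scenarios with similar attributes are drawn systematically from that space. The attribute names and values in this sketch are illustrative assumptions loosely based on the force-and-motion example that follows, not an actual task-generation system.

```python
# A minimal sketch of systematic scenario generation from content limits.
# The attribute names/values are hypothetical, chosen only to illustrate
# how limits bound the universe of scenarios posed to the student.

import itertools
import random

CONTENT_LIMITS = {
    "push_strength": ["small", "medium", "big"],  # qualitative strengths
    "object": ["door", "book", "ruler"],          # allowable objects
    "fulcrum_fixed": [True, False],               # constraint on fulcrum position
}

def generate_scenarios(limits):
    """Enumerate every scenario allowed by the content limits."""
    keys = list(limits)
    for values in itertools.product(*(limits[k] for k in keys)):
        yield dict(zip(keys, values))

scenarios = list(generate_scenarios(CONTENT_LIMITS))
print(len(scenarios))            # 3 * 3 * 2 = 18 parallel scenarios
print(random.choice(scenarios))  # sample one scenario to pose to the student
```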
Force and Motion

Pushes and pulls can have different strengths and directions. Pushing or pulling on an object can change the speed or direction of its motion and can start or stop it. Each force acts on one particular object and has both strength and a direction.

Energy

The faster a given object is moving, the more energy it possesses.

Task example 1
• NGSS performance expectation: Plan and conduct an investigation to compare the effects of different strengths of pushes on the motion of an object (K-PS2-1).
• Content limits:
  – Effects: change in position; increased or decreased acceleration
  – Strengths of pushes: qualitative (small, medium, big) or quantitative
  – Type of motion: rotational
  – Constraints on planar objects: must be something that can be pushed horizontally and attached to its fulcrum (e.g., the door to a house)
  – Allowable variations on objects: mass, height and width, location of object
  – Constraints on fulcrum objects: must be attached to the planar object; position of fulcrum object cannot be changed
• Targeted science and engineering practice(s): Ask questions that can be investigated based on patterns such as cause-and-effect relationships.
• Response description: Ask questions: query the MARI about the properties of the objects (e.g., what is the distance between the hinge and where I pushed) based on observed outcomes (e.g., how hard it was to push the door, or how far the door moved).
• Task complexity: Student has only 4 attempts to pass the ball to the girl and can vary only the position and strength of the push.
• Available resources: Data: distance, slope, time, speed.

Task example 2
• NGSS performance expectation: Analyze data to determine if a design solution works as intended to change the speed or direction of an object with a push (K-PS2-2).
• Content limits:
  – Speed change: increase in acceleration
  – Direction: vertical movement
  – Constraints on planar objects: must be something flat (e.g., book, frame, ruler) that can be placed on another object and can be pushed in a downward movement
  – Allowable variations on planar objects: mass, height and width, location of object in the room, surface material
  – Constraints on fulcrum objects: the structural properties of the fulcrum should support some, but not all, of the set of planar objects; position of fulcrum object can be changed
• Targeted science and engineering practice(s): Use observations to describe patterns and/or relationships in the natural and designed world(s) in order to answer scientific questions and solve problems.
• Response description: Use observations: use snapshot images of activity in the HRLA with overlaid measurement data generated by the MARI to sort situations based on the physical features, behaviors, or functional roles in the design.
• Task complexity: Easy: student can vary the position and strength of the push, but must apply force by placing additional objects on the planar object and pushing downward with both hands (to connect the kinesthetic experience of applying the force with hands-on experience of the object). Harder: student can vary both the position and strength of the push and how the planar object is placed on the fulcrum (e.g., load is moved closer to or further away from the fulcrum).
• Available resources: Iconic and graphical representations of the underlying physics laws will be on the screen and will change based on student actions. Guided questions will ask students about distance, mass, force magnitude and direction, height, and slope based on observed outcomes.
Components of Computational Model
Components of Decision Model
Courses of Action
• Do nothing: move on, end task
• Get more evidence or information: repeat same task, perform similar task, ask a question
• Intervene (instructional remediation): give elaborated feedback, worked example or added scaffolding, more supporting information
• Intervene (task modification): new task (reduced or increased difficulty), new task (qualitatively different)
Components of Decision Model
Decision Factors
• Confidence of diagnosis: How certain are we about the hypothesized causal relation?
• Consequence of misdiagnosis: What happens if we get it wrong? What are the implications of ignoring other possible states or causal relations?
• Effectiveness of intervention: How effective is the intervention we will give after diagnosis?
• Constraints: Do we have efficiency concerns with respect to time or resources?
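A minimal sketch of how these decision factors might combine to select one of the courses of action above. The thresholds and the priority ordering are illustrative assumptions, not a validated policy.

```python
# A minimal sketch of a decision model: map decision factors (confidence of
# diagnosis, consequence of misdiagnosis, constraints) to a course of action.
# Thresholds are illustrative assumptions.

def choose_action(diagnosis_confidence, misdiagnosis_cost, time_left_s):
    """Pick a course of action given the decision factors."""
    if time_left_s <= 0:
        return "do nothing: end task"  # constraints dominate
    if diagnosis_confidence < 0.5:
        return "get more evidence: perform similar task"
    if diagnosis_confidence < 0.8 and misdiagnosis_cost == "high":
        return "get more evidence: ask a question"  # hedge when stakes are high
    return "intervene: give elaborated feedback"

print(choose_action(diagnosis_confidence=0.6, misdiagnosis_cost="high", time_left_s=300))
# -> "get more evidence: ask a question"
```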
Part Three
ASSESSMENT ARCHITECTURE (YOUR EXAMPLE)
Assessment Architecture

Fixed Variables: Task characteristics + Context (test, simulation, game) + Person (prior knowledge and experience)

Assumptions and Design Rationale

Performance to be Assessed
• Observed Event(s): What happened? (Raw data? Scored information?)

Judgment of Performance
• Translation: What does this mean?

Assessment Validation
• Inferences: What are the potential causes of the observed events? Characteristics of the task? Context? Lack of knowledge? Not sure?
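A minimal sketch of the observation-to-inference chain above: an observed event is translated into scored information, and inference enumerates potential causes while accounting for person variables. The field names, scoring rule, and candidate causes are hypothetical; a real validation effort would test each link of this chain empirically.

```python
# A minimal sketch of the architecture's chain: observed event -> translation
# (scoring) -> inference (candidate causes). All fields and rules are
# illustrative assumptions.

def translate(event):
    """Translation: turn a raw observed event into scored information."""
    return {"passed": event["pushes_needed"] <= event["pushes_allowed"]}

def infer(score, person):
    """Inference: list plausible causes of the scored performance."""
    if score["passed"]:
        return ["player understands push strength and direction"]
    causes = ["lack of knowledge", "task too hard (task characteristics)"]
    if person["game_experience"] == "low":
        causes.append("unfamiliar controls (context, not knowledge)")
    return causes

event = {"pushes_needed": 6, "pushes_allowed": 4}               # what happened?
person = {"prior_knowledge": "some", "game_experience": "low"}  # fixed variables
print(infer(translate(event), person))
```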
Potential Courses of Action
• No intervention: move on, end task
• Get more evidence or information: repeat same trial, perform similar task, ask a question
• Intervene
  – Instructional remediation: give elaborated feedback, worked example, add scaffolding, more information
  – Modify task: new task with reduced difficulty