The Portal Assessment Design System


ECD as KR*
Robert J. Mislevy, University of Maryland
Roy Levy, University of Maryland
Eric G. Hansen, Educational Testing Service
(builds on work with Linda Steinberg and Russell Almond)
March 6, 2003
* Evidence-centered design as knowledge representation
Slide 1
Knowledge Representations

A knowledge representation (KR) is a structure for
expressing, communicating, and thinking about important
entities and relationships in some domain.
• Maps, wiring diagrams, physics equations, nested lists.
• Object models, for business systems and computer systems.
• Evidence-centered design models & structures

KRs are surrogates for something else: a real-world situation, or a class of situations, or a representation in other KRs.
• They capture some entities and relationships, but ignore others.
• The included entities, relationships, and processes are the ontology of the KR: what kinds of things you think about, and how.
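
To make the idea concrete, here is a minimal sketch of what fixing a KR's ontology can look like in an object model; the class names, fields, and example content are hypothetical illustrations, not part of ECD or the Portal system.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, minimal ontology for an assessment KR.
# It captures claims, evidence, and tasks, and deliberately ignores
# everything else (scheduling, cost, delivery mode, ...).

@dataclass
class ProficiencyClaim:
    """Something we want to say about a student (an entity in the ontology)."""
    label: str                          # e.g., "can troubleshoot a series circuit"

@dataclass
class Observation:
    """Something a student says, does, or makes that bears on a claim."""
    description: str
    supports: List[ProficiencyClaim]    # relationship: observation -> claims

@dataclass
class Task:
    """A situation designed to evoke the observations we need."""
    prompt: str
    intended_observations: List[Observation] = field(default_factory=list)

# The KR is a surrogate: reasoning over these objects stands in for
# reasoning about real students working on real tasks.
claim = ProficiencyClaim(label="can design a controlled experiment")
obs = Observation(description="student isolates one variable per trial",
                  supports=[claim])
task = Task(prompt="Find out which of three factors affects plant growth.",
            intended_observations=[obs])
```

Everything the ontology leaves out is invisible to reasoning carried out in this KR, which is exactly the trade-off the bullet above describes.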
Slide 2
KRs are useful when they highlight important
relationships and make them easier to work with.

KRs facilitate analogies across problems and domains.
• In what ways are AP Studio Art, the SAT, Hydrive, and a language
proficiency oral interview alike?

KRs make it easier to acquire and structure information.
• E.g., ECD design process in ETS Teaching & Learning programs

KRs can facilitate working together.
• ECD object model for sharing, re-using, repurposing the elements
and processes in assessments.

KRs are significant in planning.
• What will a solution have to look like? What elements in assessments
can vary substantially, but what relationships must hold?

Overlapping KRs coordinate work in complex systems.
• Multiple ECD KRs, with bridges among them, for different, interrelated
parts of assessment (substance to argument to specs & models to
operation to reporting).
Slide 3
*** Warning -- cognitive overload ***
Slide 4
Where you usually start:
What are all the kinds of things that are
important to know and do, when and
how? What does good work look like?
Not generally organized according to
assessment arguments.
(In ECD, “domain analysis”)
KRs: Idiosyncratic from domains, as
evolved to suit domain purposes.
Slide 5
Where you usually want to go:
Operational assessment system: Pieces and
processes that gather, evaluate,
and report, to achieve assessment purpose.
(In ECD, “assessment delivery system.”)
KRs: Object model for delivery system.
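
The ECD literature describes assessment delivery in terms of four interacting processes: activity selection, presentation, evidence identification, and evidence accumulation. The sketch below is a deliberately simplified, hypothetical rendering of that cycle; the function names, data shapes, and scoring rule are illustrative only, not the Portal delivery object model.

```python
from typing import Callable, Dict, List

def select_activity(student_model: Dict[str, float], pool: List[dict]) -> dict:
    """Activity selection: pick the next task; here simply the next unused one."""
    return pool.pop(0)

def present(task: dict, respondent: Callable[[str], str]) -> str:
    """Presentation: administer the task and capture a work product."""
    return respondent(task["prompt"])

def identify_evidence(work_product: str, task: dict) -> Dict[str, int]:
    """Evidence identification: evaluate the work product into observables (toy rubric)."""
    return {"correct": int(work_product.strip().lower() == task["key"])}

def accumulate_evidence(student_model: Dict[str, float],
                        observables: Dict[str, int]) -> None:
    """Evidence accumulation: update beliefs; here a running proportion correct."""
    n = student_model.get("n", 0) + 1
    prev = student_model.get("prop_correct", 0.0)
    student_model["n"] = n
    student_model["prop_correct"] = prev + (observables["correct"] - prev) / n

def run(pool: List[dict], respondent: Callable[[str], str]) -> Dict[str, float]:
    """One pass through the gather-evaluate-report cycle."""
    student_model: Dict[str, float] = {}
    while pool:
        task = select_activity(student_model, pool)
        work_product = present(task, respondent)
        observables = identify_evidence(work_product, task)
        accumulate_evidence(student_model, observables)
    return student_model

# Example: two toy items, a respondent that always answers "b".
tasks = [{"prompt": "2 + 2 = ? (a) 3 (b) 4", "key": "b"},
         {"prompt": "Capital of France? (a) Paris (b) Lyon", "key": "a"}]
print(run(tasks, lambda prompt: "b"))   # {'n': 2, 'prop_correct': 0.5}
```

In an operational system the accumulation step would be a psychometric model such as a Bayes net or measurement-model equation (Slide 12), not a running proportion.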
Slide 6
How do you get from here to there? That is, from
knowledge about the domain, to objects and
processes that meet the purposes you had in mind?
Slide 7
What’s in between (1):
Assessment argument: What knowledge,
skill, accomplishments, etc., of students do
you want to draw inferences about? What
do you need to see them say, do, or make?
What circumstances can evoke this
evidence? (Messick, 1984)
KRs: Toulmin & Wigmore diagrams
%% EH: How about: “What’s in between (part 1)”? Just a thought.
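
As a sketch of how a Toulmin-style argument can itself be carried in a KR, the record below names the claim, the data, the warrant connecting them, and alternative explanations to rule out. The field names follow Toulmin's general scheme and the content is invented, so this is illustrative rather than an ECD or Portal data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ToulminArgument:
    claim: str                      # inference about the student
    data: List[str]                 # what the student said, did, or made
    warrant: str                    # why the data bear on the claim
    backing: str = ""               # theory or experience supporting the warrant
    alternative_explanations: List[str] = field(default_factory=list)

arg = ToulminArgument(
    claim="Student can plan a controlled experiment.",
    data=["Varied one factor at a time across trials",
          "Held all other factors constant"],
    warrant="Controlling variables is required to isolate causal effects.",
    backing="Research on scientific inquiry skills.",
    alternative_explanations=["Copied a procedure without understanding it"],
)
```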
Slide 8
What’s in between (2):
Organizing argument in KRs that presage the
structure of assessment elements and
processes. Still substantively meaningful.
(In ECD, “Domain Modeling”)
KRs: ETS “paradigms”; T&L forms; PADI
Design Patterns; Bayes nets for arguments;
BEAR construct map structure.
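
As one illustration of a domain-modeling KR, a design pattern in the PADI spirit can be written down as a structured record linking focal knowledge and skill to the observations and task features that could evidence it. The attribute names below approximate those used in PADI design patterns, and the example content is invented.

```python
# Hypothetical sketch of a PADI-style design pattern as a plain record.
design_pattern = {
    "title": "Model-based reasoning in ecosystems",
    "rationale": "Reasoning from a model to predictions is central to inquiry.",
    "focal_ksas": ["Use a food-web model to predict effects of a change"],
    "additional_ksas": ["Reading graphs", "Domain vocabulary"],
    "potential_observations": ["Prediction is consistent with the model",
                               "Justification cites model relationships"],
    "potential_work_products": ["Written prediction", "Annotated diagram"],
    "characteristic_features": ["A model is given or must be constructed"],
    "variable_features": ["Familiarity of the ecosystem", "Scaffolding level"],
}
```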
Slide 9
What’s in between (2, continued):
Also has implications for what information you need before administration of the assessment, and how you can interpret the results.
(In ECD, “Domain Modeling”)
KRs: ETS “paradigms”; T&L forms; PADI Design Patterns, Bayes nets for arguments.
%% EH: Very good point for disability access.
Slide 10
What’s in between (2, continued):
Need to establish correspondence between the common assessment KRs and the domain-specific KRs that address key entities and relationships from that domain, as they need to be organized into the assessment argument, thence assessment structures. E.g., BioKIDS’ structure/demand matrices and FOSS’s filled-in construct maps.
(In ECD, “Domain Modeling”)
KRs: ETS “paradigms”; T&L forms; PADI Design Patterns, Bayes nets for arguments.
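
Because “Bayes nets for arguments” recurs in these KR lists, here is a deliberately tiny, hypothetical example of the idea: one student-model variable with two conditionally independent observables, updated by enumeration. The variable names and probabilities are invented for illustration.

```python
# Hypothetical Bayes net fragment: Proficiency -> {obs1, obs2}.
prior = {"high": 0.5, "low": 0.5}                 # P(Proficiency)

# P(observable is answered correctly | Proficiency state)
p_correct = {
    "obs1": {"high": 0.85, "low": 0.40},
    "obs2": {"high": 0.75, "low": 0.30},
}

def posterior(evidence: dict) -> dict:
    """Update belief about Proficiency given correct/incorrect outcomes.

    evidence maps observable name -> True (correct) or False (incorrect).
    """
    unnorm = {}
    for state, p in prior.items():
        likelihood = 1.0
        for obs, correct in evidence.items():
            pc = p_correct[obs][state]
            likelihood *= pc if correct else (1.0 - pc)
        unnorm[state] = p * likelihood
    total = sum(unnorm.values())
    return {state: v / total for state, v in unnorm.items()}

print(posterior({"obs1": True, "obs2": False}))   # roughly {'high': 0.43, 'low': 0.57}
```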
Slide 11
What’s in between (3):
Models and specifications for operational
elements and processes.
(ECD “Conceptual Assessment Framework”)
KRs: Student, Evidence, & Task models; Bayes nets;
Measurement-model equations; Task Templates;
Generalized rubrics, scoring algorithms
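
As one familiar instance of a measurement-model equation (not necessarily the model any particular Portal assessment would use), a two-parameter IRT model links a student-model variable θ to the probability of success on observable X_j:

```latex
P(X_j = 1 \mid \theta) = \frac{\exp\{a_j(\theta - b_j)\}}{1 + \exp\{a_j(\theta - b_j)\}}
```

where a_j and b_j are the discrimination and difficulty parameters of task j. When student-model variables are discrete, the conditional probability tables of a Bayes net play the same evidentiary role.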
Slide 12
The upshot: Work through KRs, get machinery
that embodies the substantive assessment
argument, to meet the purposes you had in
mind.
Slide 13