Constructed Response Items - Michigan Assessment Consortium


Tuning Up Your Common Assessments
Michigan School Testing Conference
February 21, 2012
Dr. Ed Roeber
Kim Young
Dr. Ellen Vorenkamp
Let’s speculate about the people in the room.
What one question might you ask to explore your notion?
Who Are We?






Next 5 minutes: circulate around the room
Name, professional role, district
Ask your question without comment or clarification and record data
Analyze data
What assumptions might you make about people in the room?
To what extent did your question give you the data you were looking for?
Who Are We?

 Participants will recognize the need for quality classroom assessments including elements such as:
◦ Standard/Item Alignment
◦ Balance of Representation
◦ Target/Method Match
◦ Quality Items
◦ Test Blueprints
 Participants will reflect on and modify (where needed) current assessments
Outcomes
Setting the stage…
Table activity
Protocol – Chalk Talk
In the center of the chart paper, write “Quality Assessments”
Without comment…
What are your hunches about the need to build high-quality assessments?
 Think…Pair…Share
◦ What elements are necessary to assure quality common assessments?
 List these qualities
 Discuss why these are important
Key Questions
 Validity Checklist
◦ Standard Alignment
◦ Balance of Representation
◦ Target/Method Match
◦ Quality Items
◦ Test Blueprints
 Rubric Review
 Activity
◦ Break it apart…see what you have…
Deconstructing Assessments
 Are the assessment items tightly aligned with the standards?
 Are there an equal number of items per standard? If not, is there a “rationale”?
 Are there enough items per standard to determine mastery?
Deconstructing Debrief
Please return in 15 minutes
Break
 Knowledge – facts and concepts we want students to know
 Reasoning – using what they know to reason and solve problems
 Skills – students use their knowledge and reasoning to act skillfully
 Products – use knowledge, reasoning, and skills to create a concrete product
Kinds of Learning Targets
 Selected Response/Short Response
◦ True/false, multiple-choice, matching, fill-in-the-blank, short answers
 Extended Response
◦ Essays, research reports, and lab reports
 Performance
◦ Public performances, investigations
 Personal Communication through conversation/observation
◦ Oral exams, interviews, discussion groups
Method of Assessment
Target-Method Match
How well does your method of assessment match your target?
(Rows: Target to be Assessed. Columns: Assessment Method.)

                   | Selected Response/Short-Response | Extended Response | Performance Assessment | Personal Communication
Knowledge          |                                  |                   |                        |
Reasoning          |                                  |                   |                        |
Performance Skills |                                  |                   |                        |
Products           |                                  |                   |                        |
With an “elbow” partner…
 TMM Chart – fill in the grid
 Which method may be best for each target?
◦ Good match
◦ Partial match
◦ Not a good match
Target-Method-Match
Target-Method Match
How well does your method of assessment match your target?
(Rows: Target to be Assessed. Columns: Assessment Method.)

                   | Selected Response/Short-Response | Extended Response | Performance Assessment | Personal Communication
Knowledge          | Good match                       | Good match        | Not a good match       | Partial match
Reasoning          | Partial match                    | Good match        | Good match             | Good match
Performance Skills | Not a good match                 | Not a good match  | Good match             | Partial match
Products           | Not a good match                 | Partial match     | Good match             | Not a good match
Target Method Match
 In looking at items on your assessment, might there be an assessment method that could better capture evidence of student understanding of a standard?
 What will you stay mindful of as you rethink or develop assessment items to assess standards?
Quality Items
Remember – the development of good items takes time and careful thought
General Item Writing Guidelines
Parts of a Multiple-Choice Item

Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?

A  48 ft
B* 28 ft
C  24 ft
D  14 ft

Correct answer (Key): B
Distractors (Incorrect Options or Foils): A, C, D
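As a quick check of the marked key, the perimeter works out as

$$P = 2(\ell + w) = 2(6\,\text{ft} + 8\,\text{ft}) = 28\,\text{ft},$$

so B is correct; 48 ft matches the area ($6 \times 8$) and 14 ft a single length-plus-width sum (these distractor rationales are an inference, not stated in the item).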
1. Align items to a standard
2. Target the appropriate Depth of Knowledge
3. Use clear, concise language
4. Use correct grammar
5. Use an appropriate reading level
6. Avoid the use of the words “you” and “I”
7. Avoid using synonyms within the item
General Guidelines
8. Avoid unnecessary complexity
9. Don’t assume prior knowledge
10. Remember: formatting matters (font sizes, distractor placement, etc.)
General Guidelines
Guidelines About Writing Stems

Two Types of Multiple-Choice Stems
 Open-ended statement, followed by (usually) 3 or 4 answer choices
 Closed question, followed by (usually) 3 or 4 answer choices
Examples

Open-ended stem
One of the factors of x² – 5x – 36 is ___
A  x + 3
B  x – 4
C  x + 6
D* x – 9

Closed question stem
Which of the following is a factor of x² – 5x – 36?
A  x + 3
B  x – 4
C  x + 6
D* x – 9
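As a quick verification of the key marked in both stem formats (a worked check added for reference), the quadratic factors as

$$x^2 - 5x - 36 = (x - 9)(x + 4),$$

since $(-9) + 4 = -5$ and $(-9)(4) = -36$; x – 9 (option D) is therefore a factor, while x + 3, x – 4, and x + 6 are not.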
Multiple Choice Items
11. Stuff the stem
12. Avoid redundancy
13. Avoid the use of negatives
14. Avoid clues in the stem
15. Ensure lead materials are essential to the item
General Guidelines
Stems With a Graphic/Stimulus Lead

The stem-and-leaf plot gives the ages of the people who answered survey questions after buying a pair of roller blades on an Internet auction.

Stem | Leaf
  1  | 7 8 8 8 8
  2  | 0 1 3 3 5
  3  | 4 6 7
  4  | 3 3 5 7 9
  5  | 4 5
  6  | 7
  7  | 9
Key: 3 | 2 means 32

Question: What is the median age of the people who answered the survey questions?
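For an item like this, the key is fixed by the median rule (stated here as a general reminder rather than an answer keyed to this particular plot): each leaf is read with its stem (with the key above, stem 3 with leaf 2 is the age 32), and for $n$ ordered values

$$\text{median} = \begin{cases} x_{(n+1)/2}, & n \text{ odd},\\ \tfrac{1}{2}\bigl(x_{n/2} + x_{n/2+1}\bigr), & n \text{ even}. \end{cases}$$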
Guidelines for Writing Response Options

Parts of a Multiple-Choice Item

Stem: What is the perimeter of a rectangular vegetable garden with dimensions 6 feet by 8 feet?

A  48 ft
B* 28 ft
C  24 ft
D  14 ft

Correct answer (Key): B
Distractors (Incorrect Options): A, C, D
16. Use direct, clear terminology
17. Use plausible distractors/foils
18. Use equal length and detail
19. Make all distractors equally attractive
20. Organize the options
General Guidelines for Writing Response Options
21. Have only one correct answer
22. Do not use overlapping answers
23. Vary placement of option choices
24. Good items are fair items
25. Avoid using “All of the Above” and “None of the Above”
General Guidelines
Constructed Response Items

 A constructed response item is an assessment item that asks students to apply knowledge, skills, and/or critical-thinking abilities to real-world, standards-driven performance tasks.
 It requires a brief written response from students and often has several parts. Students have to write, draw, and/or explain their answers.
Constructed Response Items
 Sometimes called “open-response” items, constructed response items are so named because they ask students to use their own thinking and background knowledge to develop answers without the benefit of any suggestions or choices.
 Constructed response items often have more than one way to correctly answer the question.
Constructed Response Items
 Constructed response items are good to use when you want students to:
◦ Show their work
◦ Explain a process
◦ Complete a chart
◦ Perform a geometric construction
◦ Construct a graph
◦ Identify patterns
◦ Write an essay
Constructed Response Items
 Tie constructed response items to higher-level objectives.
 This type of item is good to use when you want to test a skill that can’t be easily measured with a selected-response item.
HOTS (Higher-Order Thinking Skills)
Constructed Response Items
 Two primary types of constructed response items:
◦ Brief Constructed Response
◦ Extended Constructed Response
Constructed Response Items
 Require about 1-3 minutes of student response time
 Usually represented by one of the following 5 formats:
◦ Fill in the blank
◦ Short answer
◦ Label a diagram
◦ Visual representation
◦ Show your work
Brief Constructed Response Items
 Extended response items require students to provide evidence of understanding regarding a situation that demands more than a selected response or brief constructed response.
 They usually involve 20-30 minutes of student response time.
Extended Response Items
 May require students to reflect and respond in a variety of contexts, such as:
 Write an essay from a prompt
 Take a position on a specific topic and support their stance
 Solve a problem
 Respond to findings of an investigation and/or experiment
 Respond to written text
Extended Response Items
 Guidelines
◦ Carefully word directions and prompts
◦ Allow sufficient time for completion
◦ Have resources necessary for item completion on hand and ready for use
◦ Share with students elements/characteristics of a successful response, where appropriate
Extended Response Items
 When designing common assessments, use a variety of brief constructed response items (these could include short answers, fill-in-the-blank, show-your-work, and visual representations) as well as extended constructed response items.
 Be sure they are aligned to appropriate (usually higher-level) learning targets.
Constructed Response Items
 The item should be clear and specific about what students should do.
 A constructed response item may have several questions.
 Allow for more than one way for students to respond.
Constructed Response Items
 Include necessary visual representations such as charts, graphs, pictures, short readings, and cartoons.
 Determine points possible for each item.
Constructed Response Items
 Usually, constructed response items are worth 2 or more points, depending on the difficulty of the item and the task being performed.
 Design a scoring protocol, based on the number of points possible, for each constructed-response item.
 Scoring protocols are typically specific to each individual item.
Constructed Response Items
 Dot Activity
◦ Green = Item is good to go
◦ Yellow = Item may need to be modified
◦ Red = Item is not well-written and needs to be scrapped
Quality Item Hunt
Assessment Blueprints

 Did you develop your assessment blueprint prior to developing your common assessment?
◦ Why is this desirable?
 Have you reviewed or modified your test blueprint during the development process?
 Does (or will) your assessment reflect your intended blueprint? (A minimal blueprint sketch follows below.)
Reflective Questions
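For reference, a test blueprint is essentially a mapping from each standard to the planned item types, item counts, and points, which makes the balance of representation easy to check before items are written. A minimal sketch, assuming invented standard labels, item counts, and point values chosen purely for illustration:

# Hypothetical test blueprint (all labels and numbers are illustrative).
blueprint = {
    "Standard A": {"selected_response": 4, "brief_constructed": 1, "extended": 0, "points": 6},
    "Standard B": {"selected_response": 3, "brief_constructed": 1, "extended": 1, "points": 9},
    "Standard C": {"selected_response": 2, "brief_constructed": 2, "extended": 0, "points": 6},
}

# Balance of representation: share of total points planned for each standard.
total_points = sum(row["points"] for row in blueprint.values())
for standard, row in blueprint.items():
    print(f"{standard}: {row['points']} points ({row['points'] / total_points:.0%} of the assessment)")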
 Activity
◦ Put it back together…make changes as needed…
Reassemble Assessments
Now What? Next Steps
 Ticket out the door…
Wrap Up; Evaluation
 Dr. Ed Roeber, Michigan State U
[email protected]
517.432.0427
 Dr. Ellen Vorenkamp, Wayne RESA
[email protected]
734.334.1318
 Kimberly Young, MDE/BAA
[email protected]
517.373.0988
Contact Information