Assessment - Dallas Independent School District


Assessment
“We assess students not merely to evaluate them, but to improve the entire process of teaching and learning.”
- Douglas B. Reeves, Making Standards Work
Learning Objective

Participants will be able to create more rigorous assessments by analyzing the content and cognitive level of state-released test questions.
Why Assessment Training?
Teachers and campuses are overwhelmed with the number of assessments they have to give.
• How do we target High Priority TEKS?
• How is a good unit assessment constructed?
• Individual teacher assessments should follow the guidelines of quality assessment.

Agenda
Six Steps to Creating Quality Assessments
1. Data
2. Content
3. Processes
4. Review Examples
5. Item & Test Leveling
6. Item Design
Look at the Data
• Review the District Learning Profile.
• Which SEs are the lowest?
• Which SEs had the most questions?
Analyzing the content, concepts, processes, and skills that will be assessed (Cognitive and Procedural Knowledge)
TEKS and Assessment: Things to Remember

The wording of the standard tells us:
• WHAT CONTENT will be assessed on STAAR, and
• AT WHAT LEVEL the standard will be assessed on STAAR

Cognitive and Content Expectations

Content
• The content items for which students must demonstrate understanding at the appropriate cognitive level in order to adequately meet the standard.

Cognitive
• The level at which students are expected to perform in order to adequately meet the standard.
• Determined by the verbs used in BOTH the Knowledge and Skills statements and the Student Expectations
What Should Students Be Able to DO?
Look at the cognitive level of the verb
• If the cognitive level of the SE is UNDERSTAND, what does that mean students have to be able to do?
Understanding: constructing meaning from different types of messages, whether written or graphic, through activities like interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining.
ELAR – Grade 8
8.4 Comprehension of Literary Text/Poetry.
Students understand, make inferences and draw
conclusions about the structure and elements of poetry
and provide evidence from the text to support their
understanding.
The student is expected to:
– (A) compare and contrast the relationship between the
purpose and characteristics of different poetic forms
(e.g., epic poetry, lyric poetry). Supporting Standard
ELAR – Grade 8
Figure 19 Reading/Comprehension Skills.
Students use a flexible range of metacognitive reading
skills in both assigned and independent reading to
understand an author’s message. Students will continue to
apply earlier standards with greater depth in increasingly
more complex texts as they become self-directed, critical
readers. The student is expected to:
– (D) make complex inferences about text and
use textual evidence to support understanding
Bundling and Dual Coding
Brain Research and Dual Coding
Application of Knowledge and Building Schema
• The brain learns new knowledge (content) by attaching that knowledge to existing schema
• The brain builds schema by applying (do) conceptual and content knowledge in a variety of novel ways
• You can most effectively test conceptual knowledge through application questions
STAAR and Dual Coding
• Math = 75% of items
• Science = 40% of items
• Social Studies = 30%
• English = 60%???
• Figure 19
As we review the following examples, think
about the implications of test and item design
ELAR: Dual Coding with Figure 19
Your Turn: Activity
Look at the STAAR released items in your handout.
• Identify what content is being assessed.
• Identify the skill being assessed.
• Identify how the student has to apply content with the selected skill.
• Discuss with your group.
STAAR Overview: Generalizations
• Verbiage in the questions is complex and reflective of the TEKS
• Supporting Standards REALLY are being tested!
• More questions at a higher level
• Inference from stimulus; no direct answers (no clues or cues)
• Complex distractors: 4 viable answer choices
• Dual coding is prevalent
• Greater depth of content knowledge required to answer each question
Creating an Assessment Blueprint
Step 5: Determining Cognitive and Procedural Difficulty Levels
Aligning Item and Test Difficulty to STAAR
Determining Difficulty Level
Cognitive Difficulty (verbs)
• EASY: Remember, Understand
• MEDIUM: Apply, Analyze
• HARD: Evaluate, Create
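The verb-to-band mapping above is mechanical enough to sketch in code. A minimal sketch, assuming a lookup table keyed by Bloom's-taxonomy verbs; the helper name and table are illustrative, not part of the training materials:

```python
# Hypothetical helper: map the Bloom's verb from a Knowledge and Skills
# statement or SE to the cognitive difficulty band shown on the slide.
DIFFICULTY_BY_VERB = {
    "remember": "EASY",
    "understand": "EASY",
    "apply": "MEDIUM",
    "analyze": "MEDIUM",
    "evaluate": "HARD",
    "create": "HARD",
}

def cognitive_difficulty(verb: str) -> str:
    """Return EASY, MEDIUM, or HARD for a Bloom's-taxonomy verb."""
    return DIFFICULTY_BY_VERB[verb.strip().lower()]

print(cognitive_difficulty("Analyze"))  # MEDIUM
print(cognitive_difficulty("create"))   # HARD
```

A table like this could be extended with the specific verbs that appear in the TEKS (e.g. "compare and contrast", "make inferences") once each is assigned a band.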
Determining Difficulty Level
Procedural Difficulty
This is basically: how many mental processing steps does the student have to go through to answer the question? The greater the number of processing steps, the higher the difficulty level.
Determining Difficulty Level
Procedural Difficulty
• EASY: The item includes only the stem and the answer choices.
• MEDIUM: The item includes a stimulus piece (a graphic, short reading selection, map, etc.). The student only has to interpret the stimulus or pull information from it to select the correct answer.
• HARD: The item includes a stimulus piece (a graphic, short reading selection, map, etc.). The student has to infer, analyze, summarize, etc., and apply that to the stem or answer choices to select the correct answer.
Your Turn: Determining Item Difficulty
1. Look at the following items
2. Identify the cognitive difficulty of the item
3. Identify the procedural difficulty of the item
Determining Item Difficulty
Determining Item Difficulty
Item Writing Checklist
Demonstration of Learning

After having determined the cognitive and procedural difficulty of a test item, learners will rewrite the question so that it is either up a level or down a level.
Creating an Assessment Blueprint
• Identify the process/skills SE that you are pairing with the content SE
• Write these on the Blueprint
• Use a variety of processes and skills on the test and throughout the year.
Design Your Test So…
• How many items per SE?
• Cognitive Level: Make 70% of the test easy-medium and 30% difficult.
• Procedural Level: Make 60% easy procedural and 40% difficult.
• Use the same vocabulary throughout.
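The percentage targets above translate directly into item counts for a test of any length. A minimal sketch, assuming the hard-item counts are rounded and the remainder goes to the easier bands; the function name is hypothetical:

```python
def blueprint_counts(total_items: int) -> dict:
    """Split a test per the blueprint targets: 70% easy-medium / 30% hard
    cognitive difficulty, and 60% easy / 40% hard procedural difficulty."""
    cognitive_hard = round(total_items * 0.30)
    procedural_hard = round(total_items * 0.40)
    return {
        "cognitive_easy_medium": total_items - cognitive_hard,  # ~70%
        "cognitive_hard": cognitive_hard,                       # ~30%
        "procedural_easy": total_items - procedural_hard,       # ~60%
        "procedural_hard": procedural_hard,                     # ~40%
    }

# A 20-item unit test works out to 14 easy-medium / 6 hard cognitive items
# and 12 easy / 8 hard procedural items.
print(blueprint_counts(20))
```

Note that the cognitive and procedural splits are independent tags on the same items, so the four counts describe two overlapping classifications of one test, not four separate item pools.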
Next Steps: What are the implications of today’s learning for your campus?
• What are things you will change on your assessments?
• What are things you are still pondering?
• What is something you want to know more about?
Bonus Slides: Examples of Higher-Level Stems
• Which is an example of…
• Who would most likely have written (asked, said…), followed by a quote
• Analogies
• Fill in the missing part of this graphic organizer
• Which of the following does not belong?
• What is the best category for the following?
• Give a series of clues to the name of a person, place, etc. (riddle format)
• Cloze passage
• Hypothetical situation – analysis
• Sequencing by Roman numeral
• Who would have been helped by (law, invention, organization – hypothetical)
Use a Variety of Stimuli
• Textbox of primary source material
• Graphs and Charts
• Pictures or illustrations
• Spoke Diagrams
• Multiple visuals: comparison maps, charts
• Political Cartoons
• Time lines
• Flow Charts
• Graphic designs
• Headlines
• Speaker Questions
Qualities of a Good Multiple-Choice Item
• Effectively written stems
• Focused on one idea
• Clear and concise
• Phrased as a direct question
• Phrased positively
• Has only one best and correct answer
• Contains options that are plausible, but incorrect
• Has four choices that are homogeneous, parallel in structure, and logical
Qualities of a BAD Multiple-Choice Item
• Tests trivial information
• Contains unnecessary/confusing information in the stem or options
• Is tricky or cute
• Gives cues or clues to the correct answer
• Does not have plausible answers
• Poses a question for which many defensible answers are possible
• Contains a bias toward or against a group of individuals
Develop Plausible Distractors
Plausible distractors include…
• common misconceptions
• statements that sound logical but that are not based on the material given in the stimulus and question
• common or familiar phrases
• statements that are true but that do not answer the question
Be sure there is not more than one plausible answer.
Organization of Answer Choices
• Alphabetical
• Ascending or descending order
• Shortest to longest
• Same order as presented in stimulus