
What is High-Quality Assessment?
Linking Research with Practice
Santa Clara County Office of Education
June 23, 2014
Karin K. Hess, Ed.D.
[email protected] or
[email protected]
Presentation Overview
• Clarify understandings of cognitive rigor/DOK –
using sample assessments & rubrics
• Use the Hess Validation Tools & Protocols
(Module 3) to examine technical criteria for high
quality assessments: Formative & Performance
• Review tools & strategies to discuss & plan
future assessment activities and support to
teachers
• Karin’s coaching tips…
Rubric Design & Formative Tools
• Revisit Handout from this morning: “What I
need to do” rubric (citing evidence of
proficiency)
• Handout 2a: Find a half
• Handout 2b: Hess Cognitive Rigor Matrix –
Math-Science
• Handout 2c: What will this formative
assessment uncover?
• Work in small groups to analyze the assessment
What do we mean by high-quality performance
assessment?
• At your tables, brainstorm examples of
performance assessments – any content area
(e.g., arts, writing, science) or real world
assessments (driver’s test, marriage planning,
etc.)
• Have a recorder write them down
• You have only 3 minutes
Turn & talk: Select one PA from your list and
answer these questions:
1. What is it actually assessing (skills &
concepts)?
2. What makes it a PA?
3. What evidence is captured in the assessment
that distinguishes poor from best
performances?
4. What makes it a “good” performance
assessment?
(You have 5 minutes.)
Let’s generalize…
• With regard to skills & concepts assessed
______
• What makes something a PA? ______
• The kind of evidence that will distinguish
poor from exemplary performances _______
• What makes it a “good” performance
assessment? _________
What we know (from research) about High
Quality Assessment:
• Defined by agreed-upon standards/ expectations
• Measures the individual’s learning & can take
different forms/formats
• Measures the effectiveness of instruction and
appropriateness of curriculum
• Is transparent:
– Students know what is expected of them and how they will
be assessed
– Assessment criteria are clear and training is provided to
educators and reviewers/raters.
• Communicates information effectively to students,
teachers, parents, administration and the public at
large
Simply put, HQ assessments have…
• Clarity of expectations
• Alignment to the intended expectations (skills, concepts)
• Reliability of scoring and interpretation of results
• Attention to the intended rigor (tasks & scoring guides)
• Opportunities for student engagement & decision making
• Opportunities to make the assessment “fair” & unbiased for all
• Links to instruction (opportunity to learn)
The DOK Matrix: Instructional Decisions & Assessment Paths
• Each standard has an assigned Depth of Knowledge (DOK).
• The DOK determines the cognitive level of instruction.
• Assessment paths include selected response, constructed
response, and performance tasks.

Hess Cognitive Rigor Matrix (Bloom’s levels crossed with DOK
levels) – sample descriptors:

DOK 1 – Recall and Reproduction
– Remember: Recall, locate basic facts, definitions, details,
events
– Understand: Select appropriate words for use when intended
meaning is clearly evident

DOK 2 – Skills and Concepts
– Understand: Explain relationships; summarize; state central
idea
– Apply: Use context for word meanings; use information using
text features

DOK 3 – Reasoning and Thinking
– Understand: Explain, generalize, or connect ideas using
supporting evidence (quote, text evidence)
– Apply: Use concepts to solve non-routine problems and justify
– Analyze: Analyze or interpret author’s craft (e.g., literary
devices, viewpoint, or potential bias) to critique a text
– Evaluate: Cite evidence and develop a logical argument for
conjectures based on one text or problem
– Create: Develop a complex model or approach for a given
situation; develop an alternative solution

DOK 4 – Extended Thinking
– Understand: Explain how concepts or ideas specifically relate
to other content domains
– Apply: Devise an approach among many alternatives to research
a novel problem
– Analyze: Analyze multiple sources or multiple texts; analyze
complex abstract themes
– Evaluate: Evaluate relevancy, accuracy, and completeness of
information across texts or sources
– Create: Synthesize across multiple sources/texts; articulate
a new voice, theme, or perspective
GOAL: Each “validated” assessment will
demonstrate:
• Clarity of expectations for the student and
teacher(s)
• Alignment (task & scoring) to the intended
standards: content & performance/DOK
• Opportunities for student engagement
• Opportunities to make the assessment
“fair” & unbiased for ALL students
First we consider alignment…
• It’s really about validity – making decisions
about the degree to which there is a “strong
match” between grade level content
standards + performance and the
assessment/test questions/tasks
• And making valid inferences about learning
resulting from an assessment score
“Validity is a matter of degree, rather than
all or none.”
Robert Linn, 2008
Alignment (validity) Questions:
• Is there a strong content match between
assessment/test questions/tasks and grade
level standards?
• Are the test questions/tasks (and the
assessment as a whole) more rigorous, less
rigorous, or of comparable rigor (DOK) to
grade level performance standards?
Task Validation Protocol Handout #3
(K. Hess, 2013)
• In table groups, review the technical criteria
and descriptions on pages 3-4 of the protocol
• What’s one aspect you feel you (or teachers
you work with) now do well in most local
assessments?
• What’s one aspect you feel you (or your
teachers) need to understand more deeply as
you work with them?
Uses of the assessment task validation
tools & protocols
• Develop new assessments
• Analyze existing assessments
• Validate a revised assessment or new
assessment prior to broader administration
(or purchase)
• Provide OBJECTIVE feedback to assessment
developers
• Promote collaboration and a shared
understanding of high quality assessment
Local Validation Teams represent the
diversity of the school
• Administrator/Leader/Coach
• All* content areas represented
• All/most grade levels (grade spans)*
represented
• PLUS Representation from special education,
fine arts, HPE, CTE, foreign language, ELL, etc.
*decisions may differ depending on school configurations and staffing, but diversity
in teams is critical, especially including special educators
Frequency of Validations?
• Initially, learning & debriefing the process
together serves as calibration, so everyone is
on the same page and develops a shared
understanding of what high-quality assessment
looks like
• School teams set up their schedules – once
each month, every other month, as needed,
highest priority, etc.
• Team members may rotate on and off so that
more (eventually all) staff are involved over time
Getting ready for validation
• Grade level or department teams develop the
assessments using the Basic Validation Protocol (e.g.,
a gr 2 team might develop a common math
assessment for all gr 2 classes/schools)
• Developers put the assessment on the local
(school/district) validation calendar
• Validation teams prioritize order of validations –
common assessments, major assessments first,
second round review after getting feedback, etc.
Validation Materials
• Each team member needs (electronic) validation
protocols (Handout: Module 3, pages 3-4)
• Each person needs a copy of the cover page with the
assessment and scoring rubric/answer key (Handout:
Module 3, pages 5-6)
• There may be additional materials (e.g., anchor
papers, examples) that do not need to be copied
for everyone but may be helpful to see during the
review
• Each person needs a content-specific DOK reference
sheet (Handout: Module 1, tools #1, #2, or #3)
Validation Protocols [1]
• Each time, preview norms for working together
– I am…
– I am NOT…
• Choose a recorder – to keep an electronic record &
provide a copy of feedback for the assessment
developers
• Date and list validation panel names on the “official
copy” (this can be set up ahead of time)
• Individually, take 5-10 minutes to read through &
make notes before any discussion
Sample norms (Source: adapted from Powell, WY)
I AM
• Keeping electronic devices on vibrate/off
• Listening to understand other points of view
• Respecting everyone as a professional
• Focusing on the issues
• Avoiding side conversations
• Encouraging everyone having a turn to speak
• Refraining from judgmental statements
• Representing the best interests of all students
• Asking clarifying questions
• Demonstrating a commitment to the process
(attending meetings, on time, etc.)
• Others?
I AM NOT
• Using killer phrases
• Preparing my next remark instead
of listening
• Sounding apologetic
• Engaging in unrelated activities
• Using negative gestures/body
language
• Others?
Optional – Validation Protocols [2]
• Should the authors present the task at the
start? (especially if 2nd round) – there are pros
& cons to this
– Go over what is on the cover page/what is included and
what the purpose of the assessment is
– 2-5 minutes to explain the materials in the packet – no
interruptions from validation panel
– Panel then asks any clarifying questions only
– This is NOT for depth of understanding, just to know/clarify
what is there BEFORE silently reading & discussing
Validation Protocols [3]
• Make notes individually before discussion
• Choose a task manager/ timekeeper to keep things
moving – reads each indicator on the Validation
Protocols
• Have a process to reach consensus (fist-to-five,
thumbs up, etc.) and be sure to involve each person!
• Choose 2 people to give feedback to the
authors/developers & “rehearse” comments
• DEBRIEF! Did we honor norms? What went
well/needs to be refined next time?
Giving Feedback
• Use descriptive language, NOT judgmental
language
• While you may wonder about instructional
pieces, comments/suggestions about
instruction are probably not appropriate
• Your job is NOT to redo the assessment! Keep
feedback crisp & to the point (e.g., pose a
question); it is the developer’s job to decide
what to do next to strengthen the assessment
tasks.
Giving Feedback (continued)
• Well-written, clear feedback guides
assessment developers to make a stronger
assessment in the end.
• Place your positive (and descriptive)
comments under the feedback section
(Module 3, page 7): What makes this a HQ
(high quality) assessment?
Examples of Feedback (noted on page 7)
1. We were unable to locate…
2. We think this might be DOK2, not DOK3
because…what do you think?
3. We were not clear what the student is expected to
do or to produce. Did you mean…?
4. This might be better aligned to this standard …
5. As hard as it will be, avoid saying “we liked…” This
implies you did not like other things and your job is
NOT to like the assessment.
6. Include the “HQ” positives! The directions are clear;
students have authentic choices; etc.
Debrief each time!
• Did the validation team honor the norms at all
times?
• Do we need to modify/revise norms?
• What went well?
• What could have gone better?
• What will we do differently next time?
• Who/when will we meet with authors to give
feedback?