Transcript Slide 1

Formative Assessment Revisited:
Practical Guidelines for Streamlining Your
System for Data-Based Decision Making
Julie Q. Morrison, Ph.D.
F. Edward Lentz, Ph.D.
Why Revisit
Formative Assessment?
• The past ten years have witnessed an explosion in the
use of formative assessments by school districts across
the country.
• A primary reason for this rapid growth is the assumption
that formative assessments can inform and improve
instructional practice and thereby contribute to
increased student achievement.
(Goertz, Olah, & Riggan, 2009)
Focus of this Presentation
• Response-to-Intervention Context
o Data-based decision making
for individual learners
o Data-based decision making at the
systems level
• Streamlining data-based decision
making through differentiating intensity
of intervention and assessment.
Most Formative Assessments
are Not Used Formatively,
Especially at the
Systems-Level!
Back-to-the-Basics:
Summative vs. Formative Assessment
What is
Formative Assessment?
• When we talk about formative assessment, we
are really talking about measuring progress.
• Educators are concerned about students who are
shown to be less skilled than others because of
their lower rates of learning, or progress,
through the curriculum.
Progress vs. Performance
[Graph: a single score for Carlos on a 0-100 scale,
shown against the Performance Standard]
Progress is a Change in
Performance Over Time
[Graph: Carlos's scores measured repeatedly over time
on a 0-100 scale, shown against the Performance Standard]
B.B. King Uses
Formative Assessment!
[Graph: B.B. King's Estimated Average Glucose (eAG),
130-330 scale, tracked over time against a Performance Standard]
Formative Assessment in the Classroom
[Graph: Maria's progress on a 0-80 scale across two phases,
Instructional Strategy A and Instructional Strategy B]
How Much Data Do We Really
Need to Make Decisions?
• The more frequently we assess, the more often
we can make data-based decisions.
• The frequency of data collection depends on the
severity of the concerns regarding the students’
progress and the intensity of the intervention
efforts.
How Will You Know That Your
Students Are Catching Up?
Data decision rules
inform data collection
and intervention efforts.
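As an illustrative sketch, one common data decision rule from curriculum-based measurement (change the intervention when several consecutive data points fall below the goal line, or aimline) can be expressed directly in code. The function name, the three-point cutoff, and the sample numbers are all hypothetical; teams set their own rules:

```python
def should_change_intervention(scores, aimline, n_points=3):
    """Flag a possible intervention change when the last n_points
    consecutive data points all fall below the aimline.
    (One common rule of thumb; districts adopt their own rules.)"""
    if len(scores) < n_points:
        return False  # not enough data to decide yet
    recent = zip(scores[-n_points:], aimline[-n_points:])
    return all(score < goal for score, goal in recent)

# Hypothetical weekly scores for a student vs. her aimline
scores  = [22, 25, 24, 26, 27]
aimline = [24, 27, 30, 33, 36]
should_change_intervention(scores, aimline)  # last three points fall below
```

The same rule, inverted, can also signal when a goal should be raised because the student is consistently above the aimline.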
Applying Data Decision Rules
[Graph: Maria's progress on a 0-80 scale across Instructional
Strategy A and Instructional Strategy B, with the data decision
rule applied]
Formative Assessment
at the Systems Level
• Formative assessments at the systems level measure
rate of learning, or progress, to inform instructional and
intervention strategies. The results can be aggregated
and analyzed across classrooms, schools, or even
districts.
Interim Assessments
Quarterly Assessments
Short-cycle Assessments
Teacher-developed Assessments
Common Formative Assessments
Most Formative Assessments
are Not Used Formatively
at the Systems Level!
(Goertz, Olah, & Riggan, 2009; Herman et al., 2006)
A Model for Using Data
to Accelerate Learning
Outcomes Systemwide
Step 1: Select a Valid Screening Measure
(VanDerHeyden, 2010)
To DIBEL or Not to DIBEL?
AIMSweb? CBM?
Screening measures must be:
• Matched to performance expectation in the
classroom at that point in the program of
instruction
• Of appropriate difficulty to allow accurate
identification of the individual students who
are at particular risk for learning difficulties
relative to their peers.
Key Skills are Generative
• Key skills have been identified that should
emerge at particular stages of instruction to
forecast continued growth toward functional
skill competence for students.
• These skills and their expected time of
development provide benchmarks against
which learning can be evaluated.
A Model for Using Data
to Accelerate Learning
Outcomes Systemwide
Step 2: Specify Comparison Criteria
Two Approaches to Specifying
Comparison Criteria
1. Use district or school’s data to establish
a criterion that is related to an outcome
that is meaningful to your system.
2. Adopt a performance criterion that has
been reported in the literature.
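A minimal sketch of approach #1, deriving a criterion from your own district data. The rule used here (take the 25th-percentile screening score among students who later met the meaningful outcome) is purely illustrative, and the sample numbers are made up; any cut score should be validated against local data:

```python
def local_criterion(screening_scores, met_outcome):
    """Derive a screening criterion from local data: the 25th-percentile
    screening score among students who later met the meaningful outcome.
    (Illustrative rule only; validate any cut score locally.)"""
    met = sorted(s for s, ok in zip(screening_scores, met_outcome) if ok)
    return met[len(met) // 4]

# Hypothetical fall screening scores and spring outcome results
screening = [12, 25, 31, 40, 44, 50, 55, 62]
met_goal  = [False, False, True, True, True, True, True, True]
local_criterion(screening, met_goal)
```

Approach #2 skips this step entirely and adopts a criterion already reported in the literature.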
A Model for Using Data
to Accelerate Learning
Outcomes Systemwide
Step 3: Analyze Screening Data
Multiplication Facts: 0 to 9
Grade 4, Room 26
[Graph: distribution of student scores, 0-120 scale.
Mean: 38.6; Median: 37]
Multiplication Facts: 0 to 9
Grade 4
[Graph: distribution of student scores across Grade 4,
0-120 scale. Mean: 42.9; Median: 37.5]
Two Steps to Analyzing
Class-Level and Grade-level Data
1. Comparison to external criterion:
The right skill at the right level of performance
2. Compare individual student performance
within the local norm (i.e., class, grade, or
district).
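The two analysis steps above can be sketched as a short function. The "half the class median" cutoff used for the within-class comparison is a hypothetical local-norm rule chosen for illustration, as are the student scores:

```python
from statistics import median

def analyze_screening(scores, criterion):
    """Step 1: compare the class median to the external criterion.
    Step 2: flag students well below the local norm (here, below
    half the class median -- an illustrative cutoff)."""
    class_median = median(scores.values())
    classwide_concern = class_median < criterion
    at_risk = [name for name, s in scores.items()
               if s < 0.5 * class_median]
    return classwide_concern, at_risk

# Hypothetical screening scores for one class
scores = {"A": 45, "B": 37, "C": 52, "D": 14, "E": 38}
analyze_screening(scores, criterion=40)
```

With these sample numbers the class median (37) falls below the criterion, indicating a classwide concern, and one student falls far enough below the local norm to be flagged individually.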
For a Typical Class
or Grade Level …
• The class or grade-level median will fall
above the criterion (Instructional range)
• Comparisons can be made within the
class or grade-level to identify particular
students in need of more intensive
instruction or intervention.
When the Data Indicate a
Class or Grade-Level Concern …
• The class or grade-level median will fall
below the criterion (Instructional range)
• Additional data are needed:
o Assessment on an easier, prerequisite task
o Classwide intervention data (Tier 1 or Tier 2)
Tier 1 Math Facts Intervention:
Cover, Copy, Compare
[Graph: Room 26 classwide performance over the intervention
period, 0-100 scale, relative to the Mastery Criterion]
A Model for Using Data
to Accelerate Learning
Outcomes Systemwide
Step 4: Organize & Present Screening Data
What Data Should
Be Presented?
• Present grade-level graphs
• Provide median scores for each class
• Present class graphs with individual
student performance
• Be prepared to present data by race/ethnicity,
economic disadvantage, and ELL/LEP status
Facilitating Discussion
About Patterns in the Data
• Are there areas in which many students are
performing below expectations?
• Are performance problems clustered by
topic area, grade level, or by student
demographics?
• Are there differences between
classrooms at the same grade level?
The intended outcome of this
data-driven discussion
is an action plan!
Possible Targets
for an Action Plan
• Research-based curriculum
• Calendar for instruction/Pacing
• Mastery of prerequisite skills
• Increased progress monitoring with
feedback to teachers
• Effective instruction
o Student engagement
o Instructional level of materials
o Frequency of student feedback
o Direct instruction of new skills with
feedback/error correction matched to skill
proficiency
o Frequent opportunities to respond/practice
o Contingencies for accuracy and performance
If an analysis of the data
does not indicate a
grade-level or class-wide concern,
then the team should focus on
targeted group and individualized
interventions.
A Model for Using Data
to Accelerate Learning
Outcomes Systemwide
Step 5: Plan for Implementation
Guiding Principles for
Effective Implementation
1. The principal must lead the process
2. Plan must reflect the identified problem
and the priorities of the school
3. The plan developed must be one that
will be effective if properly implemented
Guiding Principles for
Effective Implementation
4. The progress monitoring system has
been developed to measure
implementation and intervention effects
5. A single person has been identified to
manage day-to-day logistics of
implementation
This Model for Data-Based
Decision Making at the Systems
Level Applies Equally Well to
Student Behavior and
Positive School Culture
Concerns
Step 1: Select a Valid
Screening Measure
• Desired Outcomes in Social Behavior
o Predictable, orderly and safe schools
o Social competence
o Social-emotional resilience
Do We Have to Use SWIS?
• School-Wide Information System (SWIS)
• Office Referral Data
o Student’s name
o Name of referring staff member
o Problem behavior
o Time of day
o Location
o Possible motivation/function
o Administrative decision (Action taken)
Not Recommended
• Discipline data limited to number of
suspensions and expulsions
• Teacher delivery of incentives
Step 2: Specify
Comparison Criteria
Use district or school’s data to establish
a criterion that is related to an outcome
that is meaningful to your system.
Step 3: Analyze
Screening Data
Average Number of Office
Referrals Per Day Per Month
Number of Office Referrals by
Problem Behavior
Number of Office Referrals
By Location
Number of Office Referrals
by Time of Day
Percentage of Total Referrals
by Student Race/Ethnicity
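With referral records stored as structured data (fields mirroring the office-referral list above), the breakdowns on these slides reduce to simple counts. The records below are made-up examples, not real referral data:

```python
from collections import Counter

# Hypothetical office referral records
referrals = [
    {"behavior": "disruption", "location": "hallway",   "time": "11:00"},
    {"behavior": "defiance",   "location": "classroom", "time": "13:30"},
    {"behavior": "disruption", "location": "hallway",   "time": "11:15"},
]

by_behavior = Counter(r["behavior"] for r in referrals)
by_location = Counter(r["location"] for r in referrals)
# by_location counts 2 referrals in the hallway, 1 in a classroom
```

The same pattern yields the time-of-day, month-by-month, and race/ethnicity breakdowns shown on the slides above.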
Step 4: Organize and
Present Screening Data
What Data Should
Be Presented?
• Present school-wide and grade-level graphs
• Present class graphs with individual student
performance
• Be prepared to present data by race/ethnicity,
economic disadvantage, ELL/LEP status, and
disability flag
Step 5: Plan Implementation
Facilitating Discussion
About Patterns in the Data
• How does the number of office referrals
in the current year compare to data from
the previous years (month-by-month)?
• Are there problem behaviors that many
students are exhibiting?
o Are there expected competencies students are not
demonstrating?
Facilitating Discussion
About Patterns in the Data
• Are problem behaviors clustered by
location, time of day, time of year?
• Are problem behaviors clustered by
grade level?
• Are there differences between
classrooms at the same grade level?
Big Ideas in
Positive Behavior Support
• Teach students skills to behave appropriately
• Positively acknowledge students engaging in
those behaviors
• Provide consistency and stability in interactions
among students and staff members
If an analysis of the data does not
indicate a systemic concern at the
school-, grade-, or class-level,
then the team should focus on
targeted group and individualized
interventions.
Beware of
Pitfalls
Common Pitfall #1
Starting with a favored data system
before considering what
you want to measure
Common Pitfall #2
Data are collected at great expense
(time, materials) but not used
for decision making
Common Pitfall #3
Collecting too much data
Common Pitfall #4
Not using data to inform or improve
instructional and
intervention practices
Common Pitfall #5
Failing to differentiate assessment intensity
(i.e., frequency, multiple measures)
based on the severity of the
learning or behavioral concern
“We're gonna need a bigger spear”
Common Pitfall #6
Data analysis that focuses on
identifying the weaknesses
of teachers and staff to punish them
Common Pitfall #7
Jumping to interventions chosen for
reasons as random as a
recently attended workshop
Six Systems-Level Conditions
Have Been Shown by
Research to Facilitate
Data-driven Decision Making
by Teachers
(Goertz, Olah, & Riggan, 2009)
1. Alignment
• Districts aligned their formative
assessments with content standards and
district curriculum, ensuring that data
generated from the assessments was
relevant to what teachers had been
teaching in the classroom.
2. Expectations for Data Use
• Districts created and communicated
expectations for data use at all levels
of the system.
3. User-friendly Data Systems
• Districts designed user-friendly
electronic data systems that gave
teachers easy ways to analyze student
performance.
4. Professional Development
& Technical Support
• Districts provided professional support in
the use of the formative assessments,
analysis of assessment data, and
instructional approaches to accelerate
learning.
5. Scheduling Time for
Data-Based Decision Making
• Districts scheduled dedicated time for
teachers to discuss assessment results
and instructional techniques, to re-teach
content and skills to students, and to
participate in professional development.
6. School Leadership Support
• School leaders reinforced expectations for
data use by modeling (conducting their own
analyses) and monitoring (reviewing and
providing feedback) teachers’ use of data,
creating time for teacher collaboration, and
providing direct support to teachers through
modeling instruction.