Transcript Slide 1

Today's Learning Objectives:
1. What is Progress Monitoring?
2. Brief review of Universal Screening data, identifying students “in need,” and how data connects to Instructional Decision-Making and Progress Monitoring
3. Survey Level Assessment (SLA) & Determining Students' Current Success Level (Present level of educational performance)
4. Setting individualized student goals when progress monitoring, using one common method involving normative data
5. Determining the schedule and frequency of monitoring progress:
   a. Duration
   b. Frequency
6. Data interpretation, case studies and practice exercises

Summer 2012. Copyright (c) 2012 Pearson Education, Inc. or its affiliate(s). All rights reserved.
All names and data used in this presentation are fictitious.

Slide 2
What is Progress Monitoring?
Important qualities of tools used to frequently monitor academic progress
Slide 3
Progress Monitoring Involves:
Research-Based Best Practices: Systematic Formative Evaluation that requires the use of standardized assessment tools that are:
1. Of similar difficulty
2. Given the same way each time
(AIMSweb® offers these features.)

Slide 4
A Brief Review of Universal Screening Data:
Identifying Students At Risk for Academic Failure

Slide 5
Michael Martin (fictitious): A student with Universal Screening data that indicates he is performing significantly behind peers and targets.

Slide 6
Box & Whiskers Graphs (Box Plots): A 3-Step Explanation
1. AIMSweb commonly uses box plots to report data.
2. AIMSweb's box plots are similar in shape and meaning to a vertical bell curve.
3. Read the bands in relation to the user-defined comparison group:
   - Well Above Average: above the 90th percentile
   - Above Average Range: 75th to 90th percentile
   - Average Range (the middle 50% of the population included in the sample): 25th to 75th percentile, with the median (50th percentile) shown inside the box and a target line marked on the graph
   - Below Average Range: 10th to 25th percentile
   - Well Below Average: below the 10th percentile
Michael Martin's score is plotted against these bands.
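To make the band boundaries concrete, here is a minimal sketch (not AIMSweb code) that classifies a score into these bands. The 25th/75th percentile cut scores below come from the fictitious school norms quoted later in the deck (90-128 wrc/min for the middle 50%); the 10th and 90th percentile values are purely illustrative assumptions.

```python
def classify_score(score, p10, p25, p75, p90):
    """Map a score onto the five box-plot bands described above.

    p10..p90 are percentile cut scores for the user-defined comparison
    group; the caller supplies them (these are not AIMSweb API values).
    """
    if score > p90:
        return "Well Above Average"   # above the 90th percentile
    if score > p75:
        return "Above Average Range"  # 75th to 90th percentile
    if score >= p25:
        return "Average Range"        # middle 50% of the sample
    if score >= p10:
        return "Below Average Range"  # 10th to 25th percentile
    return "Well Below Average"       # below the 10th percentile

# Fictitious Grade 5 fall R-CBM cut scores; p10 and p90 are assumed values.
print(classify_score(48, p10=72, p25=90, p75=128, p90=150))  # -> Well Below Average
```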
Slide 7
Fall Benchmark Data for Michael Martin
Martin, Michael: Grade 5

Slide 8
Compare Performance Across Groups
(Box plots shown for three comparison groups: Grade 5, Michael's School; Grade 5, Michael's District; Grade 5, National Aggregate Norms.)

Slide 9
Compare Michael Martin
Fall 5th grade student: 48 wrc (words read correct) / 12 errors
(Box plot compares Michael Martin with AIMSweb National Norms.)

Slide 10
Survey Level Assessment (SLA)
Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12
(Scores shown as wrc/errors.)

Slide 11
Survey Level Assessment
SLA: Students are tested in successive levels of general curriculum, beginning with their current expected grade placement, until a level at which they are successful is determined.
Grade 5 Median: 48/12
Grade 4 Median: 67/10
Grade 3 Median: 76/8
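The SLA procedure described here is a "test down until successful" loop. Below is a minimal sketch of that logic, assuming you already have the student's median score at each tested grade and a per-grade success criterion (the target values are hypothetical); it is not AIMSweb's implementation.

```python
def survey_level_assessment(median_scores, success_targets, current_grade):
    """Walk down from the expected grade placement until a grade is found
    where the student's median score meets that grade's success target.

    median_scores:   {grade: median wrc from the SLA probes}
    success_targets: {grade: wrc considered successful} (hypothetical values)
    """
    for grade in range(current_grade, 0, -1):
        if grade in median_scores and median_scores[grade] >= success_targets[grade]:
            return grade  # highest grade level at which the student is successful
    return None  # no tested level was successful

# Michael's fictitious SLA medians and illustrative fall success targets
medians = {5: 48, 4: 67, 3: 76}
targets = {5: 104, 4: 93, 3: 76}  # hypothetical, for the sketch only
print(survey_level_assessment(medians, targets, current_grade=5))  # -> 3
```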
Slide 12
Survey Level Assessment (SLA):
Uses National Normative Data
Reading Curriculum Based Measurement (R-CBM)

Slide 13
Survey Level Assessment (SLA):
Grade 5: 48/12

Slide 14
Survey Level Assessment (SLA):
Grade 4: 67/10
Grade 5: 48/12

Slide 15
Survey Level Assessment (SLA):
Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

Slide 16
Using SLA Data for Describing Present Levels of Educational Performance
“Michael Martin currently reads about 48 words correctly, with 12 errors, from Grade 5 Standard Reading Assessment Passages. He reads Grade 3 reading passages successfully: 76 words correct per minute, with 8 errors, which is how well average beginning 3rd grade students read this material.”
Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

Slide 17
Setting Individualized Student Goals:
The Principles & The Practice

Slide 18
Methods for Setting Goals Using General Outcome Measures & Frequent Progress Monitoring
There are multiple ways to set performance goals for students when progress monitoring. These include, but are not limited to:
• Using school, district, or state pre-approved targets
• Using a cut score that predicts the likelihood of passing high-stakes tests
• Using the desired Rate of Improvement (ROI) in relation to the period of time by which the student is expected to reach the goal
• Reducing the “achievement gap” by using normative data as a reference (local or national norms)

Slide 19
A Common Method for Goal Setting with Frequent Progress Monitoring:
Using Normative Data

Slide 20
Michael Martin
50% of students in 5th grade at this school are performing between 90 and 128 wrc/min. These students represent the MIDDLE 50% of students in the comparison group (school).

Slide 21
Michael Martin
“Core” Curriculum: “core” instruction is often delivered in a way that meets “middle” students more than the students in the “whiskers.”
Implication? Benchmark (BM) data reflect the current status of performance, and also where the “core” must work to move middle students higher by the next benchmark period.

Slide 22
Michael Martin
Grade 3: 76/8
Grade 4: 67/10
Grade 5: 48/12

Slide 23
Michael Martin
Sample: 36-week expectation for performance = GOAL
Reading Curriculum Based Measurement (R-CBM)
The goal for Michael is set at about the 25th percentile (spring), rounded to 98 wrc/min.
(Norm-referenced goal setting method)
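As a sketch of the norm-referenced method on this slide: look up the spring norm at the chosen percentile for the goal-level material and use it as the year-end goal. The single table entry below is the value quoted on the slide (fictitious data); a real decision would use the full AIMSweb norms table.

```python
# Minimal sketch of norm-referenced goal setting (not AIMSweb code).
SPRING_NORMS_WRC = {
    (5, 25): 98,  # Grade 5, 25th percentile, spring R-CBM (value from the slide)
}

def norm_referenced_goal(grade, percentile=25):
    """Return the spring norm at the chosen percentile as the annual goal."""
    return SPRING_NORMS_WRC[(grade, percentile)]

print(norm_referenced_goal(5))  # -> 98 wrc/min, Michael's goal
```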
Slide 24
The goal for Michael is set at about the 25th percentile (spring), rounded to 98 wrc/min.
(Norm-referenced goal setting method)

Slide 25
Determining the Schedule & Frequency for Monitoring Progress
Duration: How long?
Frequency: How often?

Slide 26
How much data should be collected?
Making Data-Based Decisions With Progress Monitor
Typically you need at least 7-10 data points (Shinn & Good, 1989) before making a programming decision, and you may need to collect more if uncertain.
• Christ & Silberglitt (2007) recommended 6-9 data points.
• As the number of data points increases, the effects of measurement error on the trend line decrease.
Exception: 3-4 consecutive data points below the aimline.

Slide 27
Understanding Elements of an AIMSweb Progress Monitor Graph
AIMSweb Progress Monitor provides the new ROI after the entry of three (3) data points.

Slide 28
Four Criteria To Consider:
Criterion #1. Trend line meets (or is on target to meet) the AIM line for the ultimate goal:
Success! Once the goal is met, consider transition to a less intensive program or a new goal as needed.

Slide 29
Criterion #2. Trend line and AIM line will intersect in the relatively near future?
Stay with the current intervention until the goal is reached.

Slide 30
Criterion #3a. Trend line exceeds the AIM line?
a. Consider increasing the goal or the difficulty level.
Example: a Grade 5 student reading grade 4 passages; the goal was changed from 104 wrc/min to 125 wrc/min.
NOTE: When changing a goal to require a different grade level of material, start a new schedule. Do not use the same schedule, as the data are not comparable (i.e., 50 wrc/min on a 5th grade passage means something different than 50 wrc/min on a 3rd grade passage).

Slide 31
Criterion #3b. Trend line exceeds the AIM line?
b. Or, retain the current intervention and close the gap even faster if this goal is the final performance level the student is to reach while being progress monitored.
Example: a Grade K student on Grade K PSF probes may reach the goal in mid-March, rather than the end of May, if progress continues at the same rate of improvement.

Slide 32
Criterion #4. Trend line will not likely intersect the AIM line, and/or moves in the opposite direction of the AIM line:
Consider adding an additional intervention, changing a variable, and/or intensifying program changes.
Note: four data points are already below the AIM line.
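The four criteria can be read as one decision rule that compares the student's trend line to the AIM line. The sketch below is my own paraphrase of the criteria (not AIMSweb's algorithm), assuming each line is summarized by its current value and a weekly slope.

```python
def trendline_decision(trend_now, trend_slope, aim_now, aim_slope, weeks_left):
    """Classify progress against the four criteria above (sketch only).

    trend_now / aim_now:     current values of the trend line and AIM line
    trend_slope / aim_slope: weekly rates of change of the two lines
    weeks_left:              weeks remaining until the goal date
    """
    trend_at_goal = trend_now + trend_slope * weeks_left
    aim_at_goal = aim_now + aim_slope * weeks_left  # the goal itself

    if trend_now >= aim_now and trend_slope >= aim_slope:
        # Criterion #3: trend line exceeds the AIM line
        return "#3: raise the goal/difficulty, or keep the intervention and finish early"
    if trend_at_goal >= aim_at_goal:
        # Criterion #1: trend meets, or is on target to meet, the AIM line
        return "#1: on track; once met, consider a less intensive program or a new goal"
    if trend_slope > aim_slope:
        # Criterion #2: the lines will still intersect, just later
        return "#2: stay with the current intervention until the goal is reached"
    # Criterion #4: unlikely to intersect, or moving away from the AIM line
    return "#4: add or intensify intervention, or change a program variable"

# Example: reading 40 wrc and gaining 0.5 wrc/week against an AIM line at 50 wrc
# rising 0.9 wrc/week with 20 weeks left -> criterion #4.
print(trendline_decision(40, 0.5, 50, 0.9, 20))
```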
Slide 33
Building Confidence in Decision-Making
Variability of the data:
a. The “more variable” the data, the larger the error in the slope. The larger the error in the slope, the more data points are needed to gain confidence in the trend and the actual progress made.
b. The “tighter” the data, the fewer data points are potentially needed to be “confident” in the developing trend.
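The variability point can be made concrete with the standard error of an ordinary least-squares slope: for the same number of weeks, noisier scores give a larger slope error, so more data points are needed before the trend can be trusted. A small illustration with synthetic scores (my own example, not AIMSweb output):

```python
import numpy as np

def slope_and_se(weeks, scores):
    """Ordinary least-squares slope of scores on weeks, and its standard error."""
    weeks = np.asarray(weeks, dtype=float)
    scores = np.asarray(scores, dtype=float)
    slope, intercept = np.polyfit(weeks, scores, 1)
    residuals = scores - (slope * weeks + intercept)
    resid_var = np.sum(residuals ** 2) / (len(weeks) - 2)
    se = np.sqrt(resid_var / np.sum((weeks - weeks.mean()) ** 2))
    return slope, se

weeks = np.arange(8)  # 8 weekly probes
true_line = 48 + 1.5 * weeks
tight = true_line + np.array([0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2, 0.0])
noisy = true_line + np.array([6.0, -5.0, 4.0, -7.0, 5.0, -4.0, 6.0, -5.0])

print(slope_and_se(weeks, tight))  # small standard error: trend trusted sooner
print(slope_and_se(weeks, noisy))  # large standard error: more data points needed
```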
Slide 34
Building Confidence in Decision-Making
The direction of the trend:
a. If all the data points are below the aimline and trending strongly negative, you will not likely need 7-10 data points to confirm "uh-oh!"
b. In contrast, if all data points are above the line and trending strongly positive, the opposite applies: you won't likely need 10 data points to say "wow" and increase the ambitiousness of your goal.

Slide 35
How Frequently to Assess?
Balance IDEAL with FEASIBLE:
Too little data, collected too infrequently, means students may stay in ineffective programs longer than necessary. See the example on the next slide.

Slide 36
Decision-Making
Note that a student may potentially be in an ineffective program longer than needed when data collection is not done frequently enough.
5 data points over 15 weeks vs. 5 data points over 5 weeks.
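A quick back-of-the-envelope way to see the cost of infrequent monitoring, using the 7-10 data points cited earlier (the probe rates are illustrative):

```python
def weeks_until_decision(points_needed, probes_per_week):
    """Weeks required to accumulate enough data points for a decision."""
    return points_needed / probes_per_week

# Weekly probes vs. one probe every three weeks, needing roughly 7-10 points
for probes_per_week in (1.0, 1 / 3):
    low = weeks_until_decision(7, probes_per_week)
    high = weeks_until_decision(10, probes_per_week)
    print(f"{probes_per_week:.2f} probes/week -> {low:.0f} to {high:.0f} weeks of data")
```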
Slide 37
Frequency of Assessment Directly Related to Student Achievement
Similar results found by Fuchs & Fuchs (1986).

Slide 38
Now that we have calculated a goal using the norm-referenced method, let's practice goal setting using the weekly ROI method…

Slide 39
Reading Curriculum Based Measurement (R-CBM)
Use the ROI method to calculate the goal.
3. Write a Rate of Improvement goal for Maya, assuming 36 weeks until her annual review.
In ___ weeks, Maya will read ___ words correctly, with ____ or fewer errors, from Grade ___ Progress Monitor Reading Assessment Passages.

Slide 40
Use the following chart as needed to determine Maya's goal using the ROI method.
(ROI norms chart shown, with performance-level columns: Well Below Average, Below Average, Average, Well Above Average.)

Slide 41
Goal Setting: ROI Method
(ROI norms chart shown again, with performance-level columns: Well Below Average, Below Average, Average, Well Above Average.)
• Double the ROI from the norm chart to set an ambitious goal: 0.78 x 2 = 1.56 (about 1.6)
• Multiply by the number of weeks to get the expected gain: 1.6 x 36 weeks = 57.6 (about 58)
• Add the gain to the baseline score: 39 (baseline) + 58 = 97
ROI GOAL: 97 wrc/min
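Since the three bullets above are just arithmetic, a tiny helper makes the method explicit. This sketch mirrors the slide's rounding steps; the 0.78 ROI, 39-wrc baseline, and 36 weeks come from the fictitious Maya example.

```python
def roi_goal(baseline, weekly_roi_norm, weeks, multiplier=2):
    """Set an ambitious goal from a normative weekly Rate of Improvement (ROI).

    Mirrors the slide's steps: double the normative ROI (rounded to one
    decimal), project the gain over the goal period, round it, add to baseline.
    """
    ambitious_roi = round(weekly_roi_norm * multiplier, 1)  # 0.78 * 2 = 1.56 -> 1.6
    gain = round(ambitious_roi * weeks)                     # 1.6 * 36 = 57.6 -> 58
    return baseline + gain                                  # 39 + 58 = 97

print(roi_goal(baseline=39, weekly_roi_norm=0.78, weeks=36))  # -> 97 wrc/min
```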
Slide 42
Goal Setting: ROI Method
Reading Curriculum Based Measurement (R-CBM)
POSSIBLE ANSWER:
3. Write a Rate of Improvement goal for Maya, assuming 36 weeks until her annual review.
In 36 weeks, Maya will read 97 words correctly, with 4 or fewer errors, from Grade 4 Progress Monitor Reading Assessment Passages.
* Dependent upon an ambitious, but feasible, goal for the individual student.

Slide 43
Additional Case Studies & Practice Exercises
Zachary Johnston: Grade 6 student
M-CAP Goal Setting
Schedule Setup
Progress Monitoring

Slide 44
For this case, assume that . . .
Math Concepts and Applications
DIRECTIONS:
Task 1: Using the graph below, determine Zachary's performance level for each median Survey Level Assessment score obtained (based on Fall Norms).
Task 2: Write down Zachary's performance level (e.g., “Average,” “Below Average,” etc.) in the table above.

Slide 45
For this case, assume that . . .
Math Concepts and Applications
DIRECTIONS: (repeated from the previous slide)
Performance levels shown on the graph: Well Below Average, Below Average, Below Average, Average.

Slide 46
Math Concepts and Applications
Directions: 1. Complete the activity below.
Consider Zachary's scores and check the one that makes the most sense:
[ ] Zachary has a moderate performance discrepancy where the goal material can be at his current 6th grade level.
• Zachary's baseline score for his PM graph will be 4, based on his 6th grade MCAP SLA.
[ ] Zachary has such a severe performance discrepancy that lower grade material should be considered as his goal material for the next 36 weeks.
• Zachary's baseline MCAP score will be 4, based on his 5th grade MCAP SLA.

Slide 47
Math Concepts and Applications
(The activity above is repeated on this slide.)

Slide 48
Write a Present Levels of Educational Performance statement for Zachary:
Currently, Zachary earns ____ correct points in 8 minutes on Grade 6 AIMSweb® MCAP probes. He performs successfully on Grade ____ MCAP probes, earning ____ correct points in 8 minutes, which is approximately how well ____ grade students perform math concepts and applications problems in the fall of the year.

Slide 49
Write a Present Levels of Educational Performance statement for Zachary:
Currently, Zachary earns 4 correct points in 8 minutes on Grade 6 AIMSweb® MCAP probes. He performs successfully on Grade 3 MCAP probes, earning 9 correct points in 8 minutes, which is approximately how well 3rd grade students perform math concepts and applications problems in the fall of the year.

Slide 50
Zachary's Goal: Norm-Referenced Method
3. Write a norm-referenced goal for Zachary, assuming 36 weeks until his annual review.
In ___ weeks, Zachary will earn ___ points correct on Grade ___ Progress Monitor MCAP probes.

Slide 51
Zachary's Goal: Norm-Referenced Method
POSSIBLE ANSWER:
3. Write a norm-referenced goal for Zachary, assuming 36 weeks until his annual review.
In 36 weeks, Zachary will earn 13 points correct on Grade 6 Progress Monitor MCAP probes.
* Dependent upon an ambitious, but feasible, goal for the individual student.