Transcript Slide 1

Evaluation Results
2002-2008
MRI’s Evaluation Activities:
• Surveys
Teacher Beliefs and Practices
(pre/post)
Annual Participant Questionnaire
• Data Collection
Test Scores
Standardized Tests
Classroom Assessments (DRA)
MAP
Demographics
Special Education Information
• MAP Analyses
MAP ANALYSES:
MAP analyses compare schools that have completed the MRI program with a
randomly chosen sample of non-MRI elementary schools.
Results indicate that MRI schools generally outperform non-MRI schools
(not proof of a causal relationship).
Notes for MAP Analyses
2002-2005
• Note: In the following MAP Analyses 2002-2005 charts, the numbers are not as
important as the comparative performance between MRI and non-MRI schools.
This is because:
1. There is variation in the scores from year to year and school to school.
2. The calculation of the baseline changes as more data become available.
Longer baselines mean there is less variation, resulting in "flatter" or lower
results (see the sketch after this list).
– For 2002 schools, 1999 was the baseline
– For 2003 schools, an average of 1999/2000 was the baseline
– For 2004 schools, an average of 1999/2001 was the baseline
– For 2005 schools, an average of 2000/2002 was the baseline
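A minimal sketch of that calculation follows; the Communication Arts Index values are invented, and only the averaging and percent-change arithmetic follow the description above.

```python
# Hypothetical illustration of the baseline arithmetic described above.
# The index values are invented; only the method (average the baseline
# years, then take the % change of the outcome year) mirrors the slides.

def pct_change_from_baseline(baseline_years, outcome_value):
    """Average the baseline years, then express the outcome as a % change."""
    baseline = sum(baseline_years) / len(baseline_years)
    return (outcome_value - baseline) / baseline * 100

# A 2004-cohort school would use the average of its 1999-2001 index values.
print(round(pct_change_from_baseline([180.0, 184.0, 182.0], 191.1), 1))  # 5.0
```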
2002-2005 Comparison of MRI and Random Samples
Average % Change in Communication Arts Index per School
(Bar chart; Red = Random Samples, Black = MRI)

Cohort   MRI          Random Sample
2002     6.5 (n=15)   1.8 (n=150)
2003     4.8 (n=20)   1.9 (n=200)
2004     3.9 (n=27)   0.9 (n=270)
2005     3.2 (n=15)   0.8 (n=150)
MAP Results
In 2006 the MAP Communication Arts test
was changed in ways that make
comparisons to previous years difficult:
• Achievement levels were reduced from five to four
• Scaled Score intervals for categories were changed
• Questions were adjusted to apply to the multiple grade levels tested
  (Grades 3-8 instead of only 3 and 7)
MAP Results: 2006-2007
For 2006-2007 the comparison between MRI and the random sample of Missouri
elementary schools was made in terms of the percentage change, from a 3-year
baseline to the outcome year, in the percentage of students who scored in the
top two achievement levels (Proficient and Advanced).
• 2006: Baseline = 2002-2004
• 2007: Baseline = 2003-2005
• In 2006 this was done for 1st and 2nd year K-3 MRI schools (n=20) because
  there was only one 3rd-year graduating school in 2006
• In 2007 the analysis was done for 3rd-year schools only (n=17) for all
  grades 3-8
MAP Comparisons 2006-2007
MRI and Random Sample
(With 3-year baselines; bar chart)

Year   MRI            Random Sample
2006   47.0% (n=20)   28.2% (n=137)
2007   30.7% (n=17)   19.7% (n=131)
2006-2008 MAP Analysis
• In 2008 we have three years of data since the MAP test was revised in 2006.
• In this analysis we compare the results of MRI, two other Missouri
  professional development programs (Programs I and II), and a Random Sample
  (RS) of Missouri elementary schools.
• The outcome measure is the same as that used by federal and state programs
  in determining Adequate Yearly Progress (AYP): the percentage of students
  scoring at or above Proficient.
Steps in the 2006-2008 MAP Analysis
• Step 1: Get percentages of students Proficient or Advanced (Prof+) for each
  school, 2006-2008
  Source: http://dese.mo.gov/schooldata/school_data.html (AYP Reports)
• Step 2: Calculate a baseline as the average Prof+ of 2006-2007
• Step 3: Calculate the change (∆) in Prof+ in 2008 from the baseline for each
  school
• Step 4: Calculate the average and median ∆ for each group (MRI, I, II, and
  Random Sample)
• Step 5: Calculate standard deviations, skew, and pre-baseline average for
  each group
• Step 6: Remove all schools from each group whose ∆ was > 2*SD
• Step 7: Repeat Steps 1-5 (a brief sketch of Steps 2-7 appears below)
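The sketch below illustrates Steps 2-7 under the formulas given with the table that follows (Baseline = (2006+2007)/2, ∆ = (2008-Baseline)/Baseline). The school records are invented; the real data come from the DESE AYP reports cited in Step 1.

```python
# Sketch of Steps 2-7. The four school records below are invented.
from statistics import mean, median, stdev

schools = [  # (Prof+ 2006, Prof+ 2007, Prof+ 2008), as fractions of students
    (0.34, 0.38, 0.45),
    (0.41, 0.39, 0.42),
    (0.28, 0.30, 0.27),
    (0.50, 0.52, 0.55),
]

def delta(p06, p07, p08):
    baseline = (p06 + p07) / 2            # Step 2: two-year baseline
    return (p08 - baseline) / baseline    # Step 3: relative change in 2008

deltas = [delta(*s) for s in schools]
avg, med, sd = mean(deltas), median(deltas), stdev(deltas)        # Steps 4-5
pre_baseline_avg = mean((p06 + p07) / 2 for p06, p07, _ in schools)

# Step 6: drop outliers (read here as a delta more than 2 SD from the group
# mean, i.e. beyond +/- 2 SD); Step 7 then repeats the summary statistics
# on the trimmed group.
trimmed = [d for d in deltas if abs(d - avg) <= 2 * sd]
print(round(avg, 3), round(med, 3), round(sd, 3), round(pre_baseline_avg, 3))
```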
               MRI      Program I   Program II   RS
Number         49       115         95           125
Avg ∆          5.7%     -0.5%       -0.3%        1.1%
Med ∆          3.7%     -2.0%       -3.3%        0.8%
SD             0.159    0.206       0.249        0.133
Pre % Prof+    38.9%    36.3%       29.7%        46.3%

Baseline = (2006 + 2007) / 2
∆ = (2008 - Baseline) / Baseline
The data in this table support the statement that between 2006 and 2008 MRI
schools made larger gains on average in the percentage of students scoring at
Proficient or better on the 2008 MAP Communication Arts test than two other
Missouri professional development programs and a random sample of Missouri
elementary schools.
The samples in the table are from three different professional development
programs and a random sample of schools in Missouri. The samples have been
adjusted by removing "outliers" beyond +/- 2 standard deviations. The complete
analysis, including supporting data, is available from MRI Research and
Assessment.
Adequate Yearly Progress
• As mandated by federal law, Missouri schools must meet yearly progress goals
  in MAP scores
• For Communication Arts those goals were defined as the percentage of
  students scoring at Proficient or better:
2003 - 19.4%
2004 - 20.4%
2005 - 26.6%
2006 - 34.7%
2007 - 42.9%
2008 - 51.0%
The following Table provides a comparison between
MRI schools and state-wide results.
Percentage of Schools Meeting AYP Levels¹
(AYP targets: 2003=19.4%  2004=20.4%  2005=26.6%  2006=34.7%  2007=42.9%  2008=51.0%)
Proficient and Advanced

Year    MRI             State
2003    81%    (60/74)  50.9%  (1,046/2,053)
2004    100%   (50/50)  77.27% (1,569/2,033)
2005    80%    (28/35)  64.7%  (1,317/2,036)
2006²   78.5%  (22/27)  62.6%  (1,291/2,061)
2007    81.5%  (17/21)  53.6%  (1,125/2,100)
2008    68.3%  (28/41)  ~40%³  (~881/2,203)

¹ Includes "Safe Harbor" and "Confidence Interval" results
² Beginning in 2006 AYP was calculated for grades 3-8 and 11
³ In 2008 DESE reported the results for all schools as follows: "Only
  one-fourth of all school districts and about 40 percent of school buildings
  met this year's proficiency targets for adequate yearly progress (AYP)." The
  Title I ratio was more specific: 44.8% met AYP in 2008.
  (See http://www.dese.mo.gov/news/2008/MAPandAYP.htm)
Teaching and Learning Survey
In this survey classroom teachers were asked to identify instructional
practices and their frequency of use (on a scale of 1 = Never to 5 = Almost
Daily) for a number of critical elements related to the goals of MRI training.
One way of looking at the data is to identify those practices that were not
frequently utilized by "pre" respondents (mean less than "3") and ask whether
any changes are reflected in the "post" responses.
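A minimal sketch of that filter is shown below; the item means reused here come from the K-3 table later in this section (items A7, A16, A17, A21).

```python
# Keep the practices whose "pre" mean fell below 3 on the 1-5 scale,
# then look at how the "post" mean moved. Values are from the K-3 table
# reported later in this section.
pre_means  = {"A7": 2.8, "A16": 2.5, "A17": 2.8, "A21": 2.1}  # 2005 "pre"
post_means = {"A7": 3.7, "A16": 2.9, "A17": 4.2, "A21": 2.5}  # 2008 "post"

for item, pre in pre_means.items():
    if pre < 3:
        print(item, pre, "->", post_means[item])
```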
Teaching and Learning Survey Items:
K-3 “pre” (2005) Mean <3
• 7: Assesses reading progress by use of informal
assessments (running records, CAP, DRA,
letter identification, etc.)
• 8: Implements reading workshop
• 11: Writes a text collaboratively with students
sharing the pen
• 15: Collects student writing samples to document writing
progress over time
• 16: Uses scoring guides/rubrics to assess student
writing
• 17: Implements writing workshop
• 20: Organizes literacy corners to provide independent
practice for students
• 21: Provides opportunities for students to use
computers to write, publish, and practice
K-3 Practice Changes: 2005-2008
3rd Year Respondents (n=170)

Item    A7    A8    A11   A15   A16   A17   A20   A21
2005    2.8   3.0   3.0   3.0   2.5   2.8   3.0   2.1
2008    3.7   4.4   3.5   3.3   2.9   4.2   3.9   2.5
In most of these categories there has been significant change in self-reported
implementation of critical practices. Unlike previous years, however, three of
the practices (A11, A15, and A16) do not show the same kind of robust growth
as in the past. An early analysis of the data suggests this result may be due
to a relatively high number of kindergarten teachers in the sample, teachers
who might be less likely to use writing strategies than teachers in other
grades.
Teaching and Learning Survey Items:
Upper Grades “pre” (2005) Mean <3
• 7: Assesses reading progress by use of informal assessments (running
  records, CAP, DRA, letter identification, etc.)
• 8: Implements reading workshop
• 12: Conferences with students individually to discuss their writing progress
• 13: Collects student writing samples to document writing progress over time
• 15: Implements writing workshop
• 18: Provides opportunities for students to use computers to write, publish,
  and practice
4-8 Practice Changes
2005-2008
Upper Grade Respondents (n=116)

Item    A7    A8    A12   A13   A15   A18
2005    2.8   3.0   3.0   3.0   2.6   2.7
2008    3.2   4.2   3.5   2.9   3.7   3.1
The evidence presented here supports the statement that while there were
practice changes, the strength of the change is less than that observed for
the K-3 schools. Indeed, in one case (A13) essentially no change in component
usage was reported, and this bears closer scrutiny. The differences in
intensity between the K-3 and Upper Grade teaching cohorts are likely a result
of the upper grades being more departmentalized, with more content area
teachers whose primary responsibilities are in subject areas other than
literacy. In addition, as noted in previous reports, upper grade teachers are
more likely to use technology as an instructional tool (A18).
2008 Participant Survey
• Participants rate the usefulness of component utilization, practice change,
  "buy-in", attitudes toward the program and trainer, etc.
• Results drive program change, e.g., the Program Orientation and the Upper
  Grade Program.
Please see the “2007 Survey Results” Power Point presentation at
http://missourireadinginitiative.com/program_evaluation.php
for more detailed results of the Participant Survey between 2002
and 2007.
Participant Survey
There are two positive trends reflected in the MRI End of the Year Participant
Questionnaire:
• (1) Participants rate the program higher with the passage of time; and
• (2) each year the entry level of satisfaction rises for new cohorts.
The following tables demonstrate these trends between 2002 and 2008.
Participant Survey
“Rate” by MRI Program Year 2002-2004
Reflecting on the effectiveness of the MRI program as a whole, how would you
rate it? (1 = Poor ... 5 = Excellent)

MRI Cohort   2002 (N=733)   2003 (N=956)   2004 (N=770)
1st Year     3.8            3.9            4.2
2nd Year     4.1            4.1            4.0
3rd Year     *              4.4            4.3

* 3rd Year schools were interviewed in 2002
We have found that ratings generally go up from year to
year as participants become more familiar with the
program and, more importantly, begin to see the tangible
results of improved student reading in their classrooms.
Participant Survey
“Rate” by MRI Program Year 2005-2008
MRI Cohort         2005 (N=642)   2006 (N=617)   2007 (N=489)   2008 (N=684)
1st Year   K-3     4.2            4.2            4.4            3.8
           4-6     3.6            3.7            4.6            3.2*
2nd Year   K-3     4.2            4.1            4.3            4.1
           4-6     na             3.7            4.2            3.8
3rd Year   K-3     4.4            4.3            4.2            4.2
           4-6     na             na             3.8            3.7
Beginning in 2005, MRI expanded to higher grades, which have different
dynamics and different scoring tendencies. Briefly, because the upper grades
are increasingly departmentalized, content area teachers are usually more
resistant to literacy professional development than communication arts
specialists.
Over time, however, upper grade scores improved to K-3 levels as MRI Trainers
responded to participants' concerns and adapted the program to upper grade
teachers' needs.
*In 2008 1st year and 4-6 scores were depressed by an “outlier” district where
four participating schools had unusually low scores. MRI staff will use this
information to address whatever implementation issues there are and, as a
consequence, we would expect to see the scores rebound in 2008-2009.
DRA Results:
The Developmental Reading Assessment tool (DRA) is
a formalized classroom assessment that has proven to be
an accurate indicator of a student’s actual reading level. This
is a key element of the MRI program as “assessment drives
instruction” and allows teachers to be highly specific in
responding to each individual student’s needs.
The following slide presents information about the changes in the percentage
of students reading "At or Above" Grade Level at 2nd- and 3rd-year MRI schools
for which DRA data had been reported and analyzed as of 9/30/2008.
The results are organized by grade level cohorts; that is,
students who are in the same class as they move up grade
levels.
ALL reporting cohorts show significant increases in
students reading “At or Above” Grade Level as
measured by the DRA.
DRA Change in Percentage of Students Reading
“At or Above” Grade Level
(“F”=Fall; “S”=Spring)
School   Grade Cohort   Pre-date   Post-date   Pre%    Post%   % Change
1        1-2            S07        S08         82.1    87.7      6.8
2        1-2            S07        S08         51.9    76.6     47.6
3        1-3            S06        S08         42.3    64.2     51.8
4        1-3            S06        S08         52.2    87.8     68.2
5        1-3            S06        S08         52.0    84.8     63.1
6        1-3            S06        S08         73.9    91.1     23.3
7        1-3            S06        S08         72.7    80.2     11.1
8        1-3            F06        S08          7.4    66.7    801.4
9        1-3            F05        S08         20.4    66.7    226.9
10       1-3            F05        S08         20.3    43.4    113.8
11       2-3            F06        S08         47.2    62.0     31.4
12       4-5            F06        S08         35.7    75.4    111.2
13       4-6            F05        S08         38.1    88.7    132.8
14       4-6            F05        S08         55.7    87.2     56.6
15       4-6            F05        S08         60.9    81.5     33.8
16       4-6            F05        S08          8.7    61.5    606.9
17       4-6            F05        S08         13.9    70.0    403.6

Average % Change for all reported schools: 164.1
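The "% Change" column is the relative change from Pre% to Post%, not the simple difference in percentage points. A minimal check against two rows of the table above:

```python
# Relative change from Pre% to Post%, reusing Schools 1 and 8 from the table.
def pct_change(pre, post):
    return (post - pre) / pre * 100

print(round(pct_change(82.1, 87.7), 1))  # School 1 ->   6.8
print(round(pct_change(7.4, 66.7), 1))   # School 8 -> 801.4
```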