Transcript Slide 1

Will That Work for Us?
Interpreting Research from
The Memphis Striving Readers Project
(MSRP)
Presented by
Ric Potts, MCS; J. Helen Perkins, U of M; Elizabeth Heeren, MCS;
Rorie Harris, MCS; and Jill Feldman, RBS
2008 International Reading Association Research Conference
Atlanta, GA
Session Overview
• Introduction to the Striving Readers grant
• Overview of Memphis SR research design
• Year One Impact Analyses
• Collection of implementation fidelity data
  – implications for practitioners and researchers
• Planned (Ongoing) Analyses
• Q & A / Group Discussion
Introduction: Memphis Striving
Readers Project (MSRP)
Ric Potts, PI – MSRP
Memphis City Public Schools
Memphis-The City
The City of Memphis has a population of 642,251.
63.1% African American
31.3% Caucasian
4.1% Hispanic
And one Elvis
Approximately 70 percent of adolescents
struggle to read. The young people enrolled in
middle and high school who lack the broad
literacy skills to comprehend and learn advanced
academic subjects will suffer serious social,
emotional, and economic consequences.
» Reading at Risk: The State Response to the Crisis in
Adolescent Literacy, Oct. 2005
Urban Child Institute
The State of Children in
Memphis and Shelby County
2006
“Under-educated children have no future.”
• by U.S. standards roughly 75 percent of
students in Tennessee fail to meet
national grade appropriate standards,
and Memphis is at the bottom in
Tennessee. . . . Memphis is one of the
least-educated cities in America.
Motivation behind
Memphis Striving Readers Project
• Memphis is among the poorest and least-educated cities in the US
  – 30.1% of all children live in poverty
  – 24.3% of adults have less than a HS education
  – 36.7% have a HS diploma or equivalent
  – 30.5% have an Assoc. degree or some college
  – 8.5% have at least a BA
• MCS is the 21st-largest K-12 district in the US, with more than 116,000 students
  – Over 95% of MCS' 196 schools are Title I schools
  – 71% of MCS students qualify for free/reduced-price lunch
  – MCS students are 87% AA; 9% White; 4% "other"
  – In 85% of MCS schools, 33% of students change schools during the year
  – In 2003-04, the system-wide graduation rate was 61 percent
  – 71% of students in grades 6-8 scored below the 50th percentile on TCAP (Reading/Language Arts)
Striving Readers – A Federal
Response
• In 2005, the Department of Education
called for proposals for the Striving
Readers grant.
• In March, 2006, Memphis was one of eight
sites awarded the grant.
Memphis Striving Readers Program
Targeted Schools

School   | Grade Span | Total Enrollment | Total # of Non-Special Education Students Scoring in Bottom Quartile in Reading
School 2 | 6-8        | 1,021            | 414
School 1 | 6-8        | 1,033            | 384
School 6 | 6-8        | 700              | 251
School 5 | 6-8        | 765              | 245
School 8 | 6-8        | 547              | 178
School 4 | 6-8        | 486              | 196
School 3 | 6-8        | 976              | 357
School 7 | 6-8        | 877              | 274
The Whole School Intervention:
Memphis Content Literacy Academy
(MCLA)
Overview presented by
J. Helen Perkins, SR Co-PI
University of Memphis
A Change Model
A Capacity-Building Model for Teacher Development (Cooter & Cooter, 2003)

Stages, from novice to expert:
No Knowledge → First Exposure → Deeper Learning with Limited Capacity → Practice with Coaching → Refined and Expanded Capacity → Expertise & Ability to Coach Others
Emphasis: “Deep Training”
(180 hours over two years) …
Memphis Content Literacy Academy
Infusing Simultaneously Across Core Subject Areas
Scientifically-based Reading Research (SBRR)
Strategies in…
Vocabulary
Reading Comprehension
Reading Fluency
Benefits to Teacher –
“Laureates”…
• Advanced Training (180 hours) on scientifically-based reading
instruction (SBRR) for urban children
• A Master Teacher “Coach” to Assist (30 hours) with
Implementing New Strategies (in their own classrooms!)
• Twelve (12) Graduate Semester Hours of Credit from
University of Memphis (FREE) (applicable to an advanced
degree)
• Can Seek “Highly Qualified” Endorsement in Reading
• Books and Materials (FREE)
• Success in Helping Children Achieve “AYP”
• Principal Support
MCLA Year 1: Selected Strategies
Fluency
• Choral Reading
• Paired reading
• Guided, repeated,
oral reading (pairs)
Comprehension
• Question Generation
• Three-Level Retelling
  • Oral
  • Graphic Organizer
  • Written
• Comprehension monitoring
• Expository Text Patterns
• Multiple Strategies
Vocabulary Development
• Pre-instruction of vocabulary
• Repeated, multiple exposures
• Semantic Maps
Classroom Organizational Tools &
Strategies: Year 1
• CREDE Standards
• Whole class v.
collaborative small
group
• Reading Next
Elements
• Use of leveled materials
(e.g., National Geographic)
http://crede.berkeley.edu/standards/standards.html
CREDE Formatting of
Professional Development
Training
Classroom Action Plans
(CAPs)
Spring 2008
Science, Social Studies, & ELA
Your task is to develop a series of class lessons where
you teach academic vocabulary in a unit of your choice.
You must have at least one vocabulary learning
strategy/activity that occurs:
1. BEFORE students read the assigned text,
2. DURING the reading assignment, and
3. AFTER the reading assignment
MCLA Classroom Model
• Gradual release of responsibility
  (teacher modeling, guided practice, independent practice, independent use)
• Integration of 12 literacy strategies
  (vocabulary, fluency & comprehension)
• Development of Classroom Action Plans (CAPs)
  (content area lesson plans for strategy implementation, including procedures for student assessment)
• On-site support provided by coaches
• Use of Curriculum Resource Center (CRC) materials
The Principals’ Fellowship
 Literacy Leadership Practices
 Real World Problem Solving
 Create “Literacy Materials Centers”
 Early Identification w/ Intense/Focused
Remediation
 Research-Informed Decision Making
 Involve Families
 Needs-Based Scheduling
 Matching the Most Successful Teachers
with “Critical Condition” Kids
READ 180, Our Targeted
Intervention
Overview provided by
Elizabeth Heeren, SR Grant Coordinator
Memphis City Schools
Program Components
• Support materials for differentiated instruction in small group rotation
• Student workbooks for Independent Practice in small and whole group rotations
• Tools for student placement and assessment
Key Elements of READ 180
• Fidelity of Implementation
• 90-minute classes
• Certified teachers (LA or Reading)
• District Instructional Support
• District Technological Support
• Scholastic training (site-based and on-line)
R180 Correlations to Reading Next
Recommendations for Adolescent Literacy
• Direct, explicit comprehension instruction
• Motivation and self-directed learning
• Strategic tutoring
• Differentiated texts (levels and topics)
• Technology component
• Ongoing formative assessment
• Extended time for literacy
• Professional development (long-term and ongoing)
Memphis Implementation
• We have 8 schools in the Striving Readers
Grant, with up to 120 randomly selected R180
students at each school.
• Students receive R180 instruction for 2 years.
• Each student placed in R180 falls in the lowest
quartile of TCAP (Reading score).
• Each student in R180 is paired with a similar
student from the lowest quartile who does not
receive the treatment (for impact comparison).
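The pairing step described above can be sketched as greedy nearest-neighbor matching on a baseline reading score. The data, student IDs, and exact matching rule here are hypothetical; the MSRP evaluation may have matched on different or additional criteria.

```python
# Sketch: pair each READ 180 student with the closest untreated
# lowest-quartile student by baseline reading score.
# Hypothetical data; the actual MSRP matching procedure may differ.

def match_pairs(treated, controls):
    """Greedy nearest-neighbor matching without replacement.

    treated, controls: lists of (student_id, baseline_score) tuples.
    Returns a list of (treated_id, control_id) pairs.
    """
    pool = list(controls)
    pairs = []
    for sid, score in treated:
        if not pool:
            break
        # Find the remaining control with the closest baseline score.
        best = min(pool, key=lambda c: abs(c[1] - score))
        pool.remove(best)
        pairs.append((sid, best[0]))
    return pairs

treated = [("T1", 410), ("T2", 393)]
controls = [("C1", 400), ("C2", 412), ("C3", 390)]
print(match_pairs(treated, controls))  # [('T1', 'C2'), ('T2', 'C3')]
```

Matching without replacement, as here, keeps the comparison groups the same size but makes pair quality depend on the order in which treated students are processed.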
MSRP Research Design
Overview presented by
Jill Feldman, SR Research Director
Research for Better Schools
Overall MSRP Goals
To determine:
1. The effects of MCLA on core subject teachers' knowledge and use of SBRR
2. The separate and combined effects of MCLA and Read 180 on students' reading achievement levels, especially students who are identified as struggling readers
3. The separate and combined effects of MCLA and Read 180 on students' achievement in core subjects, especially students who are identified as struggling readers
MCLA Program Logic Model
Inputs
Funding, staff, curriculum resource
center, facilities, incentives,
research materials
Activities
Principals
Attend 45-hour sessions/yr (2 yrs)
Participate in motivational,
recruitment and celebratory events
Discuss MCLA at faculty meetings
Conduct walkthrough observations
Provide opportunities for teacher collaboration
Allocate space for CRC materials
Teachers
Attend 30 weekly 3-hour MCLA
training sessions/yr (2 years)
Develop and implement 8 CAPs
per year in collab content-area
groups
Meet with coaches for feedback to improve implementation of MCLA strategies
Learn to use leveled texts to support SR content literacy needs
Students
Learn to use MCLA strategies to read/react to content-related text
Outputs
Principals
45 hours of Principal
Fellowship participation
100% of principals incorporate
plan for using MCLA strategies
in SIP
100% attendance of key MCLA
events
80% of principals report
actively supporting the
program
100% of MCLA schools have
allocated space for the CRC
Teachers
90 hours of MCLA training/yr
for 2 years (180 hours)
Engage in weekly coaching
sessions or as needed to meet
teachers’ differentiated needs
8 CAP “cycles” completed
each year for two years
100% of teachers complete
performance measures
identifying supplemental
resources available/those
necessary to support content
area instruction
Students
50% of students attend 4
classes taught daily by
teachers participating in MCLA
Students learn to use 7 of 8
MCLA CAP strategies
Short–term Outcomes
Principals
Awareness of and interest in staff
implementation of MCLA
concepts and strategies
Increased advocacy for schoolwide use of MCLA strategies
Long-term Outcomes
Principals
Improved school climate
School-wide plans include focus on
content literacy
Improved instructional leadership
Teachers
Increased knowledge about
MCLA strategies
Improved preparedness to use
research-based literacy
strategies to teach core academic
content
Increased use of direct, explicit
instruction to teach research-based comprehension, fluency,
and vocabulary strategies in
content area classes
Integrated use of multiple MCLA
strategies to support ongoing
development of content-related
instructional units
Teachers
Increased effectiveness supporting
students’ content literacy development
Continued collaboration among
community of teachers to develop and
implement CAPs
Students
Improved reading achievement and
content literacy:
10% increase in students scoring
proficient in Reading/LA and other
subject areas of TCAP
mean increase of five NCEs on ITBS
Students
Increased familiarity with and use
of MCLA strategies when
engaging with text
Increased internalization of
literacy strategies
Increased confidence engaging
with content related texts
Increased interest in
school/learning
Increased performance on gateway
and EOC exams
Higher Quality
Teaching
Higher Student
Achievement
Study Design and Analytic Approach:
MCLA
Study Design (MCLA):
• Experimental design
  – randomly assigning schools to treatment and control conditions
• Evaluate teacher and student outcomes
• Teacher outcomes include
  – preparedness
  – frequency of literacy strategy use

Analytic Approach (MCLA):
• Two-level HLM
  – spring ITBS and TCAP scores as a function of teacher and school variables
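The two-level HLM referenced above can be written in a standard form. This is a generic sketch (unit i nested in school j, with a school-level treatment indicator); the actual MSRP models include the additional covariates listed later.

```latex
% Level 1 (unit i in school j):
Y_{ij} = \beta_{0j} + \beta_{1j}\,\text{Pretest}_{ij} + r_{ij}

% Level 2 (school j):
\beta_{0j} = \gamma_{00} + \gamma_{01}\,\text{Treatment}_{j} + u_{0j}, \qquad
\beta_{1j} = \gamma_{10}
```

Here γ01 is the estimated treatment impact on the adjusted school-mean outcome, controlling for the pretest.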
Analytic Decisions
• Missing Data
  – students missing pretest score(s) deleted from impact analysis on relevant measure(s)
  – teachers missing pretest score deleted from impact analysis on measure
• Covariates
  1. include all student- and school-level covariates in the model
  2. run the model
  3. eliminate the school covariate with the lowest significance level (highest p-value) not less than 0.2
  4. repeat steps 2 and 3 until the remaining covariates have p-values less than 0.2
  5. repeat steps 2-4 for the student covariates
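The covariate-selection procedure above is a backward-elimination loop. A minimal sketch follows; the `fit_pvalues` callable stands in for refitting the HLM and returning a p-value per covariate, and the covariate names and toy p-values are hypothetical.

```python
# Sketch of the backward-elimination rule described above.
# fit_pvalues(kept) stands in for refitting the model on the current
# covariate set and returning {covariate: p_value}; here it is a toy.

def backward_eliminate(covariates, fit_pvalues, threshold=0.2):
    """Repeatedly drop the covariate with the highest p-value >= threshold."""
    kept = list(covariates)
    while kept:
        pvals = fit_pvalues(kept)              # "run the model"
        worst = max(kept, key=lambda c: pvals[c])
        if pvals[worst] < threshold:           # all remaining pass the cutoff
            break
        kept.remove(worst)                     # eliminate least significant
    return kept

# Toy p-values; in practice these change each time the model is refit.
toy = {"pretest": 0.001, "pct_frl": 0.15, "enrollment": 0.45, "pct_ell": 0.30}
print(backward_eliminate(toy, lambda kept: {c: toy[c] for c in kept}))
# -> ['pretest', 'pct_frl']
```

The MSRP procedure runs this loop twice: first over school-level covariates, then over student-level covariates.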
MCLA: Random Assignment of Schools

Demographic Characteristics of Year 1 MCLA Student Sample

Student Characteristic   | Control(a)   | Treatment(a) | All Schools(a)
Enrolled in Grade 6      | 817 (31.6%)  | 690 (28.4%)  | 1507 (30.1%)
Enrolled in Grade 7      | 945 (36.6%)  | 883 (36.3%)  | 1828 (36.5%)
Enrolled in Grade 8      | 821 (31.8%)  | 857 (35.3%)  | 1678 (33.5%)
Female                   | 1295 (50.1%) | 1291 (53.1%) | 2586 (51.6%)
Male                     | 1288 (49.9%) | 1139 (46.9%) | 2427 (48.4%)
African-American         | 2375 (91.9%) | 2374 (97.7%) | 4749 (94.7%)
Hispanic                 | 193 (7.5%)   | 49 (2.0%)    | 242 (4.8%)
Free or Reduced Lunch    | 2235 (86.5%) | 2175 (89.5%) | 4410 (88.0%)
English Language Learner | 143 (5.5%)   | 27 (1.1%)    | 170 (3.4%)
Total                    | 2583 (100%)  | 2430 (100%)  | 5013 (100%)

(a) Percentages are based on the total numbers of students in control, treatment, or all schools.
Baseline Comparisons of Students in MCLA Treatment and Control Schools

Comparison of Students in MCLA Treatment and Control Schools on Baseline 2006 Scores on Each Achievement Test(a)

Test Score | Unadj. Control | Unadj. Treatment | Adj. Control | Adj. Treatment | Est. Impact | Effect Size | Signif. Level
ITBS
  Total Reading Standard Score | 205.7 (2235) | 204.3 (2119) | 208.6 | 200.8 | -7.8  | 0.34 | 0.003
  Comprehension Standard Score | 203.8 (2240) | 203.3 (2133) | 207.7 | 198.6 | -9.1  | 0.34 | 0.004
  Vocabulary Standard Score    | 207.5 (2244) | 205.3 (2129) | 207.9 | 204.5 | -3.4  | 0.14 | 0.032
TCAP
  Reading/LA Scale Score       | 502.2 (2350) | 502.9 (2294) | 507.8 | 496.2 | -11.6 | 0.36 | 0.107
  Mathematics Scale Score      | 505.4 (2347) | 502.9 (2293) | 507.4 | 500.6 | -6.8  | 0.19 | 0.126
  Science Scale Score          | 187.7 (2308) | 190.2 (2285) | 189.3 | 188.4 | -0.9  | 0.05 | 0.515
  Social Studies Scale Score   | 193.0 (2312) | 192.0 (2278) | 196.3 | 188.5 | -7.8  | 0.47 | 0.071

(a) Numbers in parentheses are the number of students in each group having valid test scores from the baseline 2006 administrations and the Spring 2007 administrations.
Selected Characteristics of the Year 1 Teacher Sample for MCLA Impact Analyses

Teacher Characteristic                        | Control(a) | Treatment(a) | Total(a)
Teaches Language Arts                         | 32.1%      | 37.5%        | 34.8%
Teaches Mathematics                           | 20.1%      | 19.1%        | 19.6%
Teaches Science                               | 17.9%      | 18.4%        | 18.1%
Teaches Social Studies                        | 19.4%      | 20.6%        | 20.0%
Female                                        | 74.2%      | 74.2%        | 74.2%
Male                                          | 25.8%      | 25.8%        | 25.8%
African-American                              | 86.7%      | 88.0%        | 87.4%
Masters Degree or Higher                      | 53.9%      | 59.8%        | 56.9%
Licensed in Grade/Subject Taught              | 85.4%      | 79.3%        | 82.3%
Prior MCLA Participation                      | 13.3%      | 5.4%         | 9.3%
Prof. Dev. in Integrating Literacy in Class   | 44.2%      | 39.5%        | 41.9%
(more than 8 hours in past 12 months)
More than 5 Years Full-Time Teacher           | 67.8%      | 57.6%        | 62.6%
More than 5 Years Full-Time at Current School | 14.4%      | 13.3%        | 13.9%
More than 5 Years Full-Time in Memphis        | 52.2%      | 44.4%        | 48.4%

(a) These percentages are based on different numbers of teachers due to variations in response rates to different items on the teacher survey.
All Variables Included in MCLA Impact Analytical Models for Year 1

Variable                                    | Level   | Coding/Range
Dependent
  Year-End Preparedness Index               | Teacher | 1-5; Not at All; A Little; Prepared; Well Prepared; Could Teach Others
  Year-End Frequency Index                  | Teacher | 1-5; Never; Rarely; Sometimes; Often; Almost Always
Independent
  School Receiving MCLA Intervention        | School  | Yes = 1; No = 0
Covariates
  Baseline Preparedness Index               | Teacher | 1-5; 5 represents highest preparedness
  Baseline Frequency Index                  | Teacher | 1-5; 5 represents highest frequency
  English Language Arts Teacher             | Teacher | Yes = 1; No = 0
  Age                                       | Teacher | 1-6: 20's; 30's; 40's; 50's; 60's; 70's
  Gender                                    | Teacher | Female = 1; Male = 0
  African-American                          | Teacher | Yes = 1; No = 0
  Masters Degree or Higher                  | Teacher | Yes = 1; No = 0
  Licensed in Grade/Subject Taught          | Teacher | Yes = 1; No = 0
  Prior MCLA Participation                  | Teacher | Yes = 1; No = 0
  Prof Dev in Integrating Literacy in Class | Teacher | 1-4: None; 1-8 hrs; 9-32 hrs; 32+ hrs
  Years Full Time Teacher                   | Teacher | 1-7: Never; 0-2; 3-5; 6-10; 11-20; 21-30; 30+
  Years Full Time at Current School         | Teacher | 1-7: Never; 0-2; 3-5; 6-10; 11-20; 21-30; 30+
  Percentage Female (Fall 2006)             | School  | 0-100
  Percentage African-American (Fall 2006)   | School  | 0-100
  Percentage Special Ed (Fall 2006)         | School  | 0-100
  Percentage FRL (Fall 2006)                | School  | 0-100
  Percentage ELL (Fall 2006)                | School  | 0-100
  School Enrollment (Fall 2006)             | School  | 400-1200
READ 180 Logic Model
R180 Study Design and Analytic Approach

Study Design:
• Evaluate student outcomes using an RCT based on random assignment of students to conditions across schools
• Student outcome measures:
  – reading achievement (ITBS)
  – core content areas (TCAP)

Analytic Approach:
• Cross-sectional ITT analyses of reading and core content area achievement
• Two-level models using spring ITBS and TCAP scores as a function of student and school variables
READ 180: Enrolled Students

Demographic Characteristics of the Year 1 Read 180 ITT Sample

Student Characteristic   | Control(a)  | Treatment(a) | Total(a)
Enrolled in Grade 6      | 392 (37.6%) | 239 (34.2%)  | 631 (36.3%)
Enrolled in Grade 7      | 370 (35.5%) | 233 (33.4%)  | 603 (34.7%)
Enrolled in Grade 8      | 280 (26.9%) | 226 (32.4%)  | 506 (29.1%)
Female                   | 465 (44.6%) | 286 (41.0%)  | 751 (43.2%)
Male                     | 577 (55.4%) | 412 (59.0%)  | 989 (56.8%)
African-American         | 955 (91.7%) | 657 (94.1%)  | 1612 (92.6%)
Hispanic                 | 86 (8.3%)   | 40 (5.7%)    | 126 (7.2%)
Free or Reduced Lunch    | 931 (89.3%) | 619 (88.7%)  | 1550 (89.1%)
English Language Learner | 83 (8.0%)   | 34 (4.9%)    | 117 (6.7%)
Total                    | 1042 (100%) | 698 (100%)   | 1740 (100%)
Variables Included in READ 180 Impact Analytic Models (Year One):
Dependent and Independent

Variable                          | Level   | Coding/Range
Dependent
  Spring 2007 ITBS Total Reading  | Student | Standard Score 100-350
  Spring 2007 ITBS Comprehension  | Student | Standard Score 100-350
  Spring 2007 ITBS Vocabulary     | Student | Standard Score 100-350
  Spring 2007 TCAP Reading/LA     | Student | Scale Score 300-750
  Spring 2007 TCAP Mathematics    | Student | Scale Score 300-750
  Spring 2007 TCAP Science        | Student | Scale Score 100-300
  Spring 2007 TCAP Social Studies | Student | Scale Score 100-300
Independent
  Read 180 Participation          | Student | Yes = 1; No = 0
Variables Included in READ 180 Impact Analytic Models (Year One): Covariates

Variable                                  | Level   | Coding/Range
Covariates
  Fall 2006 ITBS Total Reading            | Student | Standard Score 100-350
  Fall 2006 ITBS Comprehension            | Student | Standard Score 100-350
  Fall 2006 ITBS Vocabulary               | Student | Standard Score 100-350
  Spring 2006 TCAP Reading/LA             | Student | Scale Score 300-750
  Spring 2006 TCAP Mathematics            | Student | Scale Score 300-750
  Spring 2006 TCAP Science                | Student | Scale Score 100-300
  Spring 2006 TCAP Social Studies         | Student | Scale Score 100-300
  Gender                                  | Student | Female = 1; Male = 0
  African-American                        | Student | Yes = 1; No = 0
  Hispanic                                | Student | Yes = 1; No = 0
  Free or Reduced Lunch (FRL)             | Student | Yes = 1; No = 0
  English Language Learner (ELL)          | Student | Yes = 1; No = 0
  Enrolled in Grade 7                     | Student | Yes = 1; No = 0
  Enrolled in Grade 8                     | Student | Yes = 1; No = 0
  Percentage Female (Fall 2006)           | School  | 0-100
  Percentage African-American (Fall 2006) | School  | 0-100
  Percentage Special Ed (Fall 2006)       | School  | 0-100
  Percentage FRL (Fall 2006)              | School  | 0-100
  Percentage ELL (Fall 2006)              | School  | 0-100
  School Enrollment (Fall 2006)           | School  | 400-1200
Year One Impact

Comparison of Teachers in MCLA Treatment and Control Schools on Year-End Indices for Preparedness and Frequency of Use

Index                 | Unadj. Control | Unadj. Treatment | Adj. Control | Adj. Treatment | Est. Impact | Effect Size | Signif. Level
Preparedness Index(a) | 3.57 (49)      | 3.92 (49)        | 3.52         | 3.93           | 0.41        | 0.75        | 0.012
Frequency Index       | 3.69 (49)      | 3.93 (43)        | 3.64         | 4.00           | 0.36        | 0.61        | 0.022

(a) Numbers in parentheses are the number of teachers in each group having valid index scores from the baseline 2006 administration and the Spring 2007 administration.
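Standardized effect sizes like those reported here are conventionally computed as the adjusted impact divided by a pooled standard deviation. The sketch below assumes that convention; the report's exact denominator may differ, and the numbers used in the demo are hypothetical, not taken from the MSRP tables.

```python
# Sketch: standardized effect size as (adjusted impact / pooled SD).
# Assumes the pooled-SD convention; the MSRP report's exact formula
# is not stated on the slide.
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(impact, sd1, n1, sd2, n2):
    """Adjusted mean difference divided by the pooled SD."""
    return impact / pooled_sd(sd1, n1, sd2, n2)

# Hypothetical SDs and group sizes:
print(round(effect_size(0.41, 0.6, 49, 0.6, 49), 2))  # 0.68
```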
MCLA Impacts on Students (Year One)

Comparison of Students in MCLA Treatment and Control Schools on Spring 2007 Scores on Each Achievement Test(a)

Test Score | Unadj. Control | Unadj. Treatment | Adj. Control | Adj. Treatment | Est. Impact | Effect Size | Signif. Level
ITBS
  Total Reading Standard Score | 208.8 (1925) | 208.8 (1831) | 207.8 | 207.6 | -0.2 | 0.01 | 0.900
  Comprehension Standard Score | 205.7 (1932) | 205.8 (1835) | 202.9 | 207.1 | 4.2  | 0.13 | 0.067
  Vocabulary Standard Score    | 211.8 (1938) | 210.2 (1854) | 211.8 | 208.9 | -2.9 | 0.12 | 0.125
TCAP
  Reading/LA Scale Score       | 517.0 (2301) | 515.1 (2240) | 519.3 | 513.6 | -5.7 | 0.18 | 0.000
  Mathematics Scale Score      | 522.4 (2297) | 515.1 (2240) | 521.2 | 515.1 | -6.1 | 0.17 | 0.061
  Science Scale Score          | 192.2 (2212) | 193.1 (2222) | 193.1 | 192.0 | -1.1 | 0.07 | 0.355
  Social Studies Scale Score   | 193.5 (2205) | 191.4 (2212) | 193.2 | 191.3 | -1.9 | 0.13 | 0.345

(a) Numbers in parentheses are the number of students in each group having valid test scores from the baseline 2006 administrations and the Spring 2007 administrations.
READ 180 Impacts on Students (Year 1)

Comparison of Read 180 Treatment and Control Groups on Spring 2007 Scores on Each Achievement Test(a)

Test Score | Unadj. Control | Unadj. Treatment | Adj. Control | Adj. Treatment | Est. Impact | Effect Size | Signif. Level
ITBS
  Total Reading Standard Score | 191.8 (712) | 192.9 (511) | 192.6 | 192.1 | -0.5 | 0.03 | 0.532
  Comprehension Standard Score | 186.7 (718) | 187.6 (519) | 187.0 | 187.0 | 0.0  | 0.00 | 0.976
  Vocabulary Standard Score    | 197.0 (726) | 198.3 (519) | 197.5 | 197.6 | 0.1  | 0.00 | 0.937
TCAP
  Reading/LA Scale Score       | 495.8 (972) | 498.0 (664) | 496.9 | 497.1 | 0.2  | 0.01 | 0.882
  Mathematics Scale Score      | 500.0 (971) | 501.8 (661) | 500.0 | 500.2 | 0.2  | 0.00 | 0.904
  Science Scale Score          | 185.1 (906) | 185.6 (643) | 185.6 | 185.1 | -0.5 | 0.03 | 0.573
  Social Studies Scale Score   | 185.1 (906) | 186.1 (644) | 185.0 | 185.8 | 0.8  | 0.05 | 0.323

(a) Numbers in parentheses are the number of students in each group having valid test scores from the baseline 2006 administrations and the Spring 2007 administrations.
Collection of Data about
Implementation Fidelity
Implications for Researchers and
Practitioners
What are our purposes for collecting implementation data?
1. To provide other districts with information about outcomes they might expect when implementing similar interventions with their struggling readers*
2. To set the context for understanding student outcomes

*Requires MCS to place the needs of the field above local concerns
Reasons to Collect “Double Data”
R180 evaluation is intended to test effects of a replicable intervention in the real world:
1. Without the support of external evaluators
2. In ways that emulate what districts will need to do to:
• monitor implementation
• obtain process feedback
Reasons to Collect “Double Data”
Collecting data about MCLA and R180 fidelity
• helps researchers explain patterns of impact findings
• can be useful in identifying predictors of outcomes
What Is the Role of the Researcher?
• RBS collects data about:
– Impact (MCLA & R180)
– Implementation fidelity
• To better understand impact or lack thereof
(MCLA & R180)
• To support development of MCLA (only)
– Counterfactual
• To compare effects to what would have happened
in SR schools in the absence of MSRP
What is the Role of MCS?
• Implement R180 & MCLA
• Monitor the implementation process
– Ensure implementation is “on model”
– Refine service delivery based on formative
data
Defining Implementation Fidelity:
MCLA
Innovation Configuration
Mapping
MCLA Implementation Framework
• Developing an Innovation Configuration (IC) Map
(Hall & Hord, 2006)
– Operationally defines levels of implementation fidelity among
clusters of “key active ingredients”
– Iterative process involving key stakeholders
• Development team (University of Memphis)
• Grantee (Memphis City Public Schools)
• Researchers (Research for Better Schools)
MCLA:
Roles & Responsibilities
MCS Administrators:
• Participate in Principal’s Fellowship
• Support recruitment and retention efforts
• Link MCLA w/School Improvement Plan
• Observe MCLA teachers
(once/marking period)
• Allocate space for CRC materials
• Protect/respect role of coach
Developer:
• Design MCLA curricula (for teachers & principals)
• Facilitate writing team activities
• Meet weekly with instructors (& coaches)
• Disseminate research about adolescent SR
MCLA Training

Provided by the Developer:
• 3-hour weekly principal meetings (fall; Year 1)
• 3-hour weekly teacher training sessions per content area (180 hours over 2 years)*
• PD for coaches in mentorship, urban education, and adolescent literacy

Provided by MCS (coaches):
• On-site observation of CAPs
• Model/co-teach strategies
• Feedback
• Supplemental resources

*has included coaches since spring 2007
MCLA Innovation Configuration Map Framework
Instrument Development
With the IC map guiding development, the following
measures were designed to collect data about MCLA
implementation:
• Surveys
– Teacher knowledge about & preparedness to use MCLA
strategies
– Teacher demographic characteristics
– Teachers’ MCLA Feedback
• Interviews
– Principals, coaches, development team, and MCS administrators
• Teacher Focus Group Discussions
Operationally defining components:
“Job Definition”
Aligning the IC Map and Instrument Development:
“Job Definition” – Teacher Survey
“Job Definition” - Principal Interviews
MCLA Innovation Configuration Map Framework
Where the rubber hits the
“runway”…
MCLA Classroom Implementation
Operationally defining components:
Implementation of Lesson Plans
Implementation of lesson plans:
Collecting classroom observation data
MSR-COP Data Matrix

Record interval start and end times:
Interval 1: __:__ – __:__
Interval 2: __:__ – __:__
Interval 3: __:__ – __:__
Interval 4: __:__ – __:__

For each interval, record:
• Instructional Mode(s)
• Literacy Strategy(ies)
• Cognitive Demand
• Level of Engagement
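One way to represent a completed row of the matrix in software, using the code sets defined on the following slides. The structure and example values are illustrative, not the actual MSR-COP instrument.

```python
# Sketch of one MSR-COP observation interval as a record.
# Field names and example codes are illustrative.
from dataclasses import dataclass

@dataclass
class ObservationInterval:
    start: str                 # e.g. "10:05"
    end: str                   # e.g. "10:20"
    instructional_modes: list  # mode codes, e.g. ["TM", "SGD"]
    literacy_strategies: list  # literacy activity codes, e.g. ["PT"]
    cognitive_demand: int      # 1 (Remember) .. 6 (Create)
    engagement: str            # "LE", "ME", or "HE"

# A hypothetical two-interval lesson observation:
lesson = [
    ObservationInterval("10:05", "10:20", ["TM"], ["PT"], 2, "HE"),
    ObservationInterval("10:20", "10:35", ["SGD"], ["SFA"], 3, "ME"),
]
print(len(lesson), lesson[0].engagement)  # 2 HE
```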
Instructional Mode Codes
AD   Administrative Tasks
A    Assessment
CD   Class discussion
DI   Direct, explicit instruction related to a literacy strategy
DP   Drill and practice (on paper, vocally, computer)
GO   Graphic organizer
HOA  Hands-on activity/materials
I    Interruption
J    Jigsaw
LC   Learning center/station
L    Lecture
LWD  Lecture with discussion/whole-class instruction
OOC  Out-of-class experience
RSW  Reading seat work (if in groups, add SGD)
RT   Reciprocal teaching
SGD  Small-group discussion
SP   Student presentation
TA   Think-alouds
TIS  Teacher/instructor interacting w/ student
TM   Teacher modeling
TPS  Think-Pair-Share
V    Visualization (picturing in one's mind)
WW   Writing work (if in groups, add SGD)
Cognitive Demand Codes
1 = Remember: Retrieve relevant knowledge from long-term memory (recognize, identify, recall)
2 = Understand: Construct meaning from instructional messages, including oral, written, and graphic communication (interpret, exemplify, classify, summarize, infer, compare, explain)
3 = Apply: Carry out or use a procedure in a given situation (execute, implement, use)
4 = Analyze: Break material into its constituent parts and determine how the parts relate to one another and to an overall structure or purpose (differentiate, organize, attribute, outline)
5 = Evaluate: Make judgments based on criteria and standards (check, coordinate, monitor, test, critique, judge)
6 = Create: Put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure (generate, hypothesize, plan, design, produce, construct)
Level of Engagement Codes
LE = low engagement, ≥ 80% of students off-task
ME = mixed engagement
HE = high engagement, ≥ 80% of students engaged
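Assuming the garbled thresholds above are "≥ 80%", the engagement coding rule can be sketched as a simple classifier over the observed fraction of students on-task; the boundary handling is an assumption.

```python
# Sketch: classify an interval's engagement level from the fraction of
# students on-task, per the LE/ME/HE thresholds above (assumed >= 80%).

def engagement_code(on_task_fraction):
    """Return 'LE', 'ME', or 'HE' for a fraction in [0, 1]."""
    if on_task_fraction <= 0.2:   # >= 80% of students off-task
        return "LE"
    if on_task_fraction >= 0.8:   # >= 80% of students engaged
        return "HE"
    return "ME"                   # everything in between is mixed

print([engagement_code(f) for f in (0.1, 0.5, 0.9)])  # ['LE', 'ME', 'HE']
```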
Implementation of lesson plans:
Collecting classroom observation data
LITERACY ACTIVITY CODES

VOCABULARY STRATEGIES
B    Bubble or double-bubble map
M    Mnemonic strategies
CC   Context clue
PT   Preteaching vocabulary
E    Etymology
SFA  Semantic feature analysis, maps, word grid
G    Glossary or dictionary use
WS   Word sorts
IW   Interactive word wall use

FLUENCY STRATEGIES
CR   Choral reading/whole group reading
RR   Repeated oral reading
LM   Leveled content materials
TRA  Teacher models/reads aloud passage
PB   Paired or buddy reading

COMPREHENSION STRATEGIES
PV   Previewing text
APR  Activate prior knowledge
CT   Connecting text to students' lives
RT   Retelling/summarizing with guidance
Q    Questioning for focus/purpose
GR   Retelling with graphics
MU   Monitoring understanding
OR   Oral retelling
QAR  Question-answer relationships/ReQuest (T.H.I.E.V.E.S., L.E.A.R.N., and S.E.A.R.C.H.)
REF  Reflection/metacognition
SGQ  Students generating questions

WRITING STRATEGIES
JU   Journal or blog use
SW   Shared writing
WR   Written retelling
MCLA: Implementation Barriers
Barriers:
• Limited development/planning time
• Need for coaches with disciplinary content knowledge
• Challenges in establishing a critical mass of enrolled teachers at
each school
• CRC materials not received until spring 2007
• Pressure to focus on TCAP test preparation (spring)
• Difficulty maintaining principal attendance at weekly meetings
MCLA: Planned Implementation
Changes
Changes:
• Adoption of CREDE (UC-Berkeley) JPA instructional model
• Reduction in the number of CAPs required of teachers
• Shortened class schedule/more intensive work with coaches
• Inclusion of special education teachers among those eligible to enroll
• Restructured Principal Fellowship (includes other school leaders; meets monthly)
Defining Implementation Fidelity:
R180
Rorie Harris
Memphis City Public Schools
Findings Related to Implementation
• Scheduling
– Scheduling 90 minute blocks in schools using the
Middle School concept is difficult. Teams of core
content teachers traditionally have 55 minute classes.
– Interruptions to the 90 minute block occur.
• Special Education Students
– READ 180 will only suffice as a SPED student’s
intervention if the teacher is SPED-certified.
Findings Related to Implementation
• Use of Technology
– Technology issues can negatively affect instructional
time.
• Parents & Students
– Some parents do not want their children in Reading
Intervention classes. They feel like this is a “label.”
– Classroom management issues impact instruction.
– Student mobility affects the scope and sequence of
reading instruction.
Findings Related to Implementation
• School Administration
– Without administrator “buy-in” to the importance of
smaller classes and protection of the 90 minute block,
fidelity is not supported.
• Read 180 Teachers
– It is challenging to encourage ALL teachers to engage
in on-line professional development and/or to attend
network meetings.
– Teacher turnover necessitates repeated initial training and hinders the development of teacher leaders.
Indicators of Read 180 Implementation
• Scholastic identifies several key program
aspects
– Teacher Training/Professional Development
– Computer Hardware/Software Use
– Use of Read 180 Materials
– Group Rotation
– Class Size
– Classroom Environment
– Student Engagement
Sources of Implementation Data
• Classroom observations during the school year
(Fall & Spring)
• Read 180 program databases (SAM)
• Enrollment and course-related data from district
databases
• Surveys administered to students (Fall & Spring)
and teachers (Spring)
• Information collected during professional
development programs
MCS Data Linked to Implementation
Indicators
MCS Data Source
Key Program Area
Completion of Scholastic RED
Course
•Teacher Training
Attendance at district-wide
Read 180 Network Meetings
•Teacher Training
Fall & Spring Classroom
Observations
•Computer Hardware & Software
Use
•Group Rotations
•Class Size
•Classroom Environment
•Use of Read 180 Materials
Enrollment Data
•Class Size
MCS Data Linked to Implementation
Indicators
MCS Data Source
Key Program Area
Student Usage Data from SAM
•Computer Hardware & Software
Usage
Student Surveys
•Classroom Environment
•Student Engagement
•Use of Read 180 Materials
Teacher Survey
•Computer Hardware/Software
Use
•Classroom Environment
•Group Rotations
•Use of Read 180 Materials
Overview of Year One
Conclusions
Jill Feldman, RBS
(Brief) Conclusions & Discussion
READ 180: No significant Year One student
impact
• Late startup
• (Most) students will receive two years of intervention
Planned Future Analyses:
• Three-level analyses planned to examine whether teacher
characteristics exert a moderating effect on student
outcomes
• Exploratory analyses of relationships between amount of
READ 180 instruction and effects on student outcomes
(Brief) Conclusions & Discussion
MCLA:
• Significant (moderate) impact on teachers’ frequency and
preparedness to use MCLA strategies
• No significant impact on students’ achievement in reading or
core content areas
Discuss:
– Subjectivity of measure (“Hawthorne Effect”)
– Teacher findings support program logic model
– Explore relationship between impact and participation in PD
Next Steps…
Planned Exploratory Analyses
• Re-run HLM impact analyses to test
effects of teacher variables on outcomes
– Preparedness and use of MCLA strategies
– Age
– Experience as teacher (& years at MCS)
– PD in year prior to MCLA
Planned/ongoing analyses
• Individual student’s growth over time
• Rerun HLM with student-level variables
– # MCLA teachers
– Student’s school attendance
• ITS analyses
– Using TCAP Spring 2003 & 2004 scores
• Correlating R180 data with TCAP & ITBS
– for possible use as covariates in HLM
Now It’s Your Turn
• Ask the panel
• Share your experiences
– Triumphs
– Tribulations
Thank you for joining us!
For additional information contact:
[email protected]