Transcript Document

The Research Design
Research for Better Schools
Philadelphia, PA
Jill Feldman, Ph.D., Director of Evaluation
What research questions will we ask
about MSRP impact?
1. Does MCLA affect core subject teachers’ knowledge and use of research-based literacy strategies?
2. What are the separate and combined effects of MCLA and Read 180 on students’ reading achievement levels, especially for students identified as struggling readers?
3. What are the separate and combined effects of MCLA and Read 180 on students’ achievement in core subjects, especially for students identified as struggling readers?
What outcome measures will we
use?
1. Iowa Test of Basic Skills (ITBS)
   • Vocabulary, Fluency, Comprehension
2. TCAP
   • Reading, Social Studies, Science, Mathematics
3. Gateway and End of Course Assessments
   • ELA, Mathematics, and Science
What research questions will we ask
about MSRP implementation?
1. To what degree do the implemented MCLA & R180
treatments match the intended program standards and
features?
2. What contextual district and school level factors may
be influencing the implementation of MCLA & R180?
3. How do the professional development events,
materials, or structures present in the control schools
compare to what is present in the treatment schools?
Research Design for MCLA
• 4 matched pairs of schools (N=8) randomly assigned to
treatment (MCLA) or control (no MCLA) condition
• Content area teachers in cohort 1 to participate in MCLA
for Years 1 and 2
• Control group teachers (cohort 2) to participate in MCLA
in Years 3 and 4
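The matched-pair randomization described above can be sketched as follows. This is an illustrative sketch only, not the study's actual assignment procedure; the school names and random seed are hypothetical.

```python
import random

# Within each of the 4 matched pairs, one school is randomly assigned
# to the treatment (MCLA) condition and its partner to control.
def assign_pairs(pairs, seed=42):
    rng = random.Random(seed)
    assignment = {}
    for school_a, school_b in pairs:
        treated = rng.choice([school_a, school_b])
        control = school_b if treated == school_a else school_a
        assignment[treated] = "MCLA"
        assignment[control] = "control"
    return assignment

pairs = [("School 1", "School 2"), ("School 3", "School 4"),
         ("School 5", "School 6"), ("School 7", "School 8")]
assignment = assign_pairs(pairs)  # 4 treatment schools, 4 control schools
```

Pairing before randomizing guarantees balance on whatever the schools were matched on, even with only eight schools.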
MCLA: Random Assignment of Schools
MCLA: Exploring Efficacy
Attempts to address questions about whether or
not MCLA can work
• Depends upon rapid turnaround of data collected
• Relies upon formative feedback to guide program
revisions
• Requires close collaboration among project stakeholders
  – To develop measures
  – To share information and data
  – To communicate regularly about changes and challenges
  – To troubleshoot and cooperatively address challenges
Research Design for Read 180™
• Random assignment of “eligible” students enrolled at 8 SR schools, where eligibility means:
  – No prior participation in READ 180™
  – Two or more grade levels behind in reading
  – Scores in bottom quartile on state assessment (TCAP)
• READ 180™ is the treatment
• Counterfactual (business as usual*) is the control
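The three eligibility criteria above amount to a simple screen over the student roster. A minimal sketch, assuming hypothetical field names (`prior_read180`, `grade_levels_behind`, `tcap_percentile`) that are not the study's actual data schema:

```python
# Hypothetical eligibility screen for READ 180 random assignment.
# A student qualifies only if all three criteria hold.
def is_eligible(student):
    return (not student["prior_read180"]
            and student["grade_levels_behind"] >= 2
            and student["tcap_percentile"] <= 25)  # bottom quartile on TCAP

roster = [
    {"prior_read180": False, "grade_levels_behind": 3, "tcap_percentile": 18},
    {"prior_read180": True,  "grade_levels_behind": 3, "tcap_percentile": 18},
    {"prior_read180": False, "grade_levels_behind": 1, "tcap_percentile": 40},
]
eligible = [s for s in roster if is_eligible(s)]  # only the first student qualifies
```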
Read 180: Random Assignment of
Students
Read 180: Exploring Effectiveness
Attempts to address questions about whether or
not Read 180 will work…
• Provides evidence about what happens when R180 is implemented “off the shelf” (without formative evaluation support)
• Requires MCS to set aside local need for feedback to
address questions of importance to field
• Establishes a one-way firewall between MCS and RBS
Please review the safety card in the
seat pocket…
1. Balance local knowledge of students’ needs within the identified “eligible pool” without creating selection bias
2. Address high rates of student mobility
3. Accurately describe the counterfactual
4. Obtain parental consent (and students’ assent) to administer the ITBS
5. Design procedures to prevent crossover
6. Deal with (inevitable) startup delays
Air Traffic Control:
Did Random Assignment Work?
Are the student groups comparable?
• Students eligible for READ 180™: N = 2,277
• Total students in 8 SR schools: N = 6,170
• Students eligible as % of total: 36.9%
• No differences in race, gender, ethnicity, or poverty level
between conditions
• Higher % of ELLs in control group (87 of 1,337 students,
or 6.5%) than in R180™ (35 of 940 students, or 3.7%)
• Higher % of Sp Ed 8th graders in R180™ group (28.2%)
vs control (20.9%)
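The comparability figures above can be reproduced directly from the raw counts reported on this slide:

```python
# Recomputing the reported percentages from the raw counts.
eligible, total = 2277, 6170
pct_eligible = 100 * eligible / total   # eligible students as % of total (36.9%)

pct_ell_control = 100 * 87 / 1337       # ELLs in control group (6.5%)
pct_ell_treatment = 100 * 35 / 940      # ELLs in R180 group (3.7%)
```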
What, how, and from whom should data be
collected?
• Use multiple measures and methods
  – Interview developers, instructors, coaches, & principals
  – Surveys of teacher knowledge and attitudes
  – Focus group discussions with teachers
  – Evaluator observations of PD sessions
  – Evaluator observations of classroom implementation
• Use data to challenge/confirm findings from single sources
• Share findings with key stakeholders to determine whether:
– data collected are appropriate to support decision making
– evaluation findings reflect actual experiences
– revisions to the logic model, IC map, and/or instruments are needed
Helen/Bob’s piece here…
The Flight Plan
The MCLA Program Logic Model
Memphis Content Literacy Academy Evaluation Logic Model

Inputs: Funding, staff, curriculum resource center, facilities, incentives, research materials

Activities
• Principals
  – Attend four three-hour principal fellowship sessions each year for two (or four?) years
  – Participate in motivational, recruitment, and celebratory events
  – Discuss MCLA at faculty meetings
  – Conduct walkthrough observations
  – Provide opportunities for teacher collaboration
  – Allocate space for CRC materials
• Teachers
  – Attend # weekly MCLA training
  – Develop and implement 8 CAPs per year?
  – Meet with coaches for feedback to improve implementation of MCLA strategies
  – Integrate use of leveled texts to support development of content literacy among struggling readers
• Students
  – Use MCLA strategies to read/react to content-related text (independently? In collaborative groups? Neither? Both?)

Outputs
• Principals
  – # hours of Principal Fellowship participation
  – # of MCLA events attended
• Teachers
  – # of hours of MCLA training attended
  – # hours of coaching (contacts)
  – # of CAPs implemented? Observed? Videotaped?
  – # of new lesson plans integrating literacy in content area lessons
  – # and type of materials checked out of CRC
• Students
  – # classes taught by teachers participating in MCLA
  – # MCLA strategies students learn
  – # (freq?) of MCLA strategy use

Short-term Outcomes
• Principals
  – Awareness of and interest in staff implementation of MCLA concepts and strategies
• Teachers
  – Increased knowledge of MCLA strategies
  – Improved preparedness to use research-based literacy strategies to teach core academic content
  – Increased use of direct, explicit instruction to teach research-based comprehension, fluency, and vocabulary strategies in content area classes
  – Integrated use of MCLA strategies to support development of content literacy
• Students
  – Increased familiarity with and use of MCLA strategies when engaging with text
  – Increased internalization of literacy strategies
  – Increased interest in school/learning

Long-term Outcomes
• Principals
  – Improved school climate
  – School-wide plans include focus on content literacy
  – Improved instructional leadership
• Teachers
  – Increased effectiveness supporting students’ content literacy development
  – Continued collaboration among community of teachers to develop and implement CAPs
• Students
  – Improved reading achievement and content literacy: 10% increase in students scoring proficient in Reading/LA and other subject areas of TCAP; mean increase of five NCEs on ITBS (comprehension? vocab?)

Higher Quality Teaching & Student Achievement
Defining what will be evaluated
Developing the MCLA Innovation Configuration (IC) Map
• Involve diverse groups of stakeholders
  – The development team
  – The implementation team (MCS administrators & coaches)
  – Experienced users
  – Evaluators
• Identify major components of MCLA
• Provide observable descriptions of each component
• Describe a range of implementation levels
MCLA: The Conceptual Framework
Wheels Up:
Resisting Premature Use of “Auto Pilot”
With the IC map guiding development, the following measures were designed to collect data about MCLA implementation:
• Surveys
– Teacher knowledge about & preparedness to use MCLA
strategies
– Teacher demographic characteristics
– Teachers’ MCLA Feedback
• Interviews
– Principals, coaches, development team, and MCS administrators
• Teacher Focus Group Discussions
Operationally defining components:
“Job Definition”
14.1) Role and responsibilities of the teacher with respect to literacy instruction (implementation levels a–e; the highest (a) and lowest (e) levels are shown)

Highest level (a):
– Job definition: When asked, teachers define their job as providing literacy instruction along with their content instruction.
– Content of lesson plans: Teachers’ lesson plans show how they plan to integrate instruction on literacy strategies with their instruction on subject matter content.
– Implementation of lesson plans: Observation of teachers’ lessons shows that they integrate instruction on literacy strategies with their instruction on subject matter content.

Lowest level (e):
– Job definition: When asked, teachers define their job as covering required subject matter content.
– Content of lesson plans: Teacher lesson plans only show how they plan to teach specific subject matter content.
– Implementation of lesson plans: Observation of teachers’ lessons shows that they only teach specific subject matter content.
Aligning the IC Map and Instrument Development:
“Job Definition” – Teacher Survey
Teachers rated each item on a four-point scale (Strongly Agree / Agree / Disagree / Strongly Disagree), thinking about MCLA classes THIS SEMESTER (Fall ’07):

7. It was hard to find the time to attend MCLA classes every week.
8. I believe using the strategies I learned in MCLA class will improve students’ understanding of important content.
9. The MCLA materials were linked to the district’s content standards.
10. My (main) MCLA class instructor modeled how to implement each new strategy, at least once, from beginning to end.
11. I used class handouts to plan classroom instruction.
12. There is not enough time to add the use of literacy strategies to the existing curriculum.
13. I was already familiar with much of the material covered in MCLA classes.
14. I am satisfied with my MCLA class experience overall.
15. I would be willing to contribute a videotape of my CAP implementation as a tool for use to train teachers.
16. I appreciate the chance to collaborate with colleagues in MCLA.
17. I found the Joint Productive Activities (JPA) to be useful.
18. My students would benefit from my use of JPAs during instructional time.
19. I am unsure how useful the MCLA literacy strategies will be for my students.
20. It has been difficult to equip my classroom with leveled reading materials given all my other responsibilities.
21. I would recommend MCLA classes to fellow teachers.
22. MCLA supports achievement of other important goals in my school’s improvement plan.
23. I look forward to resuming MCLA classes in the spring.
“Job Definition” - Principal Interviews
3. In your view, which staff are responsible for literacy instruction? [Probe: To what
extent do you expect content area teachers to address the literacy needs of struggling
readers?]
4. What are your school’s main student achievement improvement goals?
9. To the extent that you are familiar with MCLA, how connected/disconnected is the
Academy from your school’s current improvement plans? [Probe: Please describe
specific links (or disconnects) between MCLA and current improvement plans.]
10. When MCLA is implemented at your school, do you think it will require teachers to
do different things in addition to what is already expected of them? [Probe: If yes,
please describe whether the additional demands support or conflict with achievement of
other/more important priorities.]
11. Do you expect MCLA will have an effect on student learning?
14. What expectations, if any, do you have for teachers’ participation in MCLA next
year? [Probe: What percent of eligible teachers do you expect to enroll? Will specific
grade level, content area, or teams of teachers be encouraged to participate or will the
decision to enroll be left to the discretion of individual teachers?]
15. Thinking about next year, on a scale of 1 to 5 where “1” is not at all realistic, and
“5” is very realistic, how realistic is it to expect you to:
a. allow teachers to observe each other’s classes/share ideas? [Probe: If response is a 3 or higher, ask “how often?”]
b. discuss MCLA at faculty meetings?
c. attend the annual MCLA kick-off event in August?
d. attend MCLA teacher evening course sessions? [Probe: How many?]
e. attend celebratory events (e.g., the Laureate ceremony)?
f. discuss MCLA with enrolled teachers?
g. recruit new teachers to enroll in MCLA?
h. observe teachers’ use of literacy strategies? [Probe: How often?]
i. provide feedback to teachers about their use of MCLA strategies?
j. participate in all Fellowship activities? [Probe: What level of participation do you think is realistic?]
Where the rubber hits the
runway…
Classroom Implementation
Operationally defining components:
Implementation of Lesson Plans
14.1) Role and responsibilities of the teacher with respect to literacy instruction (implementation levels a–e; the highest (a) and lowest (e) levels are shown)

Highest level (a):
– Job definition: When asked, teachers define their job as providing literacy instruction along with their content instruction.
– Content of lesson plans: Teachers’ lesson plans show how they plan to integrate instruction on literacy strategies with their instruction on subject matter content.
– Implementation of lesson plans: Observation of teachers’ lessons shows that they integrate instruction on literacy strategies with their instruction on subject matter content.

Lowest level (e):
– Job definition: When asked, teachers define their job as covering required subject matter content.
– Content of lesson plans: Teacher lesson plans only show how they plan to teach specific subject matter content.
– Implementation of lesson plans: Observation of teachers’ lessons shows that they only teach specific subject matter content.
Implementation of lesson plans:
Collecting classroom observation data
MSR-COP Data Matrix
Record start and end times for each of four observation intervals (Interval 1 through Interval 4). For each interval, code:
• Instructional Mode(s)
• Literacy Strategy(ies)
• Cognitive Demand
• Level of Engagement
Instructional Mode Codes
AD   Administrative tasks
A    Assessment
CD   Class discussion
DI   Direct, explicit instruction related to a literacy strategy
DP   Drill and practice (on paper, vocally, computer)
GO   Graphic organizer
HOA  Hands-on activity/materials
I    Interruption
J    Jigsaw
LC   Learning center/station
L    Lecture
LWD  Lecture with discussion/whole-class instruction
OOC  Out-of-class experience
RSW  Reading seat work (if in groups, add SGD)
RT   Reciprocal teaching
SGD  Small-group discussion
SP   Student presentation
TA   Think-alouds
TIS  Teacher/instructor interacting w/ student
TM   Teacher modeling
TPS  Think-Pair-Share
V    Visualization (picturing in one’s mind)
WW   Writing work (if in groups, add SGD)

Cognitive Demand Codes
1 = Remember: Retrieve relevant knowledge from long-term memory (recognize, identify, recall)
2 = Understand: Construct meaning from instructional messages, including oral, written, and graphic communication (interpret, exemplify, classify, summarize, infer, compare, explain)
3 = Apply: Carry out or use a procedure in a given situation (execute, implement, use)
4 = Analyze: Break material into its constituent parts and determine how the parts relate to one another and to an overall structure or purpose (differentiate, organize, attribute, outline)
5 = Evaluate: Make judgments based on criteria and standards (check, coordinate, monitor, test, critique, judge)
6 = Create: Put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure (generate, hypothesize, plan, design, produce, construct)

Level of Engagement Codes
LE = low engagement (≥ 80% of students off-task)
ME = mixed engagement
HE = high engagement (≥ 80% of students engaged)
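A single coded observation interval can be represented as a small record. This is a hypothetical sketch using the codes above; the field names and values are illustrative, not the MSR-COP's actual data format.

```python
# One hypothetical MSR-COP interval record (illustrative schema only).
interval = {
    "interval": 1,
    "start": "09:05", "end": "09:20",
    "instructional_modes": ["DI", "SGD"],  # direct instruction; small-group discussion
    "literacy_strategies": ["GO"],         # graphic organizer
    "cognitive_demand": 3,                 # 3 = Apply
    "engagement": "HE",                    # high engagement
}

# Simple validity checks against the code lists above.
assert interval["cognitive_demand"] in range(1, 7)
assert interval["engagement"] in {"LE", "ME", "HE"}
```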
Implementation of lesson plans:
Collecting classroom observation data
4.2 Literacy Activity Codes

Vocabulary Strategies
B    Bubble or double-bubble map
M    Mnemonic strategies
CC   Context clue
PT   Preteaching vocabulary
E    Etymology
SFA  Semantic feature analysis, maps, word grid
G    Glossary or dictionary use
WS   Word sorts
IW   Interactive word wall use

Fluency Strategies
CR   Choral reading/whole group reading
RR   Repeated oral reading
LM   Leveled content materials
TRA  Teacher models/reads aloud passage
PB   Paired or buddy reading

Comprehension Strategies
PV   Previewing text
APR  Activate prior knowledge
CT   Connecting text to students’ lives
RT   Retelling/summarizing with guidance
Q    Questioning for focus/purpose
GR   Retelling with graphics
MU   Monitoring understanding
OR   Oral retelling
QAR  Question-answer relationships/ReQuest (T.H.I.E.V.E.S., L.E.A.R.N., and S.E.A.R.C.H.)
REF  Reflection/metacognition
SGQ  Students generating questions

Writing Strategies
JU   Journal or blog use
SW   Shared writing
WR   Written retelling
Please remain seated with your
seatbelts fastened…
• Timely turnaround of data summaries
• Team meetings to debrief/interpret findings
• Testing what you think you “know”:
  – Productive (& challenging) conversations
  – Data-driven decision making
  – Taking action
  – Following up (ongoing formative evaluation feedback)
• Elizabeth’s piece here
Complimentary Refreshments:
CRC Materials
Category                                        Resources Used (N=235)      %
National Geographic-Life Science/Human Body              45              19.1%
Social Studies - Various Materials                       25              10.6%
National Geographic-US History and Life                  30              12.8%
National Geographic-Earth Science                        19               8.1%
National Geographic-Life Science                         17               7.2%
Science - Various Materials                              15               6.4%
National Geographic-Math Behind the Science              21               8.9%
Professional Library                                     13               5.5%
National Geographic-Science Theme Sets                   13               5.5%
Mathematics - Various Materials                          10               4.3%
National Geographic-Social Studies Theme Sets             6               2.6%
National Geographic-Ancient Civilizations                 4               1.7%
National Geographic-Physical Science                      4               1.7%
Professional Development                                  3               1.3%
Science Matters/Visual Science Encyclopedia               3               1.3%
Science Theme Sets                                        5               2.1%
US Regions                                                2               0.9%
Percentage Distribution of Planned Coaching Activities Logged in Year 1 (N=4,233 entries logged)

Activity                               Frequency    Percentage
Coach’s administrative tasks             1,358         32.2
Conferencing with teachers                 716         17.0
Observation                                698         16.5
School administrative tasks                339          8.0
Collaborative teacher support              330          7.8
Coach’s professional development           303          7.2
Assisting teachers in class                138          3.3
Striving Readers evaluation tasks          138          3.3
Helping teachers prepare                    71          1.7
Modeling                                    59          1.4
Videotaping/other                           73          1.7
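The reported percentages can be cross-checked from the logged frequencies. Note that the eleven frequencies sum to 4,223, slightly below the stated total of 4,233; the reported percentages are consistent with the 4,223 sum:

```python
# Frequencies from the coaching-activity table, in table order.
freqs = [1358, 716, 698, 339, 330, 303, 138, 138, 71, 59, 73]
total = sum(freqs)  # 4,223 logged entries
pct = [round(100 * f / total, 1) for f in freqs]
# pct[0] corresponds to coach's administrative tasks (32.2%)
```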
Ground Transportation:
The Coaching Role
Trust between coach and teacher(s) is critical:
• To provision of CAP implementation support
  – Pre-conference meeting
  – CAP observation (co-teaching; modeling; videotapes for use to train teachers, coaches, and evaluators)
  – Post-observation conference
• To effective and strategic selection of CRC & supplemental resources
Avoiding Wind Shear…
Team’s unwavering commitment to helping
teachers support the success of struggling
adolescent readers
sum > individual parts
…and we have the data to prove it!
Across grade levels, the picture is the same…
8th Graders’ Reading Levels
School-wide comparisons with schools
nation-wide