
“She Was My Backbone”: Measuring the Impact of Literacy Coaching

Kelly Feighan, Research for Better Schools; Dr. Elizabeth Heeren, Memphis City Schools

http://www.ed.gov/programs/strivingreaders/awards.html

Grantee: Memphis City Schools, Memphis, Tennessee
Total Grant Award: $16,074,687

Memphis' Striving Readers project is designed to test the efficacy of the Memphis Content Literacy Academy (MCLA) professional development model for improving reading achievement and content literacy in high-need urban middle schools serving grades 6-8. All content teachers in Striving Readers treatment schools are eligible for participation in MCLA, which includes:
1. University coursework (two years; 12 hours of upper-division college credit)
2. Support from a site-based literacy coach
3. Access to differentiated instructional materials

Challenges in Hiring/Training Literacy Coaches

1. No certification available in the state of Tennessee (or many other states)
2. Standards for coaching not set (we used IRA and NCTE as a guide)
3. Roles are undefined and differ depending on context

Qualifications We Used for Hiring

• 5+ years of successful middle school teaching
• Advanced degree (Master's+)
• Experience with literacy
• Experience with professional development
• Various content areas
• Principal recommendations

Coaching Cycle

Coaches Report on Their Roles in the Schools

• Coaches consistently report their role as providing support, advocating for teachers, and modeling lessons.

• All coaches feel part of the “school family.”
• The coaching experience has improved with time.
• Main challenge: limited time to observe CAP implementation between assignments.

The data collection tool was designed by the team after one year of coaching experience under the grant. Slight modifications were made after initial use, and the tool has now been used for three years.

Coaches Establish Rapport with Teachers

95.2% of 62 teacher survey respondents reported “I can confide in my coach.”

• Trust between the coach and teacher(s) is critical:

• To the provision of CAP implementation support
  – Pre-conference meeting
  – CAP observation (videotapes used to train teachers, coaches, and evaluators; co-teaching; modeling)
  – Post-observation conference
• To the effective and strategic selection of CRC & supplemental resources

“She has been so available for me as a literacy coach… I don’t know what I would have done without her assisting me…we’re just basically working as a great team together.” (MSRP Focus Group Report, 2007)

The Coaching Cycle:
1. Teacher attends CAP modeling or discussion
2. Teacher discusses lesson plan with coach
3. Coach observes teaching rehearsal
4. Debrief/revise lesson plan
5. Coach observes performance teaching
6. Final debrief

Equipping middle and high schools with trained literacy coaches is at least one line of attack to combat “the quiet resignation that seems to pervade education circles…that little if anything can be done” (Joftus, 2002, p. 1).

Content Teachers Support Literacy Strategies

“I think literacy is so important because no matter what they do or where they go they are going to run into something they have to read… if they have to apply for a job, it requires them to be able to read…” (2007)

Purpose of Our Study

• Implementation: What coaches do
• Impact: Effect on pedagogy; effect on student achievement

Implementation Questions

• What daily tasks do literacy coaches typically perform at the middle school level?

• How much of their time is spent involved in substantive tasks that support teacher practice?

• To what extent do teachers perceive coaching services as beneficial?
• What are some of the challenges that coaches face?

Impact Questions

• To what extent has working with a coach improved teachers’ pedagogy?
  – Do those who received literacy coaching report higher frequency of strategy use?
  – How prepared do teachers feel to use literacy strategies in their content classes?
• The $64,000 question: To what extent has literacy coaching for teachers increased students’ academic achievement?

Data Sources
• Measuring what coaches do:
  – Coaches’ daily activity logs (N = 847)
  – Teacher surveys (three waves: N = 48, 62, and 54)
  – Teacher focus groups (four waves, 30 sessions in all)
  – Coach interviews (four waves with six coaches)

Data Sources
• Measuring teacher impact:
  – Baseline and follow-up surveys: MCLA completers and control group
  – Focus group interviews
  – Program feedback surveys (three waves)
  – Follow-up checklist six months after the program ended
• Measuring student impact:
  – TCAP reading (Spring 2007 and Spring 2008)
  – ITBS reading (Spring 2007 and Spring 2008)

Coach Log Analysis

• Coaches’ logs represented from 52% to 86% of their 190-day work year
• We entered a total of 5,791 individual records from 847 daily activity logs
• Tasks fell into 12 overarching categories, including observing, modeling, helping a teacher prepare for class, and administrative tasks or school-related activities
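As a rough illustration of how daily log records can be rolled up into these categories, the minimal sketch below tallies logged minutes by task category. The file name and column names are assumptions made for illustration; they are not the project’s actual data layout or analysis code.

```python
# Hypothetical sketch: summarizing coach activity logs by task category.
# The file name and column names ("coach", "category", "minutes") are
# illustrative assumptions, not the actual MSRP log format.
import pandas as pd

logs = pd.read_csv("coach_daily_logs.csv")  # one row per logged activity

# Total minutes and share of logged time per overarching category
minutes_by_category = logs.groupby("category")["minutes"].sum()
share_of_time = (minutes_by_category / minutes_by_category.sum() * 100).round(1)

print(share_of_time.sort_values(ascending=False))
```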

Time Spent with Teachers

• Two months into the 2007-08 school year, almost 60 percent of respondents reported that they had met with their coaches more than four times.
• By spring 2008, three-quarters (75.9%) reported that they had met with their coaches more than four times.
• These figures are corroborated by data in the coaching logs.

Coaching “Dosage”

• Analyses of logs showed that all (100%) MCLA completers in two schools received high levels of coaching assistance
• Approximately three-quarters (76.9%) received high levels of coaching at the third school
• One-third (35.7%) of MCLA completers received high levels of coaching at the fourth school

Teacher Surveys: Fall 2006, Fall 2007, and Spring 2008

Data source: RBS MCLA Feedback Surveys

Nature of Coach-Teacher Collaboration: Focus Group Findings (Wave One)

Teachers shared universally positive perceptions of coaching support. As one teacher stated:

“She’s there. She’s not intrusive. She never comes off as being a judge, a threat. If she comes in, and if she sees something that wasn’t going like it should, she would offer advice, tips, as opposed to ‘Well that wasn’t right’ and leave. She would say ‘Maybe you should try something like this.’ ”

Despite initial growing pains related to scheduling issues, coaches were highly valued:

“My coach tends to be hard to find sometimes… But she’s very helpful I’ve always found… I’ve had some struggles… being a first-year teacher, and she took time out to help me plan a different lesson altogether, trying to figure out how to teach them, how to write better sentences… Actually it’s been a great resource even though it does seem she’s stretched a little too thin.”

Second Wave of Focus Groups

• Although teachers strongly praised their coaches, few accepted the coach’s offer to model lessons because they did not feel they needed it. One science teacher stated:

“She did ask… but I told her ‘No, no. Just go over what I need to do and I’ll take care of it.’ ”

• A few teachers in a mathematics focus group said that although their coach made them feel comfortable, they did not “need” her to model a lesson because “she would explain it so well in class.”

Third Wave of Focus Groups

• Strong praise for coaches: very helpful, approachable, and committed to helping teachers succeed
• Teachers said coaches “went out of their way” to supply them with needed materials and resources, and cited benefits from observation feedback

“She has been so available for me as a literacy coach… I don’t know what I would have done without her assisting me… We’re just basically working as a great team together.”

Third Wave of Focus Groups

• Satisfaction with the coach’s accessibility increased

“I’m very impressed with [the coach] this year. Last year we were trying to figure each other out (or her role or my role or something), but this year, dynamite.”

“Well, every time I’d look for her last year, she wasn’t around. I’d ask for something and I couldn’t get it, but this year she is.”

Fourth Wave of Focus Groups

• All nine focus groups held positive views about their coaches and characterized them as very helpful
• Advice to others expecting MCLA at their school: avail yourself of the literacy coach’s services

“Basically, the literacy coaches are there to help you and sometimes we as teachers, as secondary teachers, we don’t like to open our classrooms up to other people to come in and show us things.”

Focus Group Summary

• Across the 30 focus group sessions, most respondents described their coach as someone who “goes that extra mile” to provide assistance and who showed understanding and patience
• Many teachers shared examples of coaches’ dedication
• Initial concerns about accessibility and scheduling conflicts dissipated over time

Coaches’ Challenges

• Helping teachers to see that literacy strategies were not “add-ons”
• Limited opportunities to see student data in schools with less principal support
• Learning to mentor on the job
• Fitting some literacy activities in with mathematics

Impact on Teachers

• This study examines matched baseline and follow-up survey data for 30 MCLA teachers and 34 control group teachers
• Teachers were asked how prepared they felt to use, and how frequently they used, 24 literacy strategies

Levels of “Preparedness” and Strategy Use

• No baseline differences in mean responses about preparedness on 23 of 24 items
• Only one difference emerged: MCLA teachers had a higher mean response (3.80) than control teachers (3.12) on how prepared they felt to have students read aloud for at least five minutes per period (F = 4.82, df = 62, p < .05)
• No baseline differences between the control and MCLA groups with respect to reported frequency of strategy use
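For readers who want to replicate this kind of baseline group comparison, a minimal sketch of the F test (one-way ANOVA) using scipy appears below; the ratings shown are placeholders, not MSRP survey data.

```python
# Hypothetical sketch: one-way ANOVA (F test) comparing MCLA and control
# group mean responses on a single preparedness item (e.g., a 1-5 scale).
# The values below are placeholders, not actual survey data.
from scipy import stats

mcla_ratings = [4, 5, 3, 4, 4, 5, 3, 4]
control_ratings = [3, 3, 4, 2, 3, 4, 3, 3]

f_stat, p_value = stats.f_oneway(mcla_ratings, control_ratings)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```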

Changes Over Time

• Paired t tests showed a significant increase in mean responses for both the MCLA and control groups on most preparedness items
• ANOVA results showed significant differences in frequency of strategy use, favoring the MCLA group, for:
  – Showing relationships with graphic organizers
  – Establishing a purpose for reading text
  – Modeling the use of thinking maps
  – Using cooperative learning groups
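A comparable sketch for the within-group change over time uses a paired t test on each teacher’s matched baseline and follow-up responses; again, the values are placeholders rather than MSRP data.

```python
# Hypothetical sketch: paired t test on matched baseline/follow-up
# preparedness responses for the same teachers. Placeholder values only.
from scipy import stats

baseline  = [3, 2, 4, 3, 3, 2, 4, 3]
follow_up = [4, 3, 4, 4, 3, 3, 5, 4]

t_stat, p_value = stats.ttest_rel(baseline, follow_up)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```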

Six-month Follow-up: Fall 2008

• Surveys distributed to schools no longer participating in MCLA asked teachers whether they had engaged in five specific literacy activities in the past week
• Forty-two respondents had completed at least one semester of MCLA
• 83 percent of these respondents identified an MCLA activity they had used in the past week

Data source: Fall 2008 RBS WIS Checklist

Student Impact

• This study includes 3,612 students with baseline and follow-up TCAP test scores
• 1,830 students were linked to the 30 MCLA teachers
• 1,782 students were linked to the 34 control teachers
• The baseline mean number of items correct was higher in control schools than in MCLA schools (F = 5.44, df = 3411, p < .05), but scale scores were not significantly different

TCAP Baseline Reading Scores

                                              CONTROL (N=1,667)   MCLA (N=1,745)
Mean scale score, ’06-07 [SD]                 507.52 [32.5]       504.24 [31.8]
Mean number of items correct, ’06-07* [SD]    36.68 [11.0]        35.25 [10.4]
Mean performance level                        2.02                1.98

* F = 5.44, df = 3411, p < .05

TCAP Follow-up Scores (N=3,520)

                                              CONTROL (N=1,735)   MCLA (N=1,785)
Mean scale score, 2007-08 [SD]                518.19 [33.4]       514.21 [33.4]
Mean number of items correct, ’07-08* [SD]    36.11 [10.48]       34.66 [10.05]
Mean performance level [SD]                   2.08 [.56]          2.02 [.55]

* F = 4.97, df = 3518, p < .05

ITBS Scores

                                              CONTROL          MCLA
Number of students with baseline score        N = 899          N = 963
Total Reading NCE mean score, Spr. ’07 [SD]   29.2 [15.76]     26.6 [15.32]
Number of students with follow-up score       N = 759          N = 802
Total Reading NCE mean score, Spr. ’08 [SD]   31.3 [16.84]     28.3 [15.83]

Overall Student Achievement Results
• Although TCAP scores were higher among control students, the magnitude of the difference is very small: two points
• ITBS scores followed the same pattern
• Initial ANOVAs and linear regression results did not show a positive MCLA impact on test scores; however, more variables must be added to the model
• Did MCLA teachers have more challenging students? Were there more behavioral problems in MCLA schools?
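One common way to add such variables is a regression that controls for baseline achievement and other student or school characteristics. The sketch below, using statsmodels, is only an illustration of that approach under assumed column names ("tcap_2008", "tcap_2007", "mcla", "grade", "school"); it is not the project’s actual model.

```python
# Hypothetical sketch: OLS regression of follow-up TCAP scale scores on
# MCLA treatment status, controlling for baseline scores, grade level,
# and school fixed effects. Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_scores.csv")  # one row per student

model = smf.ols(
    "tcap_2008 ~ tcap_2007 + mcla + grade + C(school)",
    data=students,
).fit()
print(model.summary())
```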

Qualitative Findings about Student Impact
• Teachers were generally optimistic that using literacy strategies would improve student achievement

“My students have definitely improved a lot. I have a couple of students who haven’t, but I’ve had students who’ve already, like, over two years of growth by their mid-year assessment...”

“I think some of the strategies have given them– they want to do things. They’re not as apprehensive as they once were, especially when it comes to fluency.”

Qualitative Findings about Student Impact
• On average, teachers felt that learning the literacy strategies helped most students to read better; however, several expressed concern that “nonreaders” needed additional help

“Some have improved, but if they are nonreaders, they’re still nonreaders. It did not help. But those that were struggling, it gave them a different avenue to use, a different method, a different strategy.”

Conclusions
• Although student-level results showed no effect of MCLA on academic performance in one year, teacher findings suggest increased use of literacy strategies
• Findings mirror those of other educational researchers who have examined the impacts of coaching models on teaching and learning (Murray, Ma, and Mazur, 2008)

Contact Us:

• Kelly Feighan: [email protected]

• Elizabeth Heeren: [email protected]

• Or visit our website at http://www.rbs.org/msrp