
Building Evidence in Education: Conference for EEF Evaluators

11th July: Theory; 12th July: Practice. www.educationendowmentfoundation.org.uk

Panel session 2: All in the process

Pleasure and pain of process

July 2013

Introductions and purpose

Becky Clarkson: A look at some of the issues I’ve encountered with process evaluations, and hopefully some of the positive aspects.

Overview of the five transition RCTs

Chatterbooks – partner: academic at Coventry Uni; schools / pupils: 12 / 577; length of delivery: 10 weeks; who is actually delivering: researchers hired especially; what the intervention involves: 2 arms (1 enhanced), book club style meeting once a week.

R4R – partner: academic and cellist; schools / pupils: 6 / 421; length of delivery: 10 weeks; who is actually delivering: the cellist!; what the intervention involves: 10 min session once a week.

Vocab Enrichment – partner: Bolton Council; schools / pupils: not finalised; length of delivery: 6 months; who is actually delivering: normal teachers; what the intervention involves: ?

Speaking and Listening – partner: Greenford High; schools / pupils: not finalised; length of delivery: 8 months; who is actually delivering: TAs hired specifically; what the intervention involves: ?

One to one tuition – partner: Perry Beeches Academy; schools / pupils: not finalised; length of delivery: full school year; who is actually delivering: graduates hired specifically; what the intervention involves: five sessions per fortnight.

What the process evaluations consist of

• Observations of training sessions
• Observations of intervention sessions
• Interviewing deliverers
• Teacher log

The first pain

TIME

The second pain (the biggy really!)

RELATIONSHIPS
• non-contactable
• too interested
• over-friendly

The pleasures

• Personal interest
• Continuity

Questions or comments?

My contact details: [email protected]

Some Issues in Process Evaluation

Presentation to the EEF evaluators conference Caroline Sharp July 2013

What is a process evaluation?

Process evaluation: documents and analyses the development and implementation of a programme, assessing whether strategies were implemented as planned and whether expected output was actually produced.

Bureau of Justice Assistance (1997) quoted in EEF (2013)

What does process evaluation involve?

It:
• explores the implementation and/or development of a programme
• considers how the programme has been implemented and/or developed
• can assess whether the programme has been implemented and/or developed as planned
• considers which activities/outputs have been implemented.

Why implementation is important

Because we need to know why and how an initiative has achieved its outcomes and impact in order to draw valid conclusions and improve practice.

Interventions are rarely implemented as designed and, crucially, variability in implementation is related to variability in the achievement of expected outcomes

(Lendrum and Humphrey, 2012, p.635).


Some issues and challenges

• Process evaluation may be confused with qualitative evaluation
• It can be difficult to establish intended activities and outputs
• How to respond to new/developing initiatives?
• Difficult to achieve good measures of resources and costs
• How to deal with issues of fidelity versus responsiveness to context?
• How to make good judgements about scalability?

What helps?

• A clearly defined intervention (rather than a loosely defined ‘programme’)
• A more established programme model
• A clear logic model or Theory of Change
• Participation in key events (e.g. observing staff training; attending project meetings)
• Good communication between the programme managers and the evaluation team(s)
• Understanding the broader context of going to scale.

Process evaluation of Changing Mindsets: challenges and lessons

Changing Mindsets

• Primary school intervention (Year 5)
• Managed and delivered by Portsmouth University and partners
• In Portsmouth and Hampshire
• Testing theory of ‘fixed’ and ‘growth’ mindsets
• 2 separate interventions: Pupil; Teacher Inset
• RCT with tests in English, Maths, and Mindset measures

Process evaluation – general objectives

• To establish whether the programme was delivered as planned
• To identify:
  • factors which could explain why the intervention did/didn’t work
  • contextual reasons for variation
  • issues which should inform plans for future roll-out

Design of the process evaluation

• Include experiences and perspectives of participants (project workers, teachers, head teachers, project partners)
• Cover all key components of the intervention
• Keep to the ‘light touch’ EEF principle
• Use project’s own evaluation

Process evaluation of Changing Mindsets – challenges

• Not over-burdening the project or participants
• Gaining cooperation without jeopardising the project
• Avoiding confusion between project team and evaluator team
• Fitting in with the project: timing and place
• Keeping up with changes to the project
• Staying objective

Process evaluation: some lessons from Changing Mindsets

• Work with the project team
• Don’t duplicate data collection – use project’s own evaluation
• Keep it ‘light’ – don’t over-collect: targeted visits & observations; keep interviews short & focused
• Flexible methods – phone rather than face to face, email responses