Transcript Document

Session 3: Analysis and reporting
• Collecting data for cost estimates: Jack Worth (NFER)
• Panel on EEF reporting and data archiving: Peter Henderson, Camilla Nevill, Steve Higgins and Andrew Bibby

Data collection for cost estimates
Jack Worth, NFER
EEF London Evaluators Workshop, 6th June 2014
Summary
• Reporting cost is an important part of evaluations
• Collecting the right information can be theoretically and practically challenging
• Evaluators should be sharing experiences and best practice
[Diagram: effectiveness → cost effectiveness]
Costs to consider collecting
• Direct costs – how much would it cost a school or parent to buy the intervention?
  – note that how much it did cost may differ
• Indirect costs – do staff have to put in more hours than usual?
  – always think: what is the counterfactual?
• Needs to be quantitative information (see the sketch below)
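A minimal sketch of how these ideas might be quantified. All figures are hypothetical, for illustration only; the counterfactual adjustment simply subtracts the hours staff would have worked anyway:

```python
def indirect_cost(hours_with_intervention: float,
                  hours_under_counterfactual: float,
                  hourly_staff_cost: float) -> float:
    """Cost of additional staff time, net of the counterfactual.

    Only hours beyond what staff would have worked anyway count
    towards the intervention's cost.
    """
    additional_hours = hours_with_intervention - hours_under_counterfactual
    return max(additional_hours, 0) * hourly_staff_cost

direct_cost = 1_200.00              # assumed purchase price a school would pay
indirect = indirect_cost(
    hours_with_intervention=40,     # hours staff spent delivering it (assumed)
    hours_under_counterfactual=10,  # hours they would have spent anyway (assumed)
    hourly_staff_cost=25.0,         # assumed cost of staff time per hour
)
total_cost = direct_cost + indirect
print(f"Total cost to the school: £{total_cost:,.2f}")
```

In practice the hours and rates would come from the surveys or case studies discussed on the next slide, not from assumed values.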
Collecting cost information
• Impact or process evaluation?
  – cost effectiveness relates to impact...
  – ...but process methods, e.g. surveys and case studies, are often better suited to collecting cost data
• Planning and communication
  – across the evaluation project team
  – with the development partner
Reporting cost information
• Present the average cost to compare with average effectiveness
• Cost per pupil or per school?
  – may depend on the specific intervention
  – cost per pupil is comparable to other interventions
• Present the assumptions made and the data sources used (a per-pupil costing sketch follows below)
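A minimal sketch of per-pupil cost reporting under stated assumptions; the school costs and pupil counts below are invented for illustration:

```python
# Illustrative only: three hypothetical schools with assumed total
# intervention costs (direct + indirect) and pupil counts.
schools = [
    {"total_cost": 1_950.0, "pupils": 55},
    {"total_cost": 2_400.0, "pupils": 60},
    {"total_cost": 1_700.0, "pupils": 48},
]

total_cost = sum(s["total_cost"] for s in schools)
total_pupils = sum(s["pupils"] for s in schools)

# Cost per pupil is comparable across interventions of different scales;
# cost per school may suit whole-school interventions better.
print(f"Average cost per school: £{total_cost / len(schools):,.2f}")
print(f"Average cost per pupil:  £{total_cost / total_pupils:,.2f}")
```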
Sharing best practice
• Recommend agreeing principles for a common approach among evaluators
• Questions?
NFER provides evidence for excellence through its independence and insights, the breadth of its work, its connections, and a focus on outcomes.
National Foundation for Educational Research
The Mere, Upton Park
Slough, Berks SL1 2DQ
T: 01753 574123
F: 01753 691632
E: [email protected]
www.nfer.ac.uk
EEF reporting and data archiving
Peter Henderson (EEF)
Camilla Nevill (EEF)
Steve Higgins (Durham) - Chair
Andrew Bibby (FFT)
The reporting process and publication of results on EEF's website
Peter Henderson (EEF)
Reporting process
1. Evaluation team submits report to EEF by the date set out in the evaluation contract.
   – Please submit your reports on time. If for any reason you do not think it will be possible to submit your report on time, please inform the EEF as soon as possible. Late submission of reports may affect future evaluation awards.
2. EEF will check the report has the required sections, following the CONSORT template.
   – If there are clear omissions, reports are returned to evaluation teams.
3. EEF sends the report for peer review.
4. Peer review comments are sent to the evaluation team.
5. Evaluation team responds to comments from the peer reviewer and returns the report to the EEF.
6. EEF sends the report to the developer for comments, and edits the report.
7. Edited report is sent to the evaluator, incorporating developer comments where appropriate.
8. Evaluation team and EEF work together to agree the final version of the report.
9. Report ready for publication.
10. Report published.

Evaluation team notes:
– Peer review will be undertaken by members of the Evaluation Panel or the EEF Evaluation Advisory Group.
– The aim of this stage is to ensure that the evaluation report is as accessible and clear to practitioners as possible.

Dissemination team notes:
– Reports will be published in batches at several points per year.
Classifying the security of findings from EEF evaluations
Camilla Nevill (EEF)
Group                   Number of pupils   Effect size         Estimated months' progress
Literacy intervention   550                0.10 (0.03, 0.18)   +2
www.educationendowmentfoundation.org.uk/evaluation
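A minimal sketch of how a row like the one above can be produced: a pooled-SD standardised effect size with a normal-approximation 95% confidence interval, plus a conversion to months' progress. The group statistics are invented, and the month thresholds are an assumed approximation of EEF's published conversion table, not the official version:

```python
import math

# Invented group statistics for illustration only.
n_t, mean_t, sd_t = 275, 101.2, 12.0   # intervention group
n_c, mean_c, sd_c = 275, 100.0, 12.0   # control group

# Pooled standard deviation and a Cohen's-d-style effect size.
pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
d = (mean_t - mean_c) / pooled_sd

# Normal-approximation standard error and 95% confidence interval.
se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
ci_low, ci_high = d - 1.96 * se, d + 1.96 * se

# Assumed thresholds approximating the EEF's effect-size-to-months
# conversion; check the Toolkit technical appendix for the official table.
thresholds = [0.02, 0.10, 0.19, 0.27, 0.36, 0.45, 0.53, 0.62, 0.70, 0.79, 0.88, 0.96]
months = sum(1 for t in thresholds if d >= t)

print(f"Effect size: {d:.2f} ({ci_low:.2f}, {ci_high:.2f})  ≈ +{months} months")
```

Note the interval here is wider than the slide's (0.03, 0.18); a real EEF analysis would typically tighten it through covariate (e.g. prior-attainment) adjustment.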
Evidence strength
Example Appendix: Chatterbooks

Rating   1. Design                                       2. Power (MDES)   3. Attrition
5        Fair and clear experimental design (RCT)        < 0.2             < 10%
4        Fair and clear experimental design (RCT, RDD)   < 0.3             < 20%
3        Well-matched comparison (quasi-experiment)      < 0.4             < 30%
2        Matched comparison (quasi-experiment)           < 0.5             < 40%
1        Comparison group with poor or no matching       < 0.6             < 50%
0        No comparator                                   > 0.6             > 50%

4. Balance: ranges from well-balanced on observables (rating 5) to imbalanced on observables (rating 0).
5. Threats to validity: ranges from no threats to validity (rating 5), through some threats, to significant threats (rating 0).
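A minimal sketch of how the power and attrition criteria above might be checked, using the standard two-arm approximation MDES ≈ M·sqrt(1 / (P(1−P)·N)) with M ≈ 2.8 for 80% power at two-sided α = 0.05. The sample sizes are hypothetical, and the formula ignores clustering and covariate adjustment, which a real EEF power calculation would include:

```python
import math

def simple_mdes(n_total: int, p_treated: float = 0.5, m: float = 2.8) -> float:
    """Approximate minimum detectable effect size for a two-arm,
    individually randomised trial (no clustering, no covariates).

    m ≈ 2.8 combines the z-values for 80% power and two-sided alpha = 0.05.
    """
    return m * math.sqrt(1.0 / (p_treated * (1.0 - p_treated) * n_total))

def attrition_rate(n_randomised: int, n_analysed: int) -> float:
    """Share of randomised pupils missing from the final analysis."""
    return 1.0 - n_analysed / n_randomised

# Hypothetical trial: 550 pupils randomised 50:50, 500 in the final analysis.
mdes = simple_mdes(n_total=550)
attr = attrition_rate(n_randomised=550, n_analysed=500)
print(f"MDES ≈ {mdes:.2f}  (criterion: < 0.3 for a rating of 4)")
print(f"Attrition = {attr:.0%}  (criterion: < 10% for a rating of 5)")
```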
Combining the results of evaluations with the meta-analysis in the Teaching and Learning Toolkit
Steve Higgins (Durham)
Archiving EEF project data
Andrew Bibby
Prior to archiving…
1. Include permission for linking and archiving in consent forms
2. Retain pupil identifiers
3. Label values and variables
4. Save syntax (SPSS) or do (Stata) files
A short sketch of steps 2 and 3 follows below.
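A minimal sketch of steps 2 and 3 in Python/pandas; the column names (e.g. "upn" as the retained pupil identifier), value labels, and file names are hypothetical examples, not EEF archive requirements:

```python
import pandas as pd

# Hypothetical trial dataset; "upn" stands in for the pupil identifier
# retained to allow later linking (e.g. to administrative data).
df = pd.DataFrame({
    "upn": ["A123", "B456", "C789"],
    "group": [1, 0, 1],              # numeric codes, unlabelled so far
    "posttest": [102.5, 98.1, 104.0],
})

# Label values: map numeric codes to meaningful categories.
df["group"] = pd.Categorical.from_codes(
    df["group"], categories=["control", "intervention"]
)

# Label variables: keep a data dictionary alongside the dataset.
data_dictionary = {
    "upn": "Unique pupil number (identifier retained for linking)",
    "group": "Randomised allocation",
    "posttest": "Post-test raw score",
}

df.to_csv("trial_data.csv", index=False)
pd.Series(data_dictionary, name="label").to_csv("data_dictionary.csv")
```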