Crime Prevention: Intervention and Evaluation


Chris Fox
Dan Ellingworth
Applied Policy and Practice Research Unit
‘Modern policy making’
 Forward looking: Defining policy outcomes and taking a long term view
 Outward looking: Taking account of the national, European and international situation; learning from experience of other countries; recognising regional variations.
Innovative, flexible: Questioning established ways of dealing with things, encouraging
new and creative ideas, identifying and managing risk.
Joining up: Joining up the work of different government departments; ensuring that
implementation is part of the policy process.
Inclusive: Consulting those responsible for implementation and those affected by the
policy; carrying out an impact assessment
Evidence based: Basing policy decisions and advice upon the best available evidence;
ensuring evidence is available in an accessible and meaningful form.
Evaluated: Systematic evaluation of the effectiveness of policy.
Reviews: Policy constantly reviewed to ensure it is dealing with problems it was
designed to solve.
Lessons learned: Learning from experience of what works and what does not
Cabinet Office (1999) Professional policy making for the twenty-first century London: Cabinet Office
What is evidence-based policy?
• Evidence-based policy (EBP). An approach that:
– “helps people make well informed decisions about policies,
programmes and projects by putting the best available
evidence from research at the heart of policy development
and implementation” (Davies 1999 quoted in Davies 2004)
• Contrasted with opinion-based policy which:
– “relies heavily on either the selective use of evidence (e.g.
on single studies irrespective of quality) or on the untested
views of individuals or groups, often inspired by ideological
standpoints, prejudices, or speculative conjecture.” (Davies
2004)
Opinion-based policy
Taken from Chalmers, I. (2003) ‘Campbell and Cochrane: the need for generosity of spirit and mutual support’ Jerry Lee Lecture: 3rd Annual Campbell Colloquium
What factors influence policy?
Taken from Davies, P. (2004) ‘Is evidence-based government possible?’ Jerry Lee Lecture 2004
An evaluation framework
 Should it work? (Theory of change)
 What is the underlying ‘theory of change’ which explains why the
intervention will make an impact?
 Can it work? (Implementation evaluation)
 Has the project been properly implemented? What were the
challenges to implementation and how were they overcome?
 Does it work? (Impact evaluation)
 What is the impact of the intervention?
 Is it worth it? (Economic evaluation)
 What are the resource implications of implementing the
intervention and what benefits will it deliver?
Adapted from Haynes, B. (1999) BMJ 319: 652-653 (11 September)
Overview of possible methods
The original slide maps each of the following methods onto the evaluation questions it can address (Should it work? Can it work? Does it work? Is it worth it?):
Economic analysis
Surveys of those affected
Interviews with service users
Capture / analysis of case files or other administrative data
Capture / analysis of monitoring data
Observation of key processes
Staff interviews
Review of international research literature
Interviews with key stakeholders
Review of project literature
Should it work? Theories of
change
 Sometimes programmes and projects:
 don’t have clear aims or change their aims over
time
 are ‘shoehorned’ into a set of funding criteria to
access that funding
 don’t have clearly defined processes and
structures
 are implemented in a multi-agency context where
different agencies will have different ideas about
what the project is trying to achieve
Theories of change
 “What is the conceptual link from an
intervention's inputs to the production of
its outputs and, subsequently, to its
impacts on society in terms of results and
outcomes?”
(United Kingdom Evaluation Society Glossary of evaluation terms http://www.evaluation.org.uk/Pub_library/Glossary.htm)
Scared straight
 Started as US programme in 1970s
 Target group: At risk or delinquent young people
 Prison visits including tour and confrontational meeting
with serving prisoners
 Graphic depiction of life in prison
 TV documentary in US extols its virtues
 Also used in US
Two possible theories of change
Visit to a prison by juveniles → First-hand experience of prison life → Exposure to prison life and prisoners as negative role models → Frightens or scares juveniles away from crime → Reduces crime and offending

or . . .

Visit to a prison by juveniles → First-hand experience of prison life → Exposure to prison life and prisoners as positive role models → Stimulates or attracts juveniles towards crime → Increases crime and offending
Can it work? Process evaluation
 Identify whether the intervention has been
implemented as intended
 Explore how a service or policy is delivered
and experienced in practice
 Identify the mechanisms by which it can
produce the desired effects, potential barriers
and facilitators
 Identify circumstances under which successful
operation might be replicated
Scurvy
 James Lind (1716 – 1794): Ship’s Surgeon
 Lind selected 12 men, all suffering from scurvy
 Divided them into six pairs, giving each group different
additions to their basic diet.
 Cider
 Seawater
 A mixture of garlic, mustard and horseradish
 Spoonfuls of vinegar
 Oranges
 Lemons.
 Those fed citrus fruits experienced a remarkable recovery
 In 1753, he published ‘A Treatise of the Scurvy’
http://www.bbc.co.uk/history/historic_figures/lind_james.shtml
Does it work? Impact evaluation
Impact evaluation designs can be ranked on a five-point scale of methodological rigour (Scale 5 = strongest, Scale 1 = weakest), each with its necessary conditions:
Scale 5 – Randomised controlled trial: random assignment and analysis of comparable units to programme and comparison groups. (Necessary condition: ability to randomly assign participants/non-participants to projects.)
Scale 4 – Matched pairs: a comparison between multiple units with and without the intervention, or using comparison units that evidence only minor differences. (Necessary condition: close match between the group of programme participants and non-participants.)
Scale 3 – Multivariate model: a comparison between two or more comparable units of analysis, one with and one without the intervention, where there are differences in the relevant characteristics of the units. (Necessary condition: existence of a comparable group of non-participants.)
Scale 2 – Before/after or time series analysis: temporal sequence between the intervention and the measure. (Necessary condition: ability to measure before and after the intervention.)
Scale 1 – Correlation: correlation between an intervention and a measure at a single point in time. (Necessary condition: availability of data to show correlation.)
Sherman et al. (1998) Preventing Crime. What works, what doesn’t, what’s promising.
National Institute of Justice.
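To make the top of the scale concrete, the sketch below (not part of the original lecture) shows how a Scale 5 analysis might compare reconviction rates between a randomly assigned programme group and a comparison group using a two-proportion z-test. The group sizes and reconviction counts are hypothetical.

```python
# Minimal sketch with hypothetical data: a Scale 5 style comparison of
# reconviction rates between a randomly assigned programme group and a
# comparison group, using a two-proportion z-test (Python standard library only).
from statistics import NormalDist

def two_proportion_z_test(reconvicted_t, n_t, reconvicted_c, n_c):
    """Return the difference in reconviction rates, the z statistic and a two-sided p-value."""
    p_t = reconvicted_t / n_t                      # programme group rate
    p_c = reconvicted_c / n_c                      # comparison group rate
    pooled = (reconvicted_t + reconvicted_c) / (n_t + n_c)
    se = (pooled * (1 - pooled) * (1 / n_t + 1 / n_c)) ** 0.5
    z = (p_t - p_c) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_t - p_c, z, p_value

# Hypothetical cohort: 150 people per arm, 2-year reconviction follow-up.
diff, z, p = two_proportion_z_test(reconvicted_t=54, n_t=150, reconvicted_c=69, n_c=150)
print(f"Difference in reconviction rates: {diff:+.1%}, z = {z:.2f}, p = {p:.3f}")
```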
Measuring impact
 Choice of impact measures is often limited by
availability of data or resources to collect primary
data.
 In the UK, re-offending data can be accessed via
the Police National Computer (PNC) or the
Offender Index
 For less tangible outcomes such as family
reconciliation or increased self-confidence the only
option is often expensive face-to-face
interviews/surveys.
Finding a comparator
 A comparator group or area may not be readily
available.
 While a programme or pilot area may be willing to
cooperate with an evaluation an organisation that is
not part of the evaluation but is being asked to
provide data for a comparator area may be hard to
persuade to cooperate.
 A regional or national roll-out may limit the number
of potential comparator areas
Throughputs
 Programmes and projects often struggle to generate sufficient throughputs to allow for robust statistical analysis (see the sample-size sketch below).
 Set-up times are often under-estimated
 The need for inter-agency cooperation is
often under-estimated
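The throughput problem can be illustrated with a standard sample-size calculation for comparing two proportions. The sketch below is illustrative only; the baseline and target reconviction rates are hypothetical.

```python
# Minimal sketch: how many participants per group are needed to detect a given
# change in reconviction rates? Uses the standard two-proportion sample-size
# formula and only the Python standard library. All rates are hypothetical.
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Participants needed per group to detect a change from rate p1 to rate p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)            # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2

# Detecting a fall in reconviction from 50% to 40% needs roughly 385 people per
# group; a pilot that recruits a few dozen participants cannot show this reliably.
print(round(n_per_group(0.50, 0.40)))
```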
Timescales
 Often evaluation timescales preclude
appropriate follow-up periods to identify the
long-term impact of interventions.
 The Home Office recommends that reconviction studies use a 2-year follow-up period.
 By the time a cohort to study has been generated, this will often require an evaluation period of 3 to 4 years.
Systematic Reviews
 Systematic reviews are overviews of the existing
research literature on a topic
 A comprehensive search of print, electronic, and
unpublished sources is made.
 Sources identified are screened to see if they are
relevant.
 The quality of the sources (the strength of the
evidence) is then assessed.
 Sometimes the findings from individual studies are
pooled in a process called meta-analysis
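To illustrate the pooling step, here is a minimal sketch of a fixed-effect, inverse-variance meta-analysis of log odds ratios. The study estimates are hypothetical and this is not the Campbell or Cochrane tooling.

```python
# Minimal sketch: fixed-effect, inverse-variance pooling of effect sizes from
# several hypothetical studies into a single weighted estimate with a 95% CI.
import math

# (log odds ratio, standard error) for each hypothetical study
studies = [(0.25, 0.20), (0.10, 0.15), (0.40, 0.30), (0.05, 0.25)]

weights = [1 / se ** 2 for _, se in studies]        # inverse-variance weights
pooled = sum(w * lor for (lor, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled odds ratio: {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * pooled_se):.2f}"
      f" to {math.exp(pooled + 1.96 * pooled_se):.2f})")
```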
Scared straight: systematic
review
 The systematic review shows that:
 “The program increases the percentage of the treatment
group committing new offences anywhere from 1% to 30%.”
(365) (emphasis added)
Petrosino, A., Turpin-Petrosino, C., and Finckenauer, J. (2000) ‘Well-Meaning Programs Can Have Harmful
Effects! Lessons from Experiments of Programs Such as Scared Straight’ Crime & Delinquency 46: 354
Chris and Dan run a third-year module called ‘What works in social and criminal policy’ that looks in more detail at how policy is made and the role of evaluation in supporting the policy-making process.