
Evaluation – the whys, the whats and the hows
2014
Dr Basma Ellahi (PI)
Cat Crum (PhD Student)
What is evaluation?
• Evaluation is a process that critically examines a
program. It involves collecting and analysing
information about a program’s activities,
characteristics, and outcomes. Its purpose is to
make judgments about a program, to improve its
effectiveness, and/or to inform programming
decisions (Patton, 1987).
Why evaluate?
• Answer the objectives of the
scheme/project
• Demonstrate change due to
the scheme/project
• Determine effectiveness, i.e. causality or association
• Inform improvements
‘Dull but important’
When compared with the creative and exciting
process of conceiving and initiating a project,
evaluation can often be forgotten and
perceived as dull! However, some form of
evaluation or formative feedback is the only
thing that will show the effectiveness of the
project.
• Evaluations fall into one of two broad categories:
formative and summative.
• Formative evaluations are conducted during
program development and implementation and
are useful if you want direction on how to best
achieve your goals or improve your program.
• Summative evaluations should be completed
once your programs are well established and will
tell you to what extent the program is achieving
its goals.
Type of Evaluation and Purpose
Formative
1. Needs Assessment
Determines who needs the program, how great the need is, and what
can be done to best meet the need. A needs assessment can help
determine what audiences are not currently served by programs and
provide insight into what characteristics new programs should have
to meet these audiences’ needs.
2. Process or Implementation Evaluation
Examines the process of implementing the program and determines
whether the program is operating as planned. Can be done
continuously or as a one-time assessment. Results are used to
improve the program. A process evaluation of a program may focus
on the number and type of participants reached and/or determining
how satisfied these individuals are with the program.
Summative
1. Outcome Evaluation
Investigates to what extent the program is achieving its outcomes.
These outcomes are the short-term and medium-term changes in
program participants that result directly from the program. For
example, outcome evaluations may examine improvements in
participants’ knowledge, skills, attitudes, intentions, or behaviours.
2. Impact Evaluation
Determines any broader, longer-term changes that have occurred as
a result of the program. These impacts are the net effects, typically
on the entire school, community, organisation, society, or
environment.
Adapted from Norland (2004), Pancer and Westhues (1989) and Rossi et al. (2004).
Aims of evaluation
• How has the pilot been received by the different
participants in each of the health communities?
• What have been the successes and the issues –
lessons learned?
• How successful has the pilot been in moving the
health communities towards working on the
basis of the 5 key principles?
• How successful has the pilot been in helping the
health communities meet their objectives?
Evaluation Objectives of the Malnutrition
Prevention Pilot Programme
• Confirm outcomes and impact indicators
• Identify unintended consequences of
implementation
• Describe inputs and activities
• Generate an evidence base
• Appraise progress
• Identify limiting factors
• Use self-generated data
Methodology
• Impact Evaluation Framework
• Theory of change
• Logic model – linkages between inputs, activities,
outputs and outcomes (a minimal sketch follows)
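A logic model can be made concrete as a simple data structure. The sketch below (Python; the entries are hypothetical examples invented for illustration, not the pilot's actual model) shows how inputs, activities, outputs and outcomes can be linked so that each assumed causal step is explicit and testable.

```python
# A minimal logic-model sketch. Entries are hypothetical examples
# invented for illustration; they are not the pilot's actual model.
logic_model = {
    "inputs":     ["project funding", "dietitian time", "community venues"],
    "activities": ["cook-and-eat sessions", "MUST screening training"],
    "outputs":    ["sessions delivered", "staff and volunteers trained"],
    "outcomes":   ["improved cooking skills",
                   "earlier identification of malnutrition risk"],
}

# Linkages make the assumed causal chain explicit: each pair reads
# "if we do this activity, we expect this outcome". The evaluation
# then tests whether each assumption holds in practice.
linkages = [
    ("cook-and-eat sessions", "improved cooking skills"),
    ("MUST screening training", "earlier identification of malnutrition risk"),
]

for activity, outcome in linkages:
    print(f"{activity} -> {outcome}")
```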
Evaluation Plan
• Why?
• How?
• What?
• When?
• Who?
• Where?
Complex Evaluation
• Range of activities:
- One-off information and/or taster sessions
- Promotional stands
- Promotional campaigns
- Health days
- Demonstrations
- Cook-and-eat sessions
- Community enterprise
- Training courses
• Combination of evaluation methods:
qualitative and quantitative
The tools of the evaluator
• Quantitative (a worked sketch follows this list)
• Monitoring information
• Questionnaire/survey
• Experimental evaluation – RCT, case control, cohort etc.
• Qualitative
• Observation
• Interviews
• Focus groups
• Case study
• Documentation
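To ground the quantitative tools, here is a minimal sketch of the most common pattern: comparing questionnaire scores before and after a programme. All scores below are hypothetical, invented for illustration.

```python
# Minimal sketch: summarising before/after questionnaire scores,
# the most common quantitative evaluation pattern.
pre = [3, 4, 2, 5, 3, 4]    # e.g. knowledge scores (0-10) at baseline
post = [6, 7, 5, 8, 6, 7]   # the same participants after the programme

mean_pre = sum(pre) / len(pre)
mean_post = sum(post) / len(post)

print(f"mean before: {mean_pre:.1f}")
print(f"mean after:  {mean_post:.1f}")
print(f"mean change: {mean_post - mean_pre:+.1f}")
```

Without a comparison group (as in an RCT or case-control design), a change like this demonstrates association rather than causality, which is why the experimental designs above matter.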
Methods
• Semi-structured interviews
(telephone)
• Action learning sessions
• Audit – baselines and MUST
(Malnutrition Universal Screening Tool)
scores; a scoring sketch follows
• Sampling strategy –
purposive sampling
• Pluralistic model
• Ethics
• Frame of reference
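For readers unfamiliar with the tool, the sketch below implements the published ‘MUST’ scoring steps (a BMI score, an unplanned weight-loss score, and an acute disease effect score, summed into a risk category). The thresholds follow BAPEN's standard tool, but this is an illustration for evaluators auditing baseline data, not clinical guidance.

```python
# Illustrative implementation of the 'MUST' scoring steps (BAPEN).
# For evaluation-audit illustration only, not clinical guidance.

def must_score(bmi: float, weight_loss_pct: float,
               acutely_ill_no_intake: bool) -> int:
    """Sum the three 'MUST' component scores into an overall score."""
    # Step 1: BMI score (kg/m^2)
    if bmi < 18.5:
        bmi_score = 2
    elif bmi <= 20.0:
        bmi_score = 1
    else:
        bmi_score = 0

    # Step 2: unplanned weight loss in the past 3-6 months (% body weight)
    if weight_loss_pct > 10:
        loss_score = 2
    elif weight_loss_pct >= 5:
        loss_score = 1
    else:
        loss_score = 0

    # Step 3: acute disease effect - score 2 if the person is acutely ill
    # and there has been, or is likely to be, no nutritional intake for
    # more than 5 days
    acute_score = 2 if acutely_ill_no_intake else 0

    return bmi_score + loss_score + acute_score

def must_risk(score: int) -> str:
    """Step 5: map the overall score to a risk category."""
    if score == 0:
        return "low risk"
    if score == 1:
        return "medium risk"
    return "high risk"

# Example: BMI 19.0 (score 1) + 6% weight loss (score 1) = 2 -> high risk
print(must_risk(must_score(bmi=19.0, weight_loss_pct=6.0,
                           acutely_ill_no_intake=False)))
```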
What does good look like?
• Good evaluation is tailored to your
program and builds on existing evaluation
knowledge and resources.
• Good evaluation is inclusive.
• Good evaluation is honest.
• Good evaluation is replicable and its
methods are as rigorous as circumstances
allow.
Common dilemmas
• Intellectual property rights / Data
protection and data sharing
• Be conscious of multiple roles
• Follow informed-consent rules
• Respect confidentiality and privacy
• Ethics
• Complexity of data collection sites
How do I make evaluation an integral part of my
program?
• Making evaluation an integral part of your
program means evaluation is a part of
everything you do. You design your
program with evaluation in mind, collect
data on an on-going basis, and use these
data to continuously improve your
program.
To build and support an evaluation
system:
• Couple evaluation with strategic planning.
• Revisit and update your evaluation plan
and logic model to make sure you are on
track.
• Build an evaluation culture.
What are the benefits?
• Better understand your target audiences' needs and how
to meet these needs
• Design objectives that are more achievable and
measurable
• Monitor progress toward objectives more effectively and
efficiently
• Learn more from evaluation
• Increase your program's productivity and effectiveness
10 reasons to evaluate your
project
1. So you know whether it’s working
2. So you can be adaptable
3. To know how things are working
4. So you’re aware of unintended
outcomes
5. To be able to better communicate the
value of your work
10 reasons to evaluate your
project cont.
6. To focus your work
7. To help look after the people you work
with
8. Build organisational resilience
9. Know why things are working
10. Life is complicated
Make evaluation part of
your program; don’t tack
it on at the end!
Resources
• The Magenta Book –
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/220542/magenta_book_combined.pdf
• The Magenta Book: guidance notes for
policy evaluation and analysis –
http://www.civilservice.gov.uk/wp-content/uploads/2011/09/the_complete_magenta_book_2007_edition2.pdf
Contact details
Centre for Ageing Studies
Faculty of Health and Social Care
University of Chester
Riverside Campus
Chester
Dr Basma Ellahi – Reader in Food and Nutrition
Email: [email protected]
Professor Paul Kingston – Director of The Centre for Ageing Studies, Professor of Mental Health
and Ageing.
Email: [email protected]
Cat Crum (PhD Student – sponsored by Age UK South Staffordshire)
Email: [email protected]