A Systems Approach to Planning Evaluations

Cornell University
Cornell Office for Research on Evaluation (CORE)
A Systems Approach to Planning Evaluations
The SEP and the Netway
William M. Trochim
Cornell University
Presentation to the Centers for Disease Control and Prevention
May 27, 2014
Overview
• The importance of program models
• Developing high-quality program models
– Need a “protocol” – step-by-step
– Need (web-based) tools
• The Systems Evaluation Protocol
• The Netway Cyberinfrastructure
• Examples: CDC’s Logic Models for FOAs
• How good models lead to good evaluation
The Importance of Program Models
A program model provides a “high-level” view.
It helps multiple stakeholders see multiple perspectives.
Importance of Program Models: Link Planning, Management and Evaluation
Developing high-quality program models
• Need a step-by-step guide, a “protocol”: the Systems Evaluation Protocol (SEP)
• Need practical tools, preferably web-based: the Netway Cyberinfrastructure
The Systems Evaluation Protocol (SEP)
https://core.human.cornell.edu/
Developing the Program Model
1. Stakeholder Analysis: Determine all of the potential people and/or organizations that may have a stake in the program.
2. Program Review: Gain a firm understanding of the components and characteristics of the program, including how it operates and whom it serves.
3. Program Boundary Analysis: Determine the conceptual limits of the program; what is “in” and what is “out” when defining the program.
4. Lifecycle Analysis: Determine the maturity of the program and how its level of evolution influences evaluation capacity and method choices.
5. Logic Model: Generate an initial logic model including the assumptions, context, inputs, activities, outputs, and short-, medium-, and long-term outcomes.
6. Pathway Model: Use the logic model as a basis for articulating clear and direct linkages between program activities and outcomes.
7. Evaluation Scope: Determine the specific components of the pathway model that will be the focus in the upcoming evaluation cycle.
8. Program-System Links: Introduce tools and strategies for finding similar programs and shared outcomes; develop research support by drawing on the literature and on resources in the systems within which the program exists.
9. Reflection and Synthesis: Finalize the logic and pathway models, including reviewing the program logic model, assessing the model from the perspectives of key stakeholders, reviewing the Program Boundary Analysis, reviewing the Program and Evaluation Lifecycle Analyses, and revising the models as needed. This step also involves integrating relevant research literature as it relates to the causal pathways articulated in the Pathway Model.
Developing the Program Model: Outputs
• Map of Stakeholders
• Program Description
• Lifecycle Charts
• Program Logic Model
• Program Pathway Model
• Supporting and Background Literature
Stakeholder Analysis
[Figure: “4-H Dairy Program: Stakeholder Map,” showing stakeholders including 4-H members, parents, volunteers, teachers, youth, CCE staff, the CCE Board of Directors, local school districts, local agricultural businesses, breed associations, funders, taxpayers, media, other youth programs, surrounding county youth, FFA, JCADCA, NYS 4-H, the NYS Jr. Holstein Association, SUNY Morrisville and Cobleskill, Cornell University, the Jefferson County Fair Board, the State Fair, Jefferson County dairy producers, Jefferson County legislators, and the national dairy industry.]
Logic Models
Lots of LM Formats out there…
http://www.csrees.usda.gov/business/reporting/part/gen_logic_model.pdf
LM Formats: United Way
LM Formats: University of Wisconsin Extension
LM Formats: USDA Nat’l Institute for Food and Ag
http://www.csrees.usda.gov/business/reporting/part/gen_logic_model.pdf
How do we build a Logic Model?
• The parts of a Logic Model
– Assumptions
– Context
– Inputs
– Activities
– Outputs
– Short-term Outcomes
– Mid-term Outcomes
– Long-term Outcomes
• Need a guide to putting these together (see the sketch after this list)
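To make these parts concrete, here is a minimal sketch of a logic model as a simple data structure. This is illustrative only, not the Netway’s internal representation; the field names and example content are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Hypothetical container for the parts of a logic model (illustration only)."""
    assumptions: List[str] = field(default_factory=list)
    context: List[str] = field(default_factory=list)
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    short_term_outcomes: List[str] = field(default_factory=list)
    mid_term_outcomes: List[str] = field(default_factory=list)
    long_term_outcomes: List[str] = field(default_factory=list)

# Hypothetical example loosely modeled on a 4-H dairy program:
model = LogicModel(
    inputs=["Volunteer leaders", "County funding"],
    activities=["Dairy-handling workshops"],
    outputs=["Workshops delivered", "Youth attending"],
    short_term_outcomes=["Youth gain dairy-handling knowledge"],
    mid_term_outcomes=["Youth apply skills at the county fair"],
    long_term_outcomes=["Youth pursue agricultural careers"],
)
print(model.activities)
```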
CORE’s “Logic Model Template”
CDC Examples
• Epidemiology and Laboratory Capacity (ELC) for Infectious Diseases Logic Model
• Comprehensive Asthma Control Logic Model
• National Tobacco Quitline Logic Model
The Netway
• Web-based software developed by CORE
• Facilitates program modeling and evaluation planning
• Increases shared knowledge and identifies connections among programs
Exercise 1: Critiquing a Logic Model
• Entering a Logic Model in the Netway
• Review the example CDC Logic Model
• Look at the checklist in row 3 of the Guideline
• Pair up
• Discuss how well each of the checklist elements is met in the logic model
• Group discussion
From Logic Models to Pathway Models
• The problem with columnar logic models
– Don’t provide enough detail
– Don’t show cause-and-effect
– Don’t show pathways (throughlines)
– Don’t highlight “key” nodes
– Don’t effectively tell the story
• How do we construct a pathway model?
“Mining the Model”
Exercise 2: Drawing Pathways
• Use a CDC Logic Model
• Pair up
• Identify one key pathway, the one you think is most critical
– Activity → ST Outcome → MT Outcome → LT Outcome
• Draw the pathway on the model
• If time, identify one or two additional pathways
• Enter pathways into Netway
Key Features of a Pathway Model
Key Nodes
Throughlines
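Read as a directed graph, both features can be surfaced mechanically: a throughline can be treated as a complete path from an activity to a long-term outcome, and a key node as one that many throughlines pass through. The sketch below is an illustration under those assumptions; the edge encoding and node names are hypothetical, not how the Netway stores models.

```python
from collections import Counter

# Hypothetical pathway model: directed edges from activities through
# short- and mid-term outcomes to long-term outcomes.
edges = {
    "A1: Workshops": ["ST1: Knowledge"],
    "A2: Mentoring": ["ST1: Knowledge", "ST2: Engagement"],
    "ST1: Knowledge": ["MT1: Skill use"],
    "ST2: Engagement": ["MT1: Skill use"],
    "MT1: Skill use": ["LT1: Career interest"],
}
activities = ["A1: Workshops", "A2: Mentoring"]
long_term = {"LT1: Career interest"}

def throughlines(node, path=()):
    """Yield every complete path from `node` to a long-term outcome."""
    path = path + (node,)
    if node in long_term:
        yield path
    for successor in edges.get(node, []):
        yield from throughlines(successor, path)

all_paths = [p for a in activities for p in throughlines(a)]
print(f"{len(all_paths)} throughlines found")

# Key nodes: the nodes that the most throughlines pass through.
counts = Counter(node for path in all_paths for node in path)
for node, n in counts.most_common(3):
    print(f"{node}: on {n} throughline(s)")
```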
Exercise 3: Identifying Key Pathway Features
• Look at the logic model(s) and pathway model(s)
• Identify the two or three likely key nodes
• Identify the two or three likely key pathways
• What do these suggest about the program?
A More Detailed Example
[Pathway model figure with columns: Activities, Short-Term Outcomes, Mid-Term Outcomes, Long-Term Outcomes]
“Building Evaluation Capacity in CCE System and Programs, 2012-15”
Four Key Objectives of the Project … and why they are so critical
Key “milestones” in the project
How do we do it?
[Pathway model figure]
The big picture for Evaluation: Partnerships based around the CCE Plans of Work
[Pathway model figure]
Contribution of “Lead Evaluators” and enhanced Evaluation Literacy across CCE
[Pathway model figure]
Essential additional component: Comprehensive Evaluation Policy
[Pathway model figure]
“Building Evaluation Capacity in CCE System and Programs, 2012-15”
What else do we do with Pathway Models?
• Improve quality of the Pathway Model (checks sketched below)
– Incomplete pathways: some activities and outcomes omitted
– Leaps: jumps from activities directly to MT or LT outcomes
– Orphans:
• outcomes that no activities lead to
• activities that don’t lead anywhere
• Link to the evidence base:
– Literature for nodes (examples or measures)
– Literature for paths
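As an illustration of how these quality checks could be automated, the sketch below flags orphans and leaps in a small hypothetical model. The level tags and node names are assumptions for the example, not the Netway’s actual checks.

```python
# Hypothetical pathway model: each node tagged with its column,
# plus directed activity-to-outcome edges.
levels = {
    "Workshops": "activity",
    "Newsletter": "activity",          # orphan: leads nowhere
    "Knowledge": "short_term",
    "Skill use": "mid_term",
    "Career interest": "long_term",
    "Community ties": "long_term",     # orphan: nothing leads to it
}
edges = [
    ("Workshops", "Knowledge"),
    ("Knowledge", "Skill use"),
    ("Skill use", "Career interest"),
    ("Workshops", "Career interest"),  # leap: activity jumps straight to LT
]

sources = {s for s, _ in edges}
targets = {t for _, t in edges}

# Orphan activities have no outgoing edge; orphan outcomes have no incoming edge.
orphan_activities = [n for n, lvl in levels.items()
                     if lvl == "activity" and n not in sources]
orphan_outcomes = [n for n, lvl in levels.items()
                   if lvl != "activity" and n not in targets]

# Leaps: edges that jump from an activity directly to a MT or LT outcome.
leaps = [(s, t) for s, t in edges
         if levels[s] == "activity" and levels[t] in ("mid_term", "long_term")]

print("Orphan activities:", orphan_activities)  # ['Newsletter']
print("Orphan outcomes:", orphan_outcomes)      # ['Community ties']
print("Leaps:", leaps)                          # [('Workshops', 'Career interest')]
```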
Wrapping up Modeling Stage
• What I didn’t cover:
– Program boundary analysis
– Lifecycle analysis
– Evaluation scope
From Program Model to Evaluation
A “systems approach” to evaluation
Our systems approach draws on systems thinking,
complexity theory, evolutionary theory, …
Key concepts: static vs. dynamic processes; the whole is more than the sum of its parts; symbiosis; phylogeny; ontogeny.
Implications for evaluation
• Programs are viewed (and modeled) as
– parts of larger systems
– dynamic and evolving
– related to other programs in the present
– connected to past and future programs
– being perceived differently by different stakeholders and systems
• Evaluation plan development takes this into account and should help programs and systems evolve
What leads to a high-quality Evaluation Plan?
1. Consistency with a high-quality program model
2. Fitness of evaluation plan elements to the program and program context
3. Internal alignment of the evaluation plan elements
4. Holistic coherence
1. Consistency with a high-quality program model
• A “high-quality model” …
– is grounded in knowledge of the program
– incorporates perspectives of multiple stakeholders
– shows causal pathways (program logic)
– reflects careful thought about program “boundaries”
– includes program assumptions and key elements of context
– is connected to the program evidence base (relevant research)
• “Consistency with …” means
– evaluation questions can be located in terms of model elements
– the evaluation “scope” makes sense
2. Fitness of evaluation questions/plan to the program and program context
• Evaluation questions are “mined” from the model
• Evaluation questions are appropriate for the program’s maturity and stability, existing state of knowledge, and program needs
• Evaluation focus, methods, and tools meet the needs of key stakeholders
• The evaluation plan makes efficient and strategic use of program and evaluation resources
3. Internal alignment of the evaluation plan elements
• Measures fit the constructs
• Each measure is the most strategic option among those that fit
• The design is appropriate for the lifecycle stage
• The design can support the claims implied in the purpose statement
• The sampling and analysis plans can generate the evidence needed
4. Holistic coherence
• The most difficult element
• Evaluation planning requires myriad decisions about multi-faceted tradeoffs
• Making these decisions well requires a holistic comprehension of the program and of the environment and systems that embed it
• These decisions may be invisible in the written plan, so some of the resulting quality will be ascertainable and some will not
Positive Feedback
“There was a huge improvement in the ability to design programs. It became much easier to more clearly and more quickly articulate impacts.”

“[This process] required a different kind of thinking. Not the ‘seat of the pants’ kind of way of working at things that we have done … but really logically thinking through all the steps.”

“It’s a lot of work, and you have to be prepared to set aside time in your busy life. But I think it was really valuable.”

“It’s been very good for clarifying what we want to accomplish for funding purposes.”

“… now we can really identify and comment on the end products of our evaluations in the form of hard data that we can bring to grant meetings and include in proposals.”

“It makes you think more critically about what you’re doing.”