Transcript: Developing An Evaluation Plan For TB Control Programs
Developing An Evaluation Plan For TB Control Programs
Division of Tuberculosis Elimination
National Center for HIV, STD, and TB Prevention
Centers for Disease Control and Prevention
Reference: A Guide to Developing an Evaluation Plan
Why Develop an Evaluation Plan?
Provides a cohesive approach to conducting evaluation and using the results
Guides evaluation activities
Explains what, when, how, why, and who
Documents the evaluation process for all stakeholders
Ensures implementation fidelity
Guide to Developing an Evaluation Plan
The document referenced throughout this presentation
Provides a template and instructions to help TB program staff develop an evaluation plan
Explains the steps to evaluation in detail
Completing its sections and tables will result in an evaluation plan
The CDC Program Evaluation Framework
Steps: engage stakeholders; describe the program; focus the evaluation design; gather credible evidence; justify conclusions; ensure use and share lessons learned
Standards: utility, feasibility, propriety, accuracy
Systematic method for evaluation
Based on research and experience
Flexible and adaptable
Promotes a participatory approach
Focuses on using evaluation findings
Sections of an Evaluation Plan
Introduction
Stakeholder Assessment (Step 1: Engage Stakeholders)
Background and Description of the TB Program and Program Logic Model (Step 2: Describe the Program)
Focus of the Evaluation (Step 3: Focus the Evaluation Design)
Gathering Credible Evidence: Data Collection (Step 4: Gather Credible Evidence)
Justifying Conclusions: Analysis and Interpretation (Step 5: Justify Conclusions)
Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination (Step 6: Ensure Use and Share Lessons Learned)
Introduction
An introduction provides background information, identifies the purpose of the evaluation, and provides a roadmap of the plan.
Evaluation Goal
What is the purpose of the evaluation?
Evaluation Team
Who is your evaluation coordinator?
Who are the members of your evaluation team?
Reference: Table 1 in the Evaluation Plan Guide
Stakeholder Assessment
Stakeholders are individuals with vested interests in the success of the TB program. Involving stakeholders increases the credibility of the evaluation and ensures that findings are used as intended.
Who are the stakeholders in your TB program?
What are their interests in the evaluation?
What role do they play in the evaluation?
How do you plan to engage the stakeholders?
Reference: Table 2 in the Evaluation Plan Guide
Background and Description of the TB Program
The program description ensures that stakeholders have a shared understanding of the program and identifies any unfounded assumptions and gaps.
Need
What problem does your program address?
What are the causes and consequences of the problem?
What is the magnitude of the problem?
What changes or trends impact the problem?
Context
What are the environmental factors that affect your program?
Target Population
Does your program target the TB concerns of one population?
Program Objectives
What objectives have been set for your program?
Stage of Development
Is this a new initiative, or is it well established?
Resources
What resources are available to conduct the program activities?
Activities
What are program staff doing to accomplish program objectives?
Outputs
What are the direct and immediate results of program activities (materials produced, services delivered, etc.)?
Outcomes
What are the intended effects of the program activities?
Reference: Table 3 in the Evaluation Plan Guide
Program Logic Model
A logic model is a graphic depiction of the program description. Arrows describe the links between resources, activities, outputs, and outcomes.
A logic model:
Provides a sense of the scope of your program
Ensures that systematic decisions are made about what is to be measured
Helps to identify and organize indicators
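The stage-to-stage links of a logic model can be made concrete as a small data structure. The sketch below is purely illustrative; the element names are hypothetical examples, not taken from the guide:

```python
# Illustrative sketch of a logic model as ordered stages, each holding
# example elements. All element names here are hypothetical.
logic_model = {
    "resources": ["trained staff", "funding", "clinic space"],
    "activities": ["TB screening", "patient education"],
    "outputs": ["patients screened", "education sessions delivered"],
    "outcomes": ["increased treatment completion"],
}

# The arrows of the diagram: each stage feeds into the next.
stages = list(logic_model)
links = [(stages[i], stages[i + 1]) for i in range(len(stages) - 1)]

for src, dst in links:
    print(f"{src} -> {dst}")
```

Walking the stage list in order reproduces the arrows of the diagram (resources to activities, activities to outputs, outputs to outcomes), which is the "systematic decisions about what is to be measured" idea in miniature.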
Contact Investigation Logic Model
Goal: Prevent TB among contacts to cases (by finding and testing contacts for TB and LTBI, and then treating infected contacts to completion).
Inputs:
- Adequate infrastructure
- Qualified, trained, and motivated staff
- Community and congregate-setting partnerships
- Policies, procedures, and guidelines
- Ongoing data collection, monitoring, and reporting systems
- Adequate physical, diagnostic, and treatment resources
- Linkages between jurisdictions
- Adequate data collection tools
- Partnerships with private providers
- Comprehensive interview tool
- Staff trained in interview techniques
- Legal mandate to collect contact information from congregate settings
Activities:
- Interview/reinterview cases: build rapport, provide education, and obtain information about the source case and contacts
- Locate and evaluate contacts: follow-up, education, examination and testing
- Offer treatment
- Treat contacts through case management (DOT/DOPT/incentives)
- Reporting
- Monitor data collection, data management, data analysis, and data dissemination
- Conduct periodic review of cases/contacts and progress toward contact treatment goals
Short-term Outcomes:
- Cases identify contacts
- Contacts educated
- Contacts evaluated
- Contacts followed up
- Contacts start treatment
- Evidence-based decisions about continuation or termination of the contact investigation
Intermediate Outcomes:
- Contacts complete appropriate treatment for active TB or LTBI
- Improved approaches for contact investigation
- Active TB cured in contacts
- TB prevented in contacts with LTBI
Long-term Outcomes:
- Reduced incidence and prevalence of TB
- TB eliminated
Sample Logic Model
Project Description of TB Support Program
Resources:
- TB staff
- Funding
- Community-based organizations serving the Salvadoran community
- Salvadoran community
Activities (initial and subsequent):
- Hiring LHAs
- LHA training
- Community outreach
- Education
- TB screening/testing
- Developing treatment plans
- DOT visits
Outputs:
- LHAs hired and trained
- TB outreach and education conducted
- Counseling and support provided
- Testing done and referrals made
- Patients offered treatment
- Treatment plans
- DOT administered
Short-term Outcomes:
- Spanish-speaking, culturally competent services provided for the Salvadoran community
- Increased TB knowledge in the Salvadoran community
- Trust built between health care providers and the Salvadoran community
- Early TB and LTBI detection and interventions
- Patients accept treatment for TB and LTBI
Intermediate Outcomes:
- Increased utilization of TB services by the Salvadoran community
- Patients adhere to and complete treatment (reducing hospital admissions for TB among Salvadorans)
- Increased completion-of-therapy rate
Long-term Outcomes:
- Reduced TB transmission
- TB eliminated in the Salvadoran community
Focus of the Evaluation
Since you cannot feasibly evaluate everything, you must focus the evaluation by prioritizing and selecting evaluation questions.
Stakeholder Needs
Who will use the evaluation findings?
How will the findings be used?
What do stakeholders need to learn or know from the evaluation?
Process Evaluation
What resources were required?
What program activities were accomplished?
Were they implemented as planned?
Outcome Evaluation
Is the program producing the intended outcomes?
Is there progress toward program objectives and goals?
Evaluation Questions
Based on the needs of your stakeholders
Address both process and outcome
Assess Your Questions
Are the data feasible to collect?
Will they provide accurate results?
Key Issues in Evaluation Design
Will you have a comparison or control group?
When will you collect data?
Will the data be collected retrospectively or prospectively?
What type of data do you need?
What data do you have already?
Other Design Considerations
Standards for "good" evaluation
Timeliness
Stage of development
Data needed
Strengthen Your Design
Mix methods whenever possible
Use repeated measures
Triangulate
Gathering Credible Evidence: Data Collection
Identify indicators, standards, and data sources to address the evaluation questions.
Indicators
Visible, measurable signs of program performance
Reflect program objectives, the logic model, and the evaluation questions
Program Benchmarks and Targets
Reasonable expectations of program performance
Benchmarks against which to measure performance
Reference: Table 4 in your Evaluation Plan Guide
Linking evaluation questions, indicators, and program benchmarks. Example from the Guide – Table 4.
Evaluation Question: Have Spanish-speaking persons been treated appropriately for LTBI or TB?
- Indicator: Number of Spanish-speaking persons treated by the clinic for TB and LTBI between January and June.
  Benchmark: Increase in the number of Spanish-speaking patients.
- Indicator: Number of times clinical treatment standards are met for Spanish-speaking patients.
  Benchmark: Clinical standards are met 100% of the time.
- Indicator: Percent of time that signs and forms are available in Spanish and written for persons with low-literacy skills.
  Benchmark: Patient education signs and forms in Spanish are available 100% of the time; the literacy level of materials is at a 3rd-grade reading level.
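As a rough sketch of how such indicator/benchmark pairs might be tracked, the snippet below compares measured values against targets. The field names and measured percentages are invented for illustration, not taken from the guide:

```python
# Hypothetical sketch: comparing measured indicator values with program
# benchmarks. The measured percentages are invented example data.
indicators = [
    {"name": "Clinical treatment standards met (Spanish-speaking patients)",
     "measured_pct": 92.0, "benchmark_pct": 100.0},
    {"name": "Signs and forms available in Spanish",
     "measured_pct": 100.0, "benchmark_pct": 100.0},
]

for ind in indicators:
    status = ("meets benchmark"
              if ind["measured_pct"] >= ind["benchmark_pct"]
              else "below benchmark")
    print(f"{ind['name']}: {ind['measured_pct']:.0f}% -- {status}")
```

A simple pass/fail comparison like this is enough to flag which indicators warrant discussion with stakeholders during interpretation.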
Data Collection
Where are the data?
What methods will be used to collect the data?
How often will the data be collected?
Who will collect the data?
Tools for Data Collection
Collect only the information you need
Make tools easy to administer and use
Reference: Table 5 in your Evaluation Plan Guide
Linking indicators and data sources and specifying your data collection plan. Example from the Guide – Table 5 (Data Collection Plan), whose columns are: Indicator; Data Sources; Collection (Who, When, How).
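As a rough illustration (not from the guide), each row of such a plan can be held as a record and checked for completeness, so that no indicator is left without a who, when, and how. The field values below are invented:

```python
# Hypothetical sketch: rows of a Table 5-style data collection plan,
# checked so that every indicator specifies its source, who, when, and how.
plan = [
    {"indicator": "Number of patients completing LTBI treatment",
     "data_sources": "Clinic treatment records",
     "who": "Evaluation coordinator",
     "when": "Quarterly",
     "how": "Chart abstraction form"},
]

required = {"indicator", "data_sources", "who", "when", "how"}
incomplete = [row["indicator"] for row in plan if required - row.keys()]
print("Incomplete plan rows:", incomplete)
```

An empty `incomplete` list means every planned indicator has a fully specified collection method before fieldwork begins.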
Human Subjects Considerations
Evaluation Timeline
Ensures that all stakeholders are aware of what activities are occurring at any time
Helps to determine if your evaluation resources will be strained by too many activities happening at once
Data Management and Storage
Ensures confidentiality and data quality
Reference: Table 6 in your Evaluation Plan Guide
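The timeline check described above, spotting periods when too many activities overlap and resources may be strained, can be sketched as a small count per month. The activities and dates below are invented examples:

```python
# Hypothetical sketch: counting concurrent evaluation activities per month
# to flag periods that may strain resources. All entries are invented.
from collections import Counter

# (activity, start_month, end_month), months numbered 1-12
activities = [
    ("Chart abstraction", 1, 3),
    ("Staff interviews", 2, 4),
    ("Patient survey", 2, 5),
    ("Data analysis", 5, 6),
]

load = Counter()
for _, start, end in activities:
    for month in range(start, end + 1):
        load[month] += 1

busy_months = sorted(m for m, n in load.items() if n >= 3)
print("Months with 3+ concurrent activities:", busy_months)
```

Months that exceed the chosen threshold are candidates for rescheduling before the evaluation begins.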
Justifying Conclusions: Analysis and Interpretation
Once the data are collected, analysis and interpretation will help you understand what the findings mean for your program.
Analysis
What analysis techniques will you use for each data collection method?
Who is responsible for the analysis?
Interpretation
What conclusions will you draw from your findings?
How will you involve stakeholders?
Reference: Table 7 in your Evaluation Plan Guide
Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination
A plan for dissemination and use of the evaluation findings will keep evaluation reports from "sitting on the shelf."
Dissemination
What medium will you use to disseminate the findings?
Who is responsible for dissemination?
Use
How, where, and when will the findings be used?
Who will act on the findings?
Reference: Table 8 in your Evaluation Plan Guide
Tips for Evaluation Planning
Start small: focus on one initiative or program component, and limit the number of evaluation questions
Use what you already know about the program
Consider existing sources of data
Be realistic in your timeline and your assessment of resources
Use the template and tables provided in the guide, adapting them as needed
Seek help with your evaluation
Evaluation Resources
Some Web-Based Resources
Centers for Disease Control and Prevention: http://www.cdc.gov/eval/
W.K. Kellogg Foundation: http://www.wkkf.org/Publications/evalhdbk/
University of Wisconsin Extension: http://www.uwex.edu/ces/pdante/evaluat.htm/
Selected Publications
Connell JP, Kubisch AC, Schorr LB, Weiss CH. New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute, 1995.
Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications, 1997.
Rossi PH, Freeman HE, Lipsey MW. Evaluation: A Systematic Approach. Newbury Park, CA: Sage Publications, 1999.
Taylor-Powell E, Steele S, Douglas M. Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension, 1996.