Regional Outreach Training - University of Wisconsin


Improving Local Evaluation through
Training and Technical Assistance:
Wisconsin’s Strategy
Mary D. Michaud, MPP; Ellen Taylor-Powell, PhD; Bonita Westover, MSPH
University of Wisconsin-Extension
Acknowledgements
The authors wish to thank staff and members of the following organizations, who make Wisconsin's strategy for local program development and evaluation possible:
Members of Wisconsin's 77 Tobacco-Free Coalitions
Wisconsin Tobacco Control Board
State of WI Department of Public Health
UW-Comprehensive Cancer Center
UW-Center for Health Policy and Program Evaluation
UW-Center for Tobacco Research and Intervention
UW-Cooperative Extension
What we will cover
- Overview of our Wisconsin initiative
- Typical questions we receive
- Real examples of incorporating evaluation into coalition activities
- Using a logic model to guide long-range planning as a prerequisite for useful evaluation
Background
Wisconsin Tobacco Control Board
- Comprehensive program
- 5-year goals
- Commitment to evaluation
- Monitoring and Evaluation Program (MEP)
MEP
- Monitoring: UW-Comprehensive Cancer Center
- Statewide program evaluation: UW-Center for Health Policy and Program Evaluation
- Local evaluation: UW-Extension
Local Program Evaluation
GOAL: Build capacity in program development and evaluation, enabling local coalitions to effectively design, implement, and assess tobacco control programs.
- What are you doing?
- What difference is it making for reducing tobacco use?
- How do you know?
How we define evaluation capacity
Evaluation capacity is having the resources and ability to engage in evaluation that leads to learning, program improvement, and enhanced accountability.
Prerequisites: committed leadership; technical and financial resources; an attitude that values evaluation.
How we do this: Operating principles
Empowerment/participatory approach
- Community members can learn and use planning and evaluation concepts, techniques, and findings to evaluate themselves and their programs to improve practice.
- Coalitions conduct their own evaluation; our professional role is one of trainer, consultant, facilitator, and coach.
- A local advisory group of coalition members will help provide direction and feedback.
Operating principles…
Evaluation value
- Important learning occurs during the process of 'doing' evaluation; it affects those involved and leads to more effective programs and enhanced outcomes.
- Evaluation is more than measurement, findings, and external reporting.
- Value lies in learning and continuous improvement.
Operating principles…
Practical approach
- Approaches and methods will be practical, innovative, and appropriate for cultural and low-resource contexts.
- Participatory adult education principles will be applied.
Operating principles…
Mixed approach
- There are no "cookie cutter" approaches or answers.
- The heterogeneity of coalitions and local contexts demands mixed approaches and mixed methods.
- Innovation and creativity are key.
Operating principles…
Research base
- We will use research and best practices in program planning and evaluation.
- We hold ourselves to the same standards of accountability in learning and use of evaluation.
- We will apply the evaluation standards: utility, feasibility, propriety, accuracy.
Our logic model
Inputs:
- UWEX staff
- Coalitions (facilitator, members)
- Grant $$
- Research
- Local DH officer
- Evaluation Advisory Group
- Partners: DPH; CTRI; WTCB; MEP; Smokefree
Activities:
- Assess needs and assets
- Develop tobacco-specific planning and evaluation materials
- Provide training and technical assistance
- Facilitate cross-site sharing
- Work with partners to create an environment that understands and values evaluation
Short-term outcomes:
- Increased valuing of evaluation
- Increased involvement in planning and evaluation
- Increased resources committed to planning and evaluation
- Increased knowledge and ability to collect and use data
- Increased confidence and motivation to engage in evaluation
Medium-term outcomes:
- Increased number of coalitions that demonstrate actions of effective planning and evaluation
- Integration of evaluation into coalition operations
- Increased local evaluation capacity
Long-term outcomes:
- More effective programs
- Improved outcomes (WTCB goals)
Our structure and process
Structure
- Regional collaboration
- Statewide coordinator
- 5 regional specialists
- Local advisory group
Process
- Training: small to large group, face-to-face, distance
- TA: customized, individualized
- Resource development and distribution
- Partnering
Regional Collaborative Model
[Map: Wisconsin regions]
Content of our T and TA
- Demystify evaluation
- Logic models
- Long-range planning: focus on WTCB goals; integrate evaluation
- Stakeholder engagement
- Writing outcomes (SMART objectives)
- Evaluation planning: process and outcomes
- Components of evaluation: focus, data collection, analysis and interpretation, use
www.uwex.edu/ces/pdande
[Diagram: UW-Extension logic model]
Inputs (what we invest):
- Staff, volunteers, time, money, research base, materials, equipment, technology, partners
Outputs – activities (what we do):
- Conduct workshops and meetings; deliver services; develop products, curriculum, and resources; train; provide counseling; assess; facilitate; partner; work with media
Outputs – participation (whom we reach):
- Participants, clients, agencies, decision-makers, customers
Outcomes – impact:
- Short term (learning): awareness, knowledge, attitudes, skills, opinions, aspirations, motivations
- Medium term (action): behavior, practice, decision-making, policies, social action
- Long term (conditions): social, economic, civic, environmental
Assumptions and external factors shape the entire model.
www.uwex.edu/ces/tobaccoeval/
Assessing strengths and barriers to evaluation capacity building
- Attitudes
- Involvement
- Leadership
- Resources
- Knowledge-skills: logic model, planning, focus, data collection, analysis and interpretation, use
Lessons learned/learning
- Multiple partners with different contexts and cultures require plenty of time for relationship building.
- There is great variation in coalitions: type, functioning, resources, history, interest, and abilities. We must start where each coalition is.
- Staffing: we need technical expertise, adult education skills, facilitation and relationship building, and political savvy.
- Budget uncertainties created an even greater need for constant communication and support.
Lessons…
- Annual deliverables resulted in an "evaluation frenzy."
- There is a need for long-range planning.
- What are tobacco "best practices"?
- Not all coalitions need or can use evaluation T and TA; analysis may be contracted out.
- General T and TA provides a foundation, but the real need is for practical, "real" applications.
And the learning continues…
Typical Evaluation Technical Assistance Requests
"There is no such thing as 'typical'."
A breadth of examples of evaluation technical assistance requests follows.
Q: How should I evaluate the effects of TATU (Teens Against Tobacco Use) on middle school kids?
A: "Let's consider some possibilities…"
- Work with high school kids to identify learning objectives.
- Develop evaluation questions based upon the learning objectives.
- Engage HS kids in developing a pre/post survey instrument and/or group interviews (see the analysis sketch after this list).
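Once paired pre/post responses are in hand, the analysis can stay simple. As an illustration only (not part of the original Wisconsin materials), here is a minimal Python sketch of a pre/post comparison; the file name "tatu_pre_post.csv" and the columns "pre_score" and "post_score" are hypothetical placeholders for whatever your instrument produces.

```python
# Minimal pre/post comparison sketch; all file and column names are
# hypothetical placeholders, not part of the Wisconsin materials.
import pandas as pd
from scipy import stats

responses = pd.read_csv("tatu_pre_post.csv")  # one row per student

# Keep only students with both a pre and a post score.
paired = responses.dropna(subset=["pre_score", "post_score"])

# Average change from pre to post.
change = paired["post_score"] - paired["pre_score"]
print(f"n = {len(paired)}, mean change = {change.mean():.2f}")

# Optional paired t-test: did scores shift more than chance would suggest?
t_stat, p_value = stats.ttest_rel(paired["post_score"], paired["pre_score"])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

For most coalitions, the descriptive mean change alone is enough to inform program improvement; the significance test is optional.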
Q: Could you please review this survey and give me feedback for improvement?
A: "Could we start first with… What is the purpose of the survey? What do you want to learn? Who will use the results, and for what? Are you sure a survey is the right method?"
"Let's cover some tips for improving surveys…"
- Use agency letterhead to improve credibility.
- Introduce the survey: who you are, why you are conducting it and why it matters, and how the information will be used.
- Keep it concise!
- If possible, provide incentives for responding.
- Make time for follow-ups.
Improving Surveys…
- Local program evaluation web site: http://www.uwex.edu/ces/tobaccoeval/ (search on "surveys")
- Program Development & Evaluation web site: http://www.uwex.edu/ces/pdande (go to "Evaluation Publications", then "Quick Tips")
Q: We have conducted so many surveys in the past year. Are there some other evaluation things I can be doing?
A: "YES!"
- Consider using qualitative methods to gain greater depth.
- Not everything needs to be evaluated.
Q (frantic): We have all these data and I don't know what to do with them!
A: "I'll walk you through what to do."
- First, what did you want to know or learn when you collected this information?
- What do you hope to learn from these data?
- Are your data of good enough quality to merit analysis? Who and how many responded? What was the sample?
I'll walk you through what to do…
- Are all the data together, in one place, and ready for analysis?
- Code the responses.
- Enter them into a data management program.
- Clean the data.
- Run frequencies and percentages (a sketch follows below).
- Call me as you have more questions.
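To make those steps concrete, here is a minimal sketch in Python with pandas, assuming a hypothetical CSV export named "coalition_survey.csv" with a coded column "q1_response"; any data management program that can produce frequencies and percentages works just as well.

```python
# Sketch of the walkthrough above: enter, clean, run frequencies and
# percentages. File and column names are hypothetical placeholders.
import pandas as pd

data = pd.read_csv("coalition_survey.csv")  # data entered and in one place

# Clean: standardize one coded response column and drop blank answers.
data["q1_response"] = data["q1_response"].str.strip().str.lower()
data = data.dropna(subset=["q1_response"])

# Run frequencies and percentages for that question.
counts = data["q1_response"].value_counts()
percents = (counts / counts.sum() * 100).round(1)
print(pd.DataFrame({"n": counts, "%": percents}))
```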
Q: Excel is making me crazy! How can I…?
A: Contact Jenny – she is our Excel guru and can walk you through specific issues with the program.
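If Excel itself is the sticking point, one alternative (our suggestion here, not part of the original answer) is to pull the worksheet into Python with pandas and work there; the workbook, sheet, and column names below are hypothetical placeholders.

```python
# Reading an Excel sheet into pandas instead of fighting Excel directly.
# Requires the openpyxl package for .xlsx files; all names below are
# hypothetical placeholders.
import pandas as pd

data = pd.read_excel("coalition_data.xlsx", sheet_name="Responses")

# A quick cross-tabulation, e.g., responses broken out by county.
print(pd.crosstab(data["county"], data["response"]))
```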