Evaluation
Krista S. Schumacher
Schumacher Consulting.org
918-284-7276
[email protected]
www.schumacherconsulting.org
Prepared for the Oklahoma State Regents for Higher Education
2009 Summer Grant Writing Institute
July 21, 2009
Why Evaluate?
How will you know your project is progressing
adequately to achieve objectives?
How will funders know your project was
successful?
◦ Increasing emphasis placed on evaluation by funders, e.g.,
U.S. Department of Education
National Science Foundation
Substance Abuse and Mental Health Services Administration
(SAMHSA)
Why Evaluate?
Improve the program –
◦ “Balancing the call to prove with the need to improve.” (W.K.
Kellogg Foundation)
Determine program effectiveness –
◦ Evaluation supports “accountability and quality control” (Kellogg
Foundation)
◦ Significant influence on program’s future
Generate new knowledge –
◦ Not just research knowledge
◦ Examines not just whether a program works, but how and
why it works
With whom is the program most successful?
Under what circumstances?
Why Evaluate?
WHAT WILL BE DONE WITH THE RESULTS?
“Evaluation results will be reviewed
(quarterly, semi-annually, annually) by the
project advisory board and staff. Results
will be used to make program (research
method) adjustments as needed.”
What to Evaluate?
◦ Objectives
Use baseline data from need section
◦ Activities
◦ Program/research fidelity
How well program implementation or actual
research matched established protocol
◦ Attitudes
◦ Consider sorting data by demographics, e.g.,
location, gender, age, race/ethnicity, income level,
first-generation status
Types of Evaluation
Process evaluation:
◦ What are we doing?
◦ How closely did implementation match the plan
(program fidelity)?
◦ What types of deviation from the plan occurred?
◦ What led to the deviations?
◦ What effect did the deviations have on the project and
evaluation?
◦ What types of services were provided, to whom, in
what context, and at what cost?
services (modality, type, intensity, duration)
recipients (individual demographics and characteristics)
context (institution, community, classroom)
cost (did the project stay within budget?)
Types of Evaluation
Outcome evaluation:
◦ What effect are we having on participants?
◦ What program/contextual factors were associated
with outcomes?
◦ What individual factors were associated with
outcomes?
◦ How durable were the effects?
What correlations can be drawn between outcomes and
program?
How do you know that the program was the cause of the effect?
How long did outcomes last?
Who will Evaluate?
External evaluators increasingly required or strongly
recommended
◦ Partners for effective and efficient programs
Collaborators in recommending and facilitating program
adjustments
◦ Methodological orientations
Mixed-methods? Quantitative only? Qualitative only?
◦ Philosophical orientations
Is the purpose to boost a personal research publication
record, or to help the organization/effort?
◦ Experience and qualifications
Years conducting evaluations for types of organizations and types
of projects (e.g., education, health, technical research?)
Master’s degree required, PhD preferred
(or working toward PhD)
Social science background: Sociology, Psychology, Political Science
How much will it cost?
External evaluations cost money…period.
Some evaluators charge for pre-award
proposal development phase
Kellogg Foundation recommends 5% to
10% of total budget
Check funder limits on evaluation
Ensure cost is reasonable but sufficient to
conduct in-depth evaluation and detailed
reports
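The Kellogg guideline above is easy to apply as a quick budget check. A minimal sketch, assuming a hypothetical $500,000 total grant budget (the dollar figure is invented for illustration):

```python
# Illustrative only: rough evaluation-budget range using the
# Kellogg Foundation's 5%-10% guideline. The $500,000 total
# is a hypothetical figure, not from the presentation.
total_budget = 500_000

low_end = total_budget * 0.05   # 5% of total budget
high_end = total_budget * 0.10  # 10% of total budget

print(f"Evaluation budget range: ${low_end:,.0f} to ${high_end:,.0f}")
# → Evaluation budget range: $25,000 to $50,000
```

Always check the resulting figure against any funder cap on evaluation costs.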
Two Types of Data
Quantitative
◦ Numbers based on objectives and research design
◦ What data do you need? e.g.,
Number of participants
Grade point averages
Retention rates
Survey data
Research outcomes
Qualitative
◦ Interviews
◦ Focus groups
◦ Observation
Methods/Instruments
How are you going to get your data?
◦ Establish baseline data
◦ Pre- and post-assessments (knowledge, skills)
◦ Pre- and post-surveys (attitudinal)
◦ Enrollment rosters
◦ Meeting minutes
◦ Database reports
◦ Institutional Research Office (I.R.) reports
Data Analysis: So you have your data, now what?
Quantitative data
Data analysis programs:
◦ SPSS (Statistical Package for the Social Sciences), Stata, etc.
◦ Descriptive and statistical data:
# and % of respondents who strongly agree that flying pigs are
fabulous compared to those who strongly disagree with this
statement.
Likert scale
On a scale of 1 to 5, with 1 being “strongly disagree” and 5 being “strongly
agree,” rate your level of agreement with the following statement…
t-ratio, F-test, chi-square…
◦ “Quantitative data will be assessed using Stata statistical
analysis software to report on descriptive and statistical
outcomes for key objectives (e.g., increase in GPA,
retention, enrollment, etc.).”
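The descriptive summary described above (counts, percentages, and a mean for Likert items) can be produced by any statistical package; a minimal sketch in plain Python, using invented survey responses:

```python
# Illustrative sketch of descriptive Likert-item statistics.
# The slide names Stata/SPSS; this shows the same summary in
# Python's standard library. The responses are invented data.
from collections import Counter
from statistics import mean

# 1 = strongly disagree ... 5 = strongly agree
responses = [5, 4, 5, 2, 1, 5, 3, 4, 5, 1]

counts = Counter(responses)
n = len(responses)

pct_strongly_agree = 100 * counts[5] / n
pct_strongly_disagree = 100 * counts[1] / n

print(f"Strongly agree: {counts[5]} ({pct_strongly_agree:.0f}%)")       # 4 (40%)
print(f"Strongly disagree: {counts[1]} ({pct_strongly_disagree:.0f}%)")  # 2 (20%)
print(f"Mean rating: {mean(responses):.1f}")                             # 3.5
```

Inferential tests (t-ratio, F-test, chi-square) would then be run on the same data in the chosen statistics package.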
Data Analysis
Qualitative Data
◦ Data analysis programs
NVivo (formerly NUD*IST), ATLAS.ti, etc…
◦ “Qualitative data will be analyzed for overarching themes by
reviewing notes and transcripts using a process of open coding.
The codes will be condensed into a series of contextualized
categories to reveal similarities across the data.”
Contextualization – how things fit together
◦ More than pithy anecdotes
“May explain – and provide evidence of – those hard-to-measure outcomes
that cannot be defined quantitatively.” – W.K. Kellogg Foundation
Provides a degree of insight into how and why a program is successful that
quantitative data simply cannot provide
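The open-coding process quoted above can be pictured as a small data transformation: excerpts are tagged with codes, codes are condensed into contextualized categories, and categories are tallied across the data. A hypothetical sketch (excerpts, codes, and category names are all invented):

```python
# Hypothetical illustration of open coding -> condensed categories.
# Real qualitative work would use NVivo, ATLAS.ti, or similar.
from collections import Counter

# (excerpt, open code) pairs produced during a first coding pass
coded_excerpts = [
    ("Tutoring hours fit my work schedule", "scheduling"),
    ("My advisor checked in every week", "advisor contact"),
    ("I almost quit until my advisor called", "advisor contact"),
    ("Evening sessions made attendance possible", "scheduling"),
]

# Condense open codes into broader contextualized categories
categories = {
    "scheduling": "Program flexibility",
    "advisor contact": "Personal support",
}

theme_counts = Counter(categories[code] for _, code in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpts")
```

The point is the workflow, not the tool: similarities across the data emerge when many specific codes collapse into a few well-evidenced themes.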
Two Types of Timeframes
Formative
◦ Ongoing throughout life of grant
◦ Measures activities and objectives
Summative
◦ At conclusion of grant program or research
NEED BOTH!
Timelines
When will evaluation occur?
◦ Monthly?
◦ Quarterly?
◦ Semi-annually?
◦ Annually?
◦ At the end of each training session?
◦ At the end of each cycle?
How does evaluation timeline fit with
formative and summative plans?
Origin of the Evaluation:
Need and Objectives
Need: For 2005-06, the fall-to-fall retention rate of
first-time degree-seeking students was 55% for the
College’s full-time students, compared to national
average retention rates of 65% for full-time
students at comparable institutions (IPEDS, 2006).
Objective: The fall-to-fall retention rate of full-time
undergraduate students will increase by 3% each
year from a baseline of 55% to 61% by Sept. 30,
2010.
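The objective's arithmetic can be checked directly: a 55% baseline gaining 3 percentage points per year reaches the 61% target in two years. A quick verification using the figures from the slide:

```python
# Check the retention objective's trajectory (figures from the slide).
baseline = 55     # % fall-to-fall retention, 2005-06
annual_gain = 3   # percentage points per year
target = 61       # % by Sept. 30, 2010

years = (target - baseline) // annual_gain
trajectory = [baseline + annual_gain * y for y in range(years + 1)]
print(trajectory)  # [55, 58, 61]
```

Tying each objective's numbers back to the baseline in the need section makes the evaluation plan straightforward to audit.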
Evaluation Data Collection and
Reporting Plan
Objective: Increase fall-to-fall retention by 3% per year to 61%
Data collected and timeline: Student enrollment in first fall and second fall, within one month of the start of the second fall
Methods for data collection and timeline: Enrollment entered by gender and race/ethnicity into Stata within the first four weeks of each semester
Instruments to be developed and timeline: Enrollment rosters separated by gender and race/ethnicity by Jan. 15, 2009
Reports/outcomes timeline: At midpoint of each semester
BEWARE THE LAYERED OBJECTIVE!
By the end of year five, five (5) full-time
developmental education instructors will
conduct 10 workshops on student
retention strategies for 200 adjunct
instructors.
Origin of the Evaluation:
Research Hypothesis
Good:
◦ Analogs to chemokine receptors can inhibit HIV
infection.
Not so good:
◦ Analogs to chemokine receptors can be biologically
useful.
◦ A wide range of molecules can inhibit HIV infection.
*Waxman, Frank. Ph.D. (2005, July 13). How to Write a Successful Grant Application. Oklahoma State Regents for Higher
Education.
Logic Models
From: University of Wisconsin-Extension, Program Development and Evaluation
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
A Logic Model is…
◦ A depiction of a program showing what the program will do and what it is to
accomplish.
◦ A series of “if-then” relationships that, if implemented as intended, lead to the
desired outcomes
◦ The core of program planning and evaluation
Situation: Hungry → Inputs: Get food → Outputs: Eat food → Outcomes: Feel better
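The chain of "if-then" relationships can be made explicit by linking each stage to the next, using the hunger example's labels; the code representation itself is purely illustrative:

```python
# Minimal sketch: a logic model as a chain of "if-then" stages.
# Stage labels come from the slide's hunger example.
stages = [
    "Hungry (situation)",
    "Get food (inputs)",
    "Eat food (outputs)",
    "Feel better (outcomes)",
]

# Each adjacent pair forms one "if-then" relationship
links = [f"if {a}, then {b}" for a, b in zip(stages, stages[1:])]
for link in links:
    print(link)
```

If any single link fails in practice, the model shows exactly where the program broke down, which is what makes it the core of both planning and evaluation.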
Evaluation Resources
W.K. Kellogg Foundation – “Evaluation Toolkit”
◦ http://www.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0
Newsletter resource – The PEN (Program Evaluation News)
◦ http://www.the-aps.org/education/promote/content/newslttr3.2.pdf
NSF-sponsored program
◦ www.evaluatorsinstitute.com
American Evaluation Association
◦ www.eval.org
Western Michigan University, The Evaluation Center
◦ http://ec.wmich.edu/evaldir/index.html (directory of evaluators)
OSRHE list of evaluators and other resources
◦ http://www.okhighered.org/grant%2Dopps/writing.shtml
“Evaluation for the Unevaluated” course
◦ http://pathwayscourses.samhsa.gov/eval101/eval101_toc.htm
Evaluation Resources
The Research Methods Knowledge Base
◦ http://www.socialresearchmethods.net/
The What Works Clearinghouse
◦ http://www.w-w-c.org/
The Promising Practices Network
◦ http://www.promisingpractices.net/
The International Campbell Collaboration
◦ http://www.campbellcollaboration.org/
Social Programs That Work
◦ http://www.excelgov.org/
Planning an Effective Program Evaluation short course
◦ http://www.the-aps.org/education/promote/pen.htm
Schumacher Consulting.org
In DIRE need of grant assistance?
We provide All Things Grants!™
Grant Development | Instruction | Research | Evaluation
Established in 2007, Schumacher Consulting.org is an
Oklahoma-owned firm that works in partnership with
organizations to navigate transformation and achieve
change by securing funding and increasing program success.