FIPSE Program Officer Evaluation Training Program

Evaluation for FIPSE Grantees
Karen Paulson & Shelly Potts
FIPSE Project Directors’ Meeting
Washington, D.C.
January 9, 2006
Session Outline
 Rationale for program evaluation
 FIPSE’s expectations for evaluation
 Characteristics of effective program evaluations
 Evaluation resources for Project Directors and Independent Evaluators
 Project Director/Independent Evaluator relationship
 Expectations for evaluation reporting
Rationale for evaluation
 Confirm a program’s success
 Monitor program implementation
 Inform project activities and practices
 Note unintended consequences
 Identify problems and costs
 Inform allocation of resources
 Justify expenditure of funds
 Enhance administrative planning and policymaking
 Provide guidance about effective replication and testing strategies
FIPSE evaluation expectations
 Formative Evaluation
   Track project development and implementation
   Establish baseline information or context
   Determine usability of materials, products, etc.
   Field test materials, curricula, interventions, etc.
 Summative Evaluation
   Document “value added” for learners
   Provide evidence on institutionalization and adoption/adaptation
   Describe impact on the field of postsecondary education
 Controlled Comparisons
   Compare program participants and non-participants
   Clarify impact and potential for benefiting other campuses
   Implement pre/post measures, where applicable
FIPSE evaluation expectations
 Design and implement a comprehensive plan
   Evaluate achievement of processes, outcomes, institutionalization, and impact
   Specify data collection, analysis, and reporting activities
   Prepare an evaluation matrix and management plan
   Limit to a few clear, specific, measurable objectives
   Orient measures toward student academic behaviors
FIPSE evaluation expectations
 Methodology:
   Build evaluation measures and procedures into routine activities
   Use a combination of direct and indirect measures
   Use multiple and mixed data collection methods
   Modify evaluation plans as needed
 Process:
   Use project documents and records for ongoing evaluation
   Collect information on the project’s cost-effectiveness
 Forward Thinking:
   Collect data to demonstrate project success and institutionalization
   Consider dissemination audiences, adaptors, and their data needs
   Collect evidence on the project’s wider impact
FIPSE evaluation expectations
 IMPLEMENTATION: “Did the project work the way you thought it would?”
 OUTCOMES/RESULTS: “Did the project achieve its anticipated outcomes?”
 INSTITUTIONALIZATION: “How will project activities and processes be supported after the grant is over?”
 WIDER USE/IMPACT: “What evidence do we have that other institutions are adopting/adapting the innovation?” “What impact do the results/outcomes have on postsecondary education?”
Characteristics of effective program evaluations
 Logistics:
   Use a management plan and evaluation matrix
   Make data collection a routine activity
   Limit to a few clear, specific, measurable objectives
   Use existing data and procedures
   Modify the evaluation plan as needed
 Credibility:
   Align methods and objectives
   Use mixed and multiple methods; multiple sources
   Use direct and indirect measures
   Use credible methods and tools
   Use controlled comparisons
 Process:
   Start early
   Collect data regularly
   Evaluate plan and procedures continually
   Keep an evaluator log
   Communicate frequently
 Utility:
   Collect the evidence needed to demonstrate project success or failure
   Incorporate formative and summative components
   Orient measures toward student learning outcomes, where applicable
   Focus on dissemination and reporting
   Determine impact on and contributions to the field of postsecondary education
Evaluation Resources
 FIPSE website
 Evaluation website
 Evaluation resources
 Project evaluator
FIPSE Evaluation Website (coming spring 2006)
 Purpose of the evaluation website
 Website features
 How to navigate the site
Evaluation Plan Components
 Project Background/Organizational Context
 Purpose of the Evaluation
 Audiences/Stakeholders
 Evaluation Questions
 Evaluation Approach
 Data Collection Methods and Instruments
 Sampling Procedures
 Data Sources
 Evaluation Management Matrix
 Data Collection Schedule
 Data Analysis/Interpretation Procedures
 Budget/Cost for the Evaluation
 Evaluation Constraints
 Communication/Reporting Plans and Activities
 What to Put in the Appendices
“Good” components are:
 Included in the Evaluation Plan
 Concise
 Comprehensive
 Specific
 Give an appropriate level of description
 Organized by project goal, data source, or stakeholder
 Clearly link various components such as questions, goals, and data sources
 Give rationales
 Not limited to a single approach, method, source, or tool; instead, they use a variety of approaches, methods, sources, and tools
Specific “Good” examples by component
 Project background: sets out and explains the “presenting problem”
 Purpose of the Evaluation: gives a good description of the evaluation plan components related to the project’s purpose
 Audience: identifies the main stakeholders and links deliverable skills and knowledge gains/outcomes with stakeholder groups
 Evaluation questions: logically link to project success indicators and identify appropriate data sources for each question
 Evaluation approach: cites theory
Specific “Good” examples by component
 Sampling: indicates the type of respondents, the time frame, and the process and factors for sample selection
 Data Analysis/Interpretation Procedures: describes both qualitative and quantitative procedures as well as their appropriate usage
 Budget: is itemized by FIPSE budget categories by year
 Evaluation Constraints: anticipates and identifies rationales for a variety of constraints; identifies methods for avoiding, minimizing, or overcoming potential constraints
 Communication/Reporting plans: used to improve the project, to improve the utility of the evaluation, and to demonstrate the project’s impact to internal and external audiences
Good “Data Collection Methods and Instruments” examples:
 Provide specifics on the types of data to be collected
 Use a variety of tools and methods
 Identify appropriate tools
 Link to stakeholder groups and include how the evaluation feedback loop will be completed
 Describe the quality and rigor of instrumentation
 Provide specifics about procedures
 Identify time frames
 Identify sample sizes
“Could Be Improved” components are:
 Non-existent; cannot be found in the evaluation plan
 Vague, hand-wavy, or too general
 Maintain a broad perspective when they should be “drilling down” to what happened, how it happened, and why it happened
 Not specific enough (for example: What analytic techniques will be used? What will be reported, to whom, and when? How will the evaluation data be used? What is disseminated, and to whom?)
 Based on the assumption that the reader has the same level of project knowledge as the PI/author
Specific “Could be improved” examples by component
 Project Background: focuses on who will do what, not on the project’s importance or what spurred you to undertake the project
 Audience: notes that the project “will benefit” but not what those benefits might be
 Data Analysis/Interpretation Procedures: do not identify techniques or why they were chosen
 Data Analysis/Interpretation Procedures: do not describe how data will be summarized (by cohort? by gender?) or what comparisons will be made and why
 Budget: no specified expenditures, leaving the reader with no idea what will be done or delivered for the specified amount
What should Project Directors expect from their Independent Evaluators?
 Your Independent Evaluator should honor that this is your project, not hers or his.
 Your Independent Evaluator should feel free, and be encouraged, to give you feedback regularly, privately as well as in annual evaluation reports.
 Your Independent Evaluator should receive your input about evaluation activities with respect and be able to explain why your suggestions can or cannot be implemented.
Your relationship with your Independent Evaluator
 Involve your Independent Evaluator as early in the project
as possible
 Communicate regularly with your Independent Evaluator.
Copy her or him on all project-related communications.
Check in to see how things are going every couple of
weeks or every month.
 Keep your Independent Evaluator involved as a “shadow”
at every step of the project—the utility of the evaluation to
your project and the quality of the evaluation will increase.
 Allow your Independent Evaluator to tell her or his truth about the project. It may not all be positive, but if it accurately reflects what you learned from your project, both the wins and the failures, then it is fine. FIPSE is interested in all forms of learning.
Selecting an Independent Evaluator
 While it’s okay to work with people you know, an Independent
Evaluator must have evaluation or social science research
expertise; it is inappropriate for someone related to or in a
relationship with you or someone on the project to be an
Independent Evaluator. It should be easy for you to make a
public case for this person to be your Independent Evaluator.
 Check around on campus and at neighboring campuses and institutions; there are evaluation and social science research centers available to do contract work.
 Ask others you know whose evaluators have been useful to their projects.
 Ask your FIPSE Program Officer; s/he can often direct you to people with evaluation expertise on your project’s topic.
Evaluation and Your Project (FIPSE Comprehensive Program website)
Evaluation Management Matrix
The matrix organizes the evaluation plan into the following columns:
 Project Goals and Objectives
 Evaluation Questions
 Data Source(s)
 Data Collection
 Data Analysis and Interpretation
 Reporting
 Use
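For illustration only, here is a hypothetical filled-in matrix row; every specific below is invented and should be replaced with your own project’s details:
 Project Goals and Objectives: Improve first-year students’ quantitative reasoning (hypothetical)
 Evaluation Questions: Do participants’ quantitative reasoning scores improve more than non-participants’?
 Data Source(s): Pre/post assessment scores; course records
 Data Collection: Administer the assessment at the start and end of each semester
 Data Analysis and Interpretation: Compare score gains of participants and non-participants
 Reporting: Summarize results in the Annual Evaluation Report
 Use: Refine the curriculum and document “value added” for learners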
Evaluation and Your Project (FIPSE Comprehensive Program website)
Data Collection Schedule
The schedule organizes data collection and analysis into the following columns:
 Data Source(s)
 Data Collection Method/Tool
 Responsible for Collection
 Time Frame for Collection
 Data Analysis and Interpretation
 Time Frame for Analysis
 Responsible for Analysis
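Again for illustration only, a hypothetical schedule row (all specifics invented) might read:
 Data Source(s): Student surveys
 Data Collection Method/Tool: Online questionnaire
 Responsible for Collection: Project Director
 Time Frame for Collection: End of each semester
 Data Analysis and Interpretation: Summarize responses by cohort
 Time Frame for Analysis: Within one month of collection
 Responsible for Analysis: Independent Evaluator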
Evaluation and Your Project (FIPSE Comprehensive Program website)
What’s the difference between the Annual Evaluation Report and the Annual Project Report?
 Author
  Annual Evaluation Report: Independent Evaluator
  Annual Project Report: Project Director
 How submitted?
  Annual Evaluation Report: Appended to the Annual Project Report
  Annual Project Report: Submitted to the FIPSE office (online)
 Audience
  Annual Evaluation Report: Project Director and personnel
  Annual Project Report: FIPSE and the Department of Education
 Structure
  Annual Evaluation Report: Open
  Annual Project Report: Web-based system
 Length
  Annual Evaluation Report: Approximately 5-10 pages
  Annual Project Report: Varies with narrative length
 What’s included?
  Annual Evaluation Report: Much more detailed coverage of evaluation data collection and analysis; may include formative data and an explanation of its use in the project, as well as other process observations.
  Annual Project Report: What has been accomplished in the past year; obstacles and how they were handled; changes in management, policy, and institutional support; a project financial summary.
Note: See the following pages for differences between first-year and later-year evaluation reports and characteristics of a good report.
Evaluation and Your Project (FIPSE Comprehensive Program website)
What’s the difference between the Annual Evaluation Report and the Annual Project Report? (continued)
 End of first year of multi-year grants
  Annual Evaluation Report: Should include more explication and, if needed, modification of the 90-day Evaluation Plan submitted. The first-year Annual Evaluation Report is more of a progress report that focuses on evaluation and project processes and includes an update of the data collection schedule.
  Annual Project Report: What has been accomplished in the past year; obstacles and how they were handled; changes in management, policy, and institutional support; a project financial summary.
 Years 2+ of four-year grants or no-cost extensions
  Annual Evaluation Report: See above for what’s included in the first-year Evaluation Report. Include discussion of FIPSE performance indicators.
  Annual Project Report: Same as above.
 Final Reports
  Annual Evaluation Report: Include full analyses based on and guided by the Evaluation Plan. The audience for this report is the Project Director and FIPSE.
  Annual Project Report: The Final Report follows a structure similar to the annual reports. Summary evaluation results are reported and the Final Evaluation Report is appended.
Evaluation and Your Project (FIPSE Comprehensive Program website)
Characteristics of Good Reports to FIPSE
 Annual Evaluation Report: Includes an executive summary and the purposes and objectives of both the project and the evaluation; establishes a baseline from which to work; answers the evaluation questions related to project goals; explains how data collection was done, how it relates to project activities, and why it is significant; presents all forms of evidence (not raw data, but summarized information); and offers conclusions, recommendations, and feedback about both the project and the evaluation.
 Annual Project Report: Provides data that can be supported; discusses honestly the grant’s success; gives direct indicators of institutionalization of the innovation; explains how the project will continue after funding; explains how the innovation was disseminated and how others in the field are adopting and/or adapting it, or how the project has spawned a network of institutions interested in this reform; and discusses lessons learned that will help the field.
Evaluation Final Report Outline
 Executive Summary
 Inquiry process
   Evaluation approach, questions, constraints
   Sampling, data collection methods and instruments, matrix
   Schedules, analysis procedures
 Context and implementation of the program
 Findings/program outcomes
 Conclusions, interpretations, and recommendations
 Appendices: instruments, protocols, interim reports, etc.
Adapted from:
Torres, R. T., Preskill, H. S., & Piontek, M. E. (1996). Evaluation strategies for communicating and reporting: Enhancing learning in organizations. Thousand Oaks, CA: SAGE.
Frechtling, J., Hood, S., & Hughes, S. (2002). The 2002 user-friendly handbook for project evaluation. NSF 99-12175. Arlington, VA: NSF.
Questions?
 What information has been most useful to you as a Project Director? As an Evaluator?
 What is the most useful format for sharing
evaluation information and resources with
you [web, PD meeting, email, print, etc.]?
 What additional evaluation information,
resources, and tools do you need?
 Additional questions?
Contact Information:
 Karen Paulson [[email protected]]
 Shelly Potts [[email protected]]