Transcript

Slide 1
Developing Implementation Evaluation Models
To Provide Assistance to the National Science Foundation's Math and Science Partnership (MSP) Projects

Catherine Callow-Heusser
Project Director, Co-PI
Evaluation Capacity Building Project
An NSF-Funded Research, Evaluation, and Technical Assistance (MSP-RETA) Project
Slide 2
Goals of NSF's MSP Program

- The Math and Science Partnership (MSP) program is a major research and development effort that supports innovative partnerships to improve K-12 student achievement in mathematics and science.
- MSP projects are expected to both raise the achievement levels of all students and significantly reduce achievement gaps in the mathematics and science performance of diverse student populations.
- Successful projects serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students.

(NSF's MSP RFP: http://www.nsf.gov/pubs/2003/nsf03605/nsf03605.htm)
Slide 3
MSP's Five Key Characteristics

- Partnership-Driven: Higher Ed + K-12 + Others
- Teacher Quality, Quantity, and Diversity
- Challenging Courses and Curricula
- Evidence-Based Design and Outcomes
- Institutional Change and Sustainability
Slide 4
Math-Science Partnership Program

[Diagram: MSP Funding, Intervention → Student Achievement]
Slide 5
Inside the MSP "Black Box"

[Diagram: "MSP Goals and $$$" enter a black box containing Professional Development, Community Involvement, Mentoring, Partnerships, Recruitment, Challenging Curriculum, Teacher Retention, Teacher Leaders, Universities, Tutoring, Summer Workshops, Pre-Service Redesign, K-12, Scientists/Engineers, and Business]

Slide 6
[Diagram, continued: the black box leads to "Increased Student Success in Math & Science"]
Slide 7
Example from MSP Strategic Plan
(Westat, 2003: http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp)

Goal:
- To increase student achievement and reduce achievement gaps in science and mathematics for all preK-12 students in partner school districts.

Strategies for achieving the goal:
- Work with districts to develop and implement strategic plans for improving math and science achievement and reducing achievement gaps.
- Work with districts to develop internal leadership structures and practices—among teacher-leaders, principals, and district staff—to improve teaching of math and science.
- Provide well-designed, continuing professional development to help teachers learn new content and practices, become more attuned to students' thinking, and use new curriculum materials aligned with state and national standards.
Slide 8
Components in the "Black Box"

[Diagram: MSP Funding, Intervention → Professional Development and Curriculum → Student Achievement]
Slide 9
Simplified Theory of Action for Example

[Diagram: MSP Funding, Intervention feeds Recruitment/Retention Activities; Family, Community Involvement; Leadership; Professional Development; District Resources; and Curriculum, which shape Teacher Knowledge and Practice, leading to Student Learning and Student Achievement]

Slide 10
Implementation Evaluation

- Definition (Scriven, 1991): "mere monitoring of program delivery"
- Definition (Frechtling, 2002; GAO, 1998): assess whether the project is being conducted as planned, e.g., fidelity of implementation
  - Ensure the program and its components are operating, and doing so according to the proposed plan or description
  - Monitor and evaluate well-articulated activities and processes* in the "black box"

* "A process is a series of causally linked events or changes taking place over time" (Scriven)
Slide 11
Why Implementation Evaluation?

Ensure that activities are implemented as PLANNED in a timely manner.

- Indicators are based on PLANS for project activities—PLANS that:
  - Explain the project's rationale
  - Document the context in which a project operates
  - Describe the planned activities and processes
  - Identify potential side effects
Slide 12
Implementation Evaluation

Answers questions such as (Westat, 2003):
- Were the appropriate participants selected and involved in the planned activities?
- Do the activities and strategies match those described in the plan? If not, are the changes in activities justified and described?
- Were the appropriate staff members hired and trained, and are they working in accordance with the proposed plan? Were the appropriate materials and equipment obtained?
- Were activities conducted according to the proposed timeline? By appropriate personnel?
- Was a management plan developed and followed?
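The slides stop at the level of questions, but one way to make this kind of monitoring concrete is sketched below: a small Python structure (all activity names, roles, and dates are hypothetical, not from the original plan) that records each planned activity, who is responsible for it, and its timeline benchmark, then flags departures from the plan.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PlannedActivity:
    """One activity from the project plan, with its timeline benchmark."""
    name: str
    responsible_staff: str
    planned_by: date                     # benchmark date from the proposal
    completed_on: Optional[date] = None  # filled in as the project runs

def fidelity_report(activities: list[PlannedActivity]) -> None:
    """Flag activities that are missing or behind the proposed timeline."""
    for a in activities:
        if a.completed_on is None:
            status = "not yet implemented"
        elif a.completed_on <= a.planned_by:
            status = "implemented on schedule"
        else:
            days_late = (a.completed_on - a.planned_by).days
            status = f"implemented {days_late} days late"
        print(f"{a.name} ({a.responsible_staff}): {status}")

# Hypothetical plan entries, purely for illustration:
plan = [
    PlannedActivity("Summer workshop for teacher leaders",
                    "PD coordinator", date(2004, 7, 15), date(2004, 7, 10)),
    PlannedActivity("Hire and train district liaison",
                    "Project director", date(2004, 9, 1)),
]
fidelity_report(plan)
```

Even a simple record like this answers several of the Westat questions at once: whether activities happened, on whose watch, and whether the proposed timeline was followed.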
Slide 13
"Models" for Describing & Monitoring

- Program Logic Modeling
  - Picture of how a program works, including the theory and assumptions underlying the program
  - Logic Model Development Guide, W. K. Kellogg Foundation, http://www.wkkf.org
- Key Evaluation Checklist
  - Checklist for evaluating/reporting on programs and evaluations of them
  - M. Scriven, http://www.wmich.edu/evalctr/checklists/kec.htm
- Others
Slide 14
Program Logic Modeling

What?
- A systematic and visual method for presenting relationships among program resources, activities, and anticipated changes or results.

Why?
- Provides a "road map" describing the sequence of related events/processes that connect the need for the program with the desired results.
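The slides present logic models only as pictures; as a hedged illustration of the same "road map" idea, the sketch below represents a logic model as a directed graph in Python (the node names are hypothetical, loosely echoing the theory-of-action slide earlier) and traces every result a given input is expected to connect to.

```python
# A minimal sketch of a logic model as a directed graph. The chain of
# node types (resources -> activities -> outcomes) follows the slides;
# the specific entries and edges here are hypothetical examples.
logic_model = {
    "MSP funding": ["professional development", "curriculum redesign"],
    "professional development": ["teacher knowledge & practice"],
    "curriculum redesign": ["challenging courses in use"],
    "teacher knowledge & practice": ["student learning"],
    "challenging courses in use": ["student learning"],
    "student learning": ["increased student achievement"],
}

def downstream(node: str, model: dict[str, list[str]]) -> set[str]:
    """All results the given resource or activity is expected to lead to."""
    reached: set[str] = set()
    frontier = list(model.get(node, []))
    while frontier:
        nxt = frontier.pop()
        if nxt not in reached:
            reached.add(nxt)
            frontier.extend(model.get(nxt, []))
    return reached

# The "road map" from an input to the desired results:
print(downstream("MSP funding", logic_model))
```

Writing the hypothesized links down this explicitly is exactly what makes it possible to notice, later, whether the program followed its own road map.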
Slide 15
The Importance of Logic Modeling

Why programs often run into trouble:
- Lack of a well-articulated, research-based, experience-based theory or road map.
- Failure to follow the road map during the trip!
- If program planners don't have any hypotheses guiding them, their potential for success is limited, as is their potential for learning—the program is probably in trouble! (1)

Why evaluations often run into trouble:
- Lack of a well-articulated, research-based, experience-based theory or road map.
- The bane of evaluation is a poorly designed program! (1)

(1) Kellogg (2001); McLaughlin (2003)
Slides 16-18
[Logic model example diagrams]
(University of Wisconsin-Extension, Program Development & Evaluation, http://www.uwex.edu/ces/pdande/progdev/index.html)
Slide 19
MSP Project Logic Models
(Westat, 2003: http://www.mspinfo.com/Source/Chap9_Evidence_and_Evaluation.asp)

- Show relationships and links between:
  - Resources (inputs) from NSF, Higher Education, K-12, and Partners
  - Activities and processes that will address MSP's five key characteristics
  - Outcomes—short, intermediate, and long term
- Complex!
  - Nested, with multiple "levels" or depths
  - Require thoughtful, thorough, rigorous, systematic planning and development
Slide 20
Key Evaluation Checklist

What?
- Checklist of necessary items to be addressed (iteratively) in a program evaluation.

Why?
- Avoid invalidity in a program evaluation.
- Align proposal/plan and evaluation.
Slide 21
Key Evaluation Checklist Components

- Description*
- Background, context*
- Consumers
- Resources
- Values
- Processes*
- Outcomes
- Costs
- Comparisons with alternative options
- Generalizability
- Significance
- Recommendations
- Report
- Meta-evaluation

* Used for Implementation Evaluation
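As a small, hedged aid for working with the starred items, the sketch below encodes the component list as data and pulls out the subset this talk uses for implementation evaluation. The component names come from the slide; the encoding and filtering step are assumptions for illustration, not part of Scriven's published checklist.

```python
# The Key Evaluation Checklist components as data; True marks the
# components this talk uses for implementation evaluation (*).
KEC_COMPONENTS: list[tuple[str, bool]] = [
    ("Description", True),
    ("Background, context", True),
    ("Consumers", False),
    ("Resources", False),
    ("Values", False),
    ("Processes", True),
    ("Outcomes", False),
    ("Costs", False),
    ("Comparisons with alternative options", False),
    ("Generalizability", False),
    ("Significance", False),
    ("Recommendations", False),
    ("Report", False),
    ("Meta-evaluation", False),
]

implementation_items = [name for name, starred in KEC_COMPONENTS if starred]
print(implementation_items)  # ['Description', 'Background, context', 'Processes']
```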
Slide 22
Key Evaluation Checklist: Background and Context

- Historical, contemporary, and projected settings
- Stakeholders
- Relevant legislation, funder's policy changes
- Underlying rationale (e.g., program theory, political logic)
- Review of previous research and evaluations
Slide 23
Key Evaluation Checklist: Description and Definitions

- Definitions of "technical terms"
- Official description of program and components
- Detailed description for replication
- Goals, mileposts, benchmarks
Slide 24
Key Evaluation Checklist: Processes

- Assessment of the quality of everything significant that happens or applies before true outcomes emerge
- Causally relevant context and support
- Goals, design, degree of implementation, management, quality of work, activities, procedures
- Quality of inputs (i.e., logic model resources)
- Intermediate results (i.e., logic model outputs)
Slide 25
Key Evaluation Checklist Applied to MSP Projects

- Goes from "What's So?" (Step I: the fact-finding phase) to "So What?" (Step II: combining facts with values that bear on those facts)
- Complex!
  - Iterative, multi-step
  - Requires thoughtful, thorough, rigorous, systematic planning and development
Slide 26
Complexity of Implementation Evaluation "Models"

Implementation evaluation requires:
- Accurate description of project contexts, activities, processes, and the relationships between them
- Realistic benchmarks and measurable indicators
- Regular monitoring of project plans, activities, processes, and timelines

Complex!
- Nested designs with multiple "levels" or depths
- Iterative, multi-step methods for planning and documentation
- Require thoughtful, thorough, rigorous, systematic planning, development, and evaluation
Slide 27
USU's MSP-RETA Project

- Provide evaluation technical assistance to MSP projects
- Collect evaluation needs assessment information
- Build upon existing evaluation "models" or processes to develop evaluation processes that:
  - Address the complexity of MSP projects
  - Help identify and measure causal effects
  - Incorporate relevant contextual factors
  - Involve stakeholders
Slide 28
Culture of Evidence

In particular, we are working to help MSP projects build a "Culture of Evidence" to meet NSF's goal of identifying successful projects that will "serve as models that can be widely replicated in educational practice to improve the mathematics and science achievement of all the Nation's students."
Slide 29
References

Frechtling, J. (2002). The 2002 user-friendly handbook for project evaluation. Washington, DC: NSF. [Document Number 02-057]
GAO. (1998). Performance measurement and evaluation: Definitions and relationships. Washington, DC: U.S. GAO. [http://www.gao.gov/special.pubs/gg98026.pdf]
McLaughlin, J. A. (October, 2003). Logic modeling: A tool for describing and aligning your program to your monitoring and evaluation. Presentation at USU's MSP Building Evaluation Capacity of STEM/MSP Projects Workshop, Baltimore, MD.
Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage.
Scriven, M. (2002). Key evaluation checklist. Kalamazoo, MI: Western Michigan University, The Evaluation Center. [http://www.wmich.edu/evalctr/checklists/kec.htm]
University of Wisconsin-Extension, Program Development and Evaluation. (2002). Enhancing program performance with logic models. Madison, WI: Author. [http://www.uwex.edu/ces/pdande/ and http://www1.uwex.edu/ces/lmcourse/]
W. K. Kellogg Foundation. (2001). Logic model development guide. Battle Creek, MI: Author.
Westat, Inc. (2003). Developing math and science partnerships: Toolkit. Rockville, MD: Author. [http://www.mspinfo.com/Source/toolkit.asp]
Slide 30
Contact Information

USU's MSP-RETA Evaluation Capacity Building Project
PI, Project Director: Catherine Callow-Heusser ([email protected])
Co-PI: Jim Dorward ([email protected])
Co-PI: Steve Lehman ([email protected])
PI (retired): Blaine Worthen

Consortium for Building Evaluation Capacity
http://www.usu.edu/cbec/
2810 Old Main Hill
Utah State University
Logan, UT 84322-2810
435-797-1111
FAX 435-797-1448
[email protected]