Transcript Slide 1

FINDING THE VALUE
IN EVALUATION
What should a Noyce director expect from program
evaluation?
July 2011
Susan Tucker, E&D Associates LLC
Davida Fischman, CSU-San Bernardino
Who are we?
2
Davida Fischman:
• Research mathematician turned mathematics educator
• 17 years teaching pre-service (elementary and secondary)
teachers; 10 years working with in-service teachers in small and
large grants
• Co-designer and Coordinator of CSUSB MA in Teaching Math
program
Susan Tucker:
• 25 years as educational program evaluator
• 20 years teaching educational program evaluation and working in
teacher education programs
• experience as K12 teacher, principal, associate
superintendent, university professor, PI and grant director, grant
writer
Who is in the room?
Complete Mobile Survey #1
3
• How "old" is your project?
• How many grant-funded projects have you managed?
• What experience do you have in project evaluation?
• What are your goals/expectations of this session?
• TEXT to: 96625
• Message: E&D1
Agenda
4
• From the PI perspective
• What is program evaluation?
• Negotiating a good evaluation plan
• Tools for evaluation
• Data collection
• Using evaluation results
• Tips for PIs
• Resources
From a PI... A Changed Perspective
5
• First thoughts: 10-12%?? What for??
• Then...
  o Using a new survey of Noyce Scholars and Mentors to modify next year's work
  o Using formative evaluation from an NSF MSP project to inform program decisions on an ongoing basis
  o An add-on evaluation of Noyce Scholars and Mentors: a tool the project will continue to use to learn about participants' needs and make adaptation decisions
  o Now using surveys for formative assessment in university classes as well
• Today… a much better understanding of the value of evaluation, and ways it can improve the project.
Define: “Program Evaluation”
6
• Think and jot down notes:
  o What do you get from your Noyce evaluation today?
  o What more do you want?
• Changing views of evaluation
  o Prove vs. Improve
Warm up/Introductions/Review
7
• Who are you? What do you do?
• What disciplines, connections, experiences do you bring into evaluation? How do they help you think about evaluation?
  o What previous backgrounds or experiences do you bring that might assist you in maximizing the value of evaluation?
• How do you currently think about the role of evaluation in your Noyce project?
  o Strengths
  o Challenges/Frustrations
Agree or Disagree about the
characteristics of a good evaluator:
8
① …is part facilitator, part researcher, part manager and part program specialist
② …is external to the program being evaluated
③ …designs an evaluation to determine if a program is well managed
④ …negotiates questions of relevance to multiple audiences—need to know vs. nice to know
⑤ …is collaborative in terms of designing and implementing the evaluation plan
⑥ …helps a project reflect on the quality of its objectives
⑦ …helps a project look at more than just whether its goals have been met
⑧ …helps staff develop a logic model that describes how a program’s components relate to each other and to the overall goals and objectives
⑨ …selects/aligns the evaluation model to complement the project’s logic model
⑩ …develops a plan to determine if a program is meeting its goals & objectives
⑪ …is concerned about how useful the evaluation is to project stakeholders
Program Evaluation Defined?
9
“Program evaluation is the systematic
collection of information about the activities,
characteristics, and outcomes of programs
to make judgments about the program,
improve program effectiveness, and/or
inform decisions about future programming”
(Patton, M. Q. 2002).
Does your Evaluation Do this?
10
1. Prepare for the evaluation
2. Engage stakeholders
3. Identify purpose of the evaluation
4. Negotiate the right questions
5. Co-design the evaluation
6. Select and adapt instrumentation
7. Collect the data
8. Analyze the data
9. Disseminate and use the results
What Can an Evaluation Tell Us?
11
• What is working
• How to improve
• Support for evidence-based decision-making
  o Are we achieving the results that we were hoping for?
  o Are results being produced as planned at all levels of our logic model?
• Relevance
  o Do stakeholders care about what we are doing? Are we solving the problem?
• Cost-Effectiveness
  o Are we getting value for money?
Many models of evaluation…
a few examples popular in education
12
• Scientific-experimental models
  o Objectives-based research orientation
• Management models
  o Stufflebeam’s CIPP model
• Anthropological/Qualitative models
  o Stake’s responsive model
  o Looking at intended & unintended outcomes
• Participant-oriented models
  o Fetterman’s empowerment model
Qualities of an evaluator
13
• Formal education (preferably in evaluation)
• Experience (with program improvement)
• Evaluation philosophy complements the management team’s and grant’s principles
• Communication skills
• Recommendations and resume
• Understanding of the culture of the target population(s)
Start with the Right Questions…
14
• Include questions of relevance to stakeholders.
• Explore what makes questions “relevant”
• Determine what will be accepted as evidence in seeking answers to the questions
• Examine whose voices are heard in the choice of questions and evidence
Task 1: Are the following “good”
evaluation questions?
15
1. What are the goals and activities of the teacher certification
programs in which the Noyce grant is housed?
2. What is the “value added” of your Noyce program?
3. How do stakeholders perceive the Noyce Program and Noyce
recipients?
4. What are the characteristics of the schools in which Noyce
recipients teach?
5. What are the relationships between characteristics of the Noyce
Program, types of Noyce recipients, and recipients’ plans to go
into/stay in teaching and leadership roles?
6. What is the impact of Noyce on teacher:
• Recruitment?
• Retention?
• Teacher effectiveness?
Design the Evaluation Plan
16
• Negotiate the plan with multiple stakeholders
  #1. Brainstorm what the important questions are to ask
  #2. Probe the rationale/values behind each question
• Build a design appropriate to both the evaluation questions and the cultural context
  o Align evaluation questions to the project logic model
  #3. Worry last about how, what, and when to measure
• Seek culturally appropriate methods that combine qualitative and quantitative approaches.
• Collect data at multiple points in time, extending the time frame of the evaluation as needed.
Evaluation design matrix:
• Major Questions Asked: qualitative, quantitative, or mixed
• Sources of Evidence: qualitative, quantitative, or mixed
• Quality Criteria/Standards: improvement oriented; holistic; 360 degrees represented; assumptions overt; culturally responsive
• Contexts: inputs, resources, expectations; processes (individual, group); products (anticipated, unanticipated)
• Polemical tensions to balance: inductive & deductive; long & short term; local & research-based knowledge/wisdom
Identify quality criteria…
some examples
18
• Persistence and success on a STEM trajectory from teacher prep to teaching jobs
• Retention
• Changes in teacher pedagogy and content knowledge
• Willingness to teach STEM classes
• Obtaining advanced training in teaching in STEM-related areas
Summative vs. Formative Evaluation
19
Formative Evaluation
• Purpose: program improvement
• General question: Is this educational program being implemented as planned to achieve the set goals?
• Specific questions:
  o What are the strengths and weaknesses?
  o What is working and not working?
  o Why is it working or not working?
  o How should it be improved?
Summative Evaluation
• Purpose: program accountability; judgment of overall worth and value
• General question: Did this educational program contribute to the planned impact and justify the resources utilized?
• Specific questions:
  o What are the program results?
  o Did the intended audience benefit from the program?
  o Was the program cost-effective?
  o Is it worth continuing this program?
Question Types & Data Techniques
20
Descriptive questions
• Techniques: available data, case studies, statistical survey
• Example: How did the program unfold? To what extent was the project coherent from the start?
Normative questions
• Techniques: available data, case studies, statistical survey, cost-benefit analysis, cost-effectiveness analysis
• Example: How satisfactory was the placement rate after training?
Impact-focused questions
• Techniques: search for causal relations, statistical analysis, forecast analysis
• Example: Did the “Noyce Package” support successful hiring and retention of Noyce participants?
21
Context questions:
1. Why was the program planned?
2. How were participants selected?
3. How was the program/PD planned?
4. What is the setting?
5. What resources are available?
Process questions (formative):
1. How do major ongoing events match original expectations?
2. How effective are resources?
3. How are dissemination/institutionalization efforts emerging?
4. What policy issues are emerging?
Product questions (summative):
1. To what degree have project goals been met?
2. What is transferrable or replicable as a result of the project?
3. What are the recommendations regarding prerequisites for success, selection criteria, training content, processes/sequencing, resources, follow-ups, and R&E?
Match with Evaluation Criteria
22
Program elements are matched with evaluation criteria: goals and purposes with impact, the main objective with effectiveness, and activities with efficiency; means, logical structure, and direct results are weighed against criteria such as replicability, coherence, sustainability, and relevance.
Questions @ Questions
23
• What/whose perspectives are represented in the evaluation questions?
• What other questions might have been posed?
• Whose perspectives are accepted as credible evidence?
• Credible to whom?
• How well does the time frame in this study match the needs and rhythms of this context?
Mobile survey: #2
24
• TEXT to: 96625
• Message: E&D1
Agree/Disagree @ evaluators:
25
① …collects both qualitative and quantitative data.
② …has a firm grasp on educational research strategies.
③ …collects data that is actionable—answers provide the info needed for decision-making.
④ …is seen but not heard, except at the end of each year to write annual reports.
⑤ …designs data-collection forms, procedures, and databases to capture and record the data collected.
⑥ …is culturally competent and responsive to the unique needs of a project.
⑦ …analyzes data in timely ways to help a project improve as it develops.
⑧ …clearly distinguishes between descriptions and judgments when presenting findings.
⑨ …makes recommendations to the program regarding ways to improve.
⑩ …works with the project staff to disseminate the findings.
⑪ …asks questions about sustainability and institutionalization early and often.
Collect the Data
26
• Be holistic: collect qualitative & quantitative data
• Be responsive to cultural contexts
• Tap into internal & external evaluation
• Triangulate vs. one-shot data
• Usually takes 3-6 months of evaluation planning & prep before data collection can begin
Analyze the Data
27
• Consider context/inputs and resources as a necessary component of interpretation.
• Disaggregate data to examine diversity within groups (see the sketch after this slide).
• Examine outliers, especially successful ones.
• A “cultural” interpreter may be needed to capture nuances of meaning.
• Stakeholder review panels can assist in accurate interpretation.
• Confirm accuracy of analysis & interpretation before making judgments.
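
To make the disaggregation and outlier steps concrete, here is a minimal sketch; it is not part of the original slides. It assumes Noyce survey responses sit in a pandas DataFrame, and the column names ("cohort", "cert_pathway", "retention_intent") are hypothetical placeholders for whatever your instruments actually collect.

```python
# Minimal sketch: disaggregate survey results and flag outliers (hypothetical columns).
import pandas as pd

def disaggregate(responses: pd.DataFrame) -> pd.DataFrame:
    """Summarize a rating item within subgroups rather than only overall."""
    return (responses
            .groupby(["cohort", "cert_pathway"])["retention_intent"]
            .agg(["count", "mean", "std"])
            .reset_index())

def flag_outliers(responses: pd.DataFrame, col: str = "retention_intent",
                  z: float = 2.0) -> pd.DataFrame:
    """Return respondents more than z standard deviations from the mean,
    so unusual cases (especially successful ones) can be examined, not discarded."""
    scores = responses[col]
    mask = (scores - scores.mean()).abs() > z * scores.std()
    return responses[mask]

if __name__ == "__main__":
    df = pd.DataFrame({
        "cohort": ["2010", "2010", "2011", "2011"],
        "cert_pathway": ["secondary math", "secondary science"] * 2,
        "retention_intent": [4, 5, 2, 5],   # e.g., a 1-5 Likert item
    })
    print(disaggregate(df))
    print(flag_outliers(df))
```

The same table yields both subgroup summaries and flagged cases, so diversity within groups and informative outliers are examined rather than averaged away.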
Disseminate & Use the Results
28
• Inform a wide range of stakeholders
  o Cultural sensitivity and responsiveness increase both the truthfulness and utility of the results
  o Involve/engage a variety of stakeholders
  o Find and train information users
  o The “personal factor” greatly accelerates evaluation use
• Make use consistent with the purpose of the evaluation
  o Situate interpretation of results
  o Use results to make decisions about program improvement
Task:
29
• What data collection procedures are you considering in designing next year’s Noyce evaluation?
  1. Existing data
  2. New data collection plans
Tools that help:
30
• EX 1: Logic modeling
  o Planning tool
  o Flow diagram of your program with defined goals, inputs, outputs, and outcomes connected through causal links
  o Visual representation of what and how a program produces its outcomes
  o Resources
Program Logic Model
31
Activities or inputs → Products or outputs → Outcomes (see the sketch below):
• Short-term (immediate) outcomes: knowledge, skills, and abilities; changes in the environment
• Mid-term outcomes: behavior change, application of new skills/tools, impacts on the environment
• Long-term outcomes: results or change/improvement in an issue or in effectiveness
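
The flow above can also be captured as a simple data structure, so each outcome stays linked to the activities and outputs meant to produce it. This is a minimal sketch, not part of the original slides; the class name, field choices, and every example entry are hypothetical illustrations.

```python
# Minimal sketch: a logic model captured as a data structure (hypothetical example).
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    resources: list[str] = field(default_factory=list)          # inputs
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)            # products
    short_term_outcomes: list[str] = field(default_factory=list)
    mid_term_outcomes: list[str] = field(default_factory=list)
    long_term_outcomes: list[str] = field(default_factory=list)

    def as_flow(self) -> str:
        """Render the model as the left-to-right flow shown on the slide."""
        stages = [
            ("Resources", self.resources),
            ("Activities", self.activities),
            ("Outputs", self.outputs),
            ("Short-term outcomes", self.short_term_outcomes),
            ("Mid-term outcomes", self.mid_term_outcomes),
            ("Long-term outcomes", self.long_term_outcomes),
        ]
        return "\n".join(f"{name}: {', '.join(items) or '(none yet)'}"
                         for name, items in stages)

# Hypothetical entries, for illustration only.
model = LogicModel(
    resources=["Noyce scholarships", "mentor teachers"],
    activities=["summer institutes", "induction support"],
    outputs=["scholars certified", "PD hours delivered"],
    short_term_outcomes=["gains in content knowledge"],
    mid_term_outcomes=["changed classroom practice"],
    long_term_outcomes=["retention in high-need schools"],
)
print(model.as_flow())
```

Writing the causal links down this explicitly makes gaps visible: an outcome with no activity feeding it signals the "faulty logic" design challenge discussed later in the deck.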
EX 2: The Data Wise improvement process (Boudett, City & Murnane)
32
Stage I: PREPARE
1. Organize for Collaborative Work
2. Build Assessment Literacy
Stage II: INQUIRE
3. Create a Data Overview
4. Dig into Data
5. Examine Instruction
Stage III: ACT
6. Develop an Action Plan
7. Plan to Assess Progress
8. Acting and Assessing
EX 3: Some Evaluation Methods
33
Kirkpatrick’s Evaluation Levels: 1 Reaction, 2 Learning, 3 Behavior, 4 Results
Methods are matched to the levels they can inform: surveys, questionnaires/interviews, and focus groups span all four levels, while knowledge tests/checks, work review, skills observation, presentations/teach-backs, action planning, action learning, and key business/HR metrics each address a subset of the levels.
Impacts of Noyce Projects
34
• Number of K-12 science and math teachers trained
• Number of students directly impacted
• Number of partner schools involved
• Gains in teacher content knowledge
• Gains in student achievement
• Improved teaching strategies
• Increased student achievement
• Increased funding for science supplies & equipment for the region’s schools
• IHE faculty visiting schools on a regular basis
• Creating innovative course processes/materials
Design Challenges
35
• Based on faulty logic
  o Selected strategy or activities cannot make the intended changes
• Failure to connect with the target population(s)
  o Do not reach them
  o Do not resonate with them
  o Not understood by them
• Failure to be well implemented
  o Settings inappropriate
  o Incompatibility between program and delivery setting
  o Unrealistic (untested) expectations
Challenges: Assessment issues
36
• Measuring problem-solving skill
• Statistical significance
• Adequate sample size
• Cost of some assessment methods: how much should projects spend?
• Details: effective controls in matched comparisons
• Longitudinal effects may be important but realized “down the road”
• Tests are not always the best measure of student achievement
• Hard to “analyze” qualitative impact data
• Standard test culture is focused on “factual knowledge”
Tips for Noyce Evaluators
37
• In order to get the most out of program evaluation, you need to figure out what questions you want answered.
• The goals of each component of the evaluation process need to be clear in your mind; often these can be negotiated with your evaluator, but certainly they should be laid out clearly and discussed.
• PI-Evaluator team: review the evaluation plan annually.
• Use the information you get! Even if it seems that you have wonderful rapport with participants, they might look at things differently when a third party asks the questions, and you'll learn more.
References
38
• Boudett, K., City, E., and Murnane, R. J. (Eds.) (2005). Data Wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
• Gadja, R. Community of Practice Collaboration Rubric: www.asdk12.org/depts/cei/about/communities/CollaborationRubric.pdf
• Kirkpatrick, D. L., and Kirkpatrick, J. D. (2006). Evaluating Training Programs (3rd Ed.). San Francisco, CA: Berrett-Koehler Publishers.
• Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd Ed.), p. 10. Thousand Oaks, CA: Sage Publications.
• Patton, M. Q. (2003). Qualitative Evaluation Checklist: www.wmich.edu/evalctr/checklists
• Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation. American Educational Research Association Monograph Series on Evaluation, No. 1, pp. 39-83. Chicago: Rand McNally.
• Wholey, J. S., Hatry, H. P., and Newcomer, K. E. (2010). Handbook of practical program evaluation (3rd Ed.), pp. 5-60. San Francisco: Jossey-Bass.
References
39
• W. K. Kellogg Foundation Evaluation Toolkit: http://ww2.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0
• W. K. Kellogg Foundation Evaluation Handbook (1998): http://www.publichealth.arizona.edu/chwtoolkit/PDFs/Logicmod/chapter1.pdf
• The Centers for Disease Control and Prevention provides a set of evaluation resources in a variety of topical areas, available at http://www.cdc.gov/eval/resources.htm
• Program Development and Evaluation (University of Wisconsin-Extension): http://www.uwex.edu/ces/pdande/evaluation/
• Worthen, B. R., Sanders, J. R., and Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd Ed.), p. 7. New York: Longman Publishers.