Transcript Slide 1

Center to Improve Project Performance
Making Personnel Development Effective:
Using Outcome Data for Program Improvement
Tom Fiore
Cynthia Helba
Susan Berkowitz
Michael Jones
Jocelyn Newsome
OSEP Project Directors’ Conference,
Washington, D.C., July 23, 2012
Overview
• A structure for using data for program improvement
 Project logic model
 Evaluation inquiry model
 Feedback loops
• Useful tools
 Surveys
 Sampling
 Interviews and focus groups
• Summary
2
Basic Project Logic Model
• Goals: Increase the ability of teachers to make evidence-based practice decisions about teaching and intervention dilemmas that arise in daily practice
• Activities: Develop, validate, and disseminate training modules
• Outputs: Middle school teachers download and report using modules
• Outcomes: Middle school SPED teachers demonstrate increased ability to make and apply evidence-based practice decisions
3
Basic Project Logic Model—Data of Interest
• Output—something that is counted
 [Number of] Training modules developed
 [Number of] Training events held
 [Number of] Individuals trained
• Outcome—something that is measured
 Trainees report, at 6 months post-training, that they learned something
and are using it [percent of learning or percent of information used]
 Trainees demonstrate proficiency on trained skills [scores obtained
through an observation protocol]
 Customers of those trained demonstrate improved performance [scores
obtained through a standardized instrument]
4
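For readers implementing this kind of tracking, the counted-versus-measured distinction is easy to operationalize. Below is a minimal Python sketch; the trainee records and field names are entirely hypothetical, not taken from the presentation.

```python
# Hypothetical 6-month follow-up records for trainees; all field names and
# values are illustrative, not taken from the presentation.
trainees = [
    {"id": 1, "completed_training": True,  "using_skills_6mo": True},
    {"id": 2, "completed_training": True,  "using_skills_6mo": False},
    {"id": 3, "completed_training": True,  "using_skills_6mo": True},
    {"id": 4, "completed_training": False, "using_skills_6mo": False},
]

# Output: something that is counted.
num_trained = sum(t["completed_training"] for t in trainees)

# Outcome: something that is measured -- here, the percent of those trained
# who report still using the training at 6 months.
trained = [t for t in trainees if t["completed_training"]]
pct_using = 100 * sum(t["using_skills_6mo"] for t in trained) / len(trained)

print(f"Output: {num_trained} individuals trained")
print(f"Outcome: {pct_using:.0f}% report using the training at 6 months")
```

The point is only that outputs fall out of simple counting, while outcomes require a measurement instrument and a denominator.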
Evaluation Inquiry Model
• A logic model represents the project's theory of action as a linear process
• An evaluation inquiry model is non-linear; it is a discovery model focused on the project's theory of change
• Specifically, it puts project planning and decision-making
functions at the center of the model, and it embeds
outcomes within that function
5
Evaluation Inquiry Model
[Figure: the evaluation inquiry model, showing Vision and Purpose-Setting; Planning and Decision-Making (with Outcomes embedded); Implementation; and Evaluation (formative and summative)]
6
Evaluation Inquiry Model
• Four functions to focus on when using data for program improvement
– Vision and Purpose-Setting
o Define the fundamental intention of the project
o Establish an overall direction and purpose
– Planning and Decision-Making
o Allocate resources
o Design implementation strategies
o Establish measurable expectations
7
Evaluation Inquiry Model
• Four functions to focus on when using data for program improvement (cont.)
– Implementation
o Execute strategies and activities
– Evaluation
o Establish metrics for measuring outputs and outcomes
o Collect, analyze, and interpret data
8
Evaluation Inquiry Model
• Why outcomes are inside the Planning and Decision-Making box—keeping an eye on the prize
– Having outcomes at the far right implies a strictly linear process
o But projects that use data to follow their progress don't operate linearly
– Also, having outcomes at the end suggests that they can be achieved, and achieved within the life of the project—that the work is done
9
Evaluation Inquiry Model with Feedback Loops
[Figure: the evaluation inquiry model with four feedback loops added, labeled A through D (A: evaluation planning; B: Feedback 1; C: Feedback 2, Worth; D: Reflection); the loops are described on the next slide]
10
Evaluation Inquiry Model with Feedback Loops
A. Evaluation and analysis plans are developed—outputs and
outcomes associated with activities are identified, selected,
and measured
B. Evaluative data are used to adjust plans, activities, and
management—implementation is or is not occurring as
expected
C. Evaluative data are used to determine the worth or value of
outputs and immediate outcomes in ultimately achieving the
desired long-term outcomes
D. Reflections on the worth or value of activities, outputs, and
outcomes lead to modifications of the vision/purpose, which
leads to changes in plans and activities
11
Collecting and Using Data as Feedback
Evaluative feedback will be useful to your project if:
• The evaluation data provide accurate information about
the results of your project’s activities and outputs
• The data are targeted so that they are useful for the
specific purpose of making changes to your project
• The data are accurate and can be collected within your
project’s resources
• The data are available in time to allow for change to your
project
12
Useful Tools for Obtaining Formative
Feedback
Three tools for collecting data for formative purposes:
• Surveys
- Perhaps the most common means of collecting data about professional
development activities
- Must be done correctly to obtain accurate and analyzable information
• Sampling
- An often overlooked way to make data collection more efficient and accurate
• Interviews
- Another common means of collecting feedback about project activities
- A method for understanding data collected through other means and for drilling down to the specific information needed to make changes to a project
13
Surveys
Step 1: Data Analysis Plan
– Most important step
– Questionnaire development should be driven by project goals
[Figure: Project goals → What information do you need to evaluate your project?]
14
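One way to make Step 1 concrete is to write the analysis plan down as an explicit goal-to-item mapping before drafting any questions. Below is a minimal Python sketch; the goal, item names, and fields are invented for illustration, not drawn from the presentation.

```python
# A hypothetical entry in a data analysis plan: every planned survey item
# should trace back to a project goal and a planned analysis. The structure
# and contents are invented for illustration.
analysis_plan = [
    {
        "goal": "Teachers make evidence-based practice decisions",
        "information_needed": "Self-reported use of module strategies",
        "planned_items": ["Q5: frequency of strategy use",
                          "Q6: barriers to strategy use"],
        "planned_analysis": "Percent using strategies weekly, by school",
    },
]

# A simple completeness check: flag any goal with no items drafted yet.
for entry in analysis_plan:
    if not entry["planned_items"]:
        print(f"No items drafted for goal: {entry['goal']}")
```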
Surveys
Developing a Data Analysis Plan
[Figure: How will you ask? → Mode choice]
15
Surveys
Field Considerations
 What languages will you ask in?
 Shapes item development because of wording
and cultural implications
 Are there other components of your project
evaluation or your data collection activities, e.g.,
classroom observation or other surveys?
 Shapes length and administration of questionnaire
16
Surveys
Step 2: Item Development
[Figure: Project goals → Item topics → Items]
17
Surveys
Reality Check
[Figure: Respondent stores information → Respondent comprehends question → Respondent retrieves information → Respondent chooses to answer question → Respondent is able to fit answer with response options]
18
Surveys
Reality Check
– Respondent stores information
• How many steps did you take yesterday?
– Respondent comprehends question
• Do you favor or oppose not allowing the state to raise taxes without approval of 60% of the voters?
– Respondent retrieves information
• What did you eat for lunch last Tuesday?
– Respondent chooses to answer question
• How often do you steal candy from small children?
– Respondent is able to fit answer with response options
• How tall are you? ___ cm
If the respondent cannot understand the question, cannot remember the answer (or never knew it), doesn't want to give you the answer, or can't figure out how to give you the answer, then you can't ask the question.
19
Surveys
Elements of a Good Item
– Asks only one question at a time (avoids double-barreled questions)
Problem example: Do you exercise regularly and eat healthy foods?
– Contains a clear threshold for answering "yes"
Problem example: Have you seen a doctor in the past month?
– Provides a timeframe
Problem example: How many times do you eat in a restaurant?
– Provides a timeframe appropriate to the topic
Problem example: In the past 12 months, how many times have you sneezed?
20
Surveys
Elements of a Good Item (continued)
– Uses clear terminology and plain language
Problem example: How strong is your fertility motivation?
– Gives exhaustive and mutually exclusive response options
Problem example: From which of these sources did you first learn about the tornado in Derby?
Radio / Television / Someone at work / While at home / While traveling to work
21
Surveys
No item is an island
– How questions are answered can be shaped by:
• Ordering of items
Should U.S. reporters be allowed into the Soviet Union?
Should reporters from the Soviet Union be allowed into the U.S.?
• Mode of administration
[Figure: the same question, "How many sex partners have you had in the past 12 months? ___", shown side by side as it would appear in two different modes of administration]
22
Surveys
Step 3: Review and Testing
 Content Expert Review
A content expert review can identify improper use of terms
or concepts in the field
How many students do you serve under Part C?
VERSUS How many children do you serve under Part C?
 Methodological Expert Review
A methodological review can identify well-known
(but not obvious) issues with items
In the past week VERSUS In the past seven days
23
Surveys
Step 3: Review and Testing (continued)
 One-on-One Testing
This testing involves an in-depth interview with people like your
respondents. It explores how they are understanding and
answering the questions.
Has a student in your classroom been expelled this year?
When did that happen?
Tell me how you came up with your answer
 Field Testing
A field test is an opportunity to see how the questionnaire
actually works, identifying any items respondents have difficulty
answering (or interviewers have difficulty asking). It is also an
opportunity to test procedures and protocols.
24
Surveys
Step 4: Rinse and repeat.
[Figure: a cycle: Project goals → Item topics → Item development → Review → Testing → back to Project goals]
25
Surveys
Reading List
– Don Dillman, Jolene Smyth, & Leah Melani Christian. Internet, Mail, and Mixed-Mode Surveys. 3rd edition. 2009.
– Robert Groves, Floyd Fowler, Mick Couper, James Lepkowski, Eleanor Singer, & Roger Tourangeau. Survey Methodology. 2004.
– Janet Harkness et al. Survey Methods in Multinational, Multiregional, and Multicultural Contexts. 2010.
– Gordon Willis. Cognitive Interviewing: A Tool for Improving Questionnaire Design. 2005.
26
Sampling
• A Hypothetical Example
• A program is implemented in 20 high schools in Montgomery County, MD
• Within each school, 10 teachers are selected to be trained
• Once trained, teachers instruct "upper class" students
• Trained students are then instructed to mentor 3 to 5 freshmen throughout the school year
27
Sampling
• A Hypothetical Example (continued)
 Program needs to be evaluated
• Have teachers been trained?
• Once trained, have they instructed students in the
program?
• Once students are instructed, have they mentored
fellow students?
• Is it having a positive effect on the mentored
students?
28
Sampling
• Sample vs. Census
Census: every unit in a population
(e.g., schools, or persons) is surveyed
Example: all teachers, all mentor
students, and/or all mentored students
29
Sampling
• Sample vs. Census (continued)
Sample: a portion of the population is
surveyed
Example: a portion of the teachers, mentor students, and mentored students is selected
30
Sampling
• Why sampling? Why not a census?
Sampling
1) saves time,
2) saves money, and
3) if done correctly, answers questions
more accurately
31
Sampling
• How can a sample be more accurate than a census?
– Generally, a census is a mammoth undertaking; many kinds of errors can be introduced, such as a lower response rate or lax data integrity
– A sample may allow for increased nonresponse follow-up
– More resources can be spent on data integrity
32
Sampling
• Considerations when implementing sampling
How precise do your estimates need to be?
• Do you only want a general idea of how the
program is doing, or
• Are you making comparisons between groups?
33
Sampling
• Considerations when implementing sampling
(continued)
 How large should the sample be?
Depends on many factors
• Complexity of the sample
– Often, the more complex the sample is, the
larger it needs to be
– Try to keep it “simple”
34
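For readers who want a starting point on "how large," here is a sketch of one standard calculation: the sample size needed to estimate a proportion from a simple random sample, with a finite population correction. The function name, the conservative p = 0.5 default, and the example numbers are all mine, and this covers only the simplest design; as the slides advise, consult a statistician for anything more complex.

```python
import math

def sample_size_for_proportion(margin_of_error, population_size,
                               p=0.5, z=1.96):
    """Approximate n for estimating a proportion from a simple random
    sample: n0 = z^2 * p * (1 - p) / e^2, then a finite population
    correction. p = 0.5 is the most conservative choice; z = 1.96
    corresponds to 95% confidence."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n)

# Hypothetical: ~200 trained teachers (20 schools x 10 teachers each).
print(sample_size_for_proportion(0.05, 200))  # +/- 5 points -> 132 of 200
print(sample_size_for_proportion(0.10, 200))  # +/- 10 points -> 66
```

Note how doubling the acceptable margin of error cuts the uncorrected sample size by roughly a factor of four.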
Sampling
• Considerations when implementing sampling
(continued)
 A sample should be RANDOM
• Results can be generalized to the population
• Leads to unbiased estimates
• Increases credibility of results
35
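As a sketch of what "random" can mean in practice for the earlier 20-school example: Python's standard-library random.sample draws a simple random sample without replacement. The rosters and the choice of 3 teachers per school are hypothetical.

```python
import random

random.seed(42)  # fix the seed so the selection is documented and reproducible

# Hypothetical rosters: 10 trained teachers in each of 20 high schools.
rosters = {f"school_{s:02d}": [f"teacher_{s:02d}_{t}" for t in range(1, 11)]
           for s in range(1, 21)}

# Simple random sample of 3 teachers per school: every teacher has the same
# chance of selection, which is what lets results generalize to all trained
# teachers and keeps estimates unbiased.
sample = {school: random.sample(teachers, k=3)
          for school, teachers in rosters.items()}

print(sample["school_01"])
```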
Sampling
• Considerations when implementing sampling
(continued)
 If a sample is not RANDOM
• Can lead to biased estimates and poor credibility
 Not all samples can be random
– Avoid “convenience” samples
• The sample should be designed during program
planning
36
Sampling
• Considerations when implementing sampling (continued)
– The complexity of the survey instrument
• How long is the instrument?
• Does the survey use established measures or new and unproven measures?
37
Sampling
• Considerations when implementing sampling
(continued)
 What resources are available?
• How much time and money can be spent?
• How many people are available to work on the
survey?
– from inception to data collection to analysis of the
data
38
Sampling
• Another Consideration – Measurement Scales
– E.g., ordinal scales
– Rank your three favorite flavors of ice cream:
1. Mint chocolate chip
2. Toffee bar crunch
3. Vanilla
Is 1 close to 2, or is 1 a clear favorite?
39
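A small sketch, with invented numbers, of why ranks alone can't answer that question: the same preference order is compatible with very different spacings.

```python
# Invented numbers: the same preference order expressed as ordinal ranks
# and as ratings on a 0-10 scale.
ranks   = {"mint chocolate chip": 1, "toffee bar crunch": 2, "vanilla": 3}
ratings = {"mint chocolate chip": 9.5, "toffee bar crunch": 4.0, "vanilla": 3.5}

# Ranks are ordinal: the distance between rank 1 and rank 2 is undefined,
# so arithmetic on ranks (averages, differences) can mislead. The ratings
# show a clear favorite and a near-tie for second -- information the ranks
# alone cannot carry.
for flavor in ranks:
    print(f"{flavor}: rank {ranks[flavor]}, rating {ratings[flavor]}")
```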
Sampling
• If you have questions…
...consult a statistician and/or a survey
methodologist.
40
Sampling
• The statistician should ask you questions
– What is it that you’re trying to answer?
– What are you measuring?
– How precise do you need your estimates to be?
– Are you looking to make comparisons between
groups?
– Etc. etc.
 The answers to these kinds of questions will determine
what kind of sample design will best suit your purposes
and how large your sample should be.
41
Interviews and Focus Groups
• What Are Qualitative Methods of Data Collection?
• Both in-depth interviews and focus groups pose
questions designed to ascertain and explore people’s
views on a given subject in their own terms and
framework of understanding.
• Both use semi-structured or open-ended instruments
that concentrate on clearly defined subjects or sets of
interrelated topics.
• Often (but not only) used to explore how a program or project is being implemented (process evaluation)
42
Interviews and Focus Groups
• What Are Qualitative Methods of Data Collection?
(continued)
• Intensive interviews are usually conducted one-on-one, by phone or in person; they can vary in length from 30 minutes to several hours (more than about 1½ hours in one sitting is usually too long)
• Focus groups pose a question or series of questions on a selected topic or theme to a group of 4-10 persons
43
Interviews and Focus Groups
• Why Use Qualitative Methods of Data Collection?
• Can add depth of understanding to evaluation
question(s) not obtainable via close-ended survey
responses; allow for probing and back-and-forth
between interviewer/moderator and participants
• Can be complementary to a survey component (e.g.,
help to explain survey responses) or stand alone
(e.g., as a way of gaining deeper understanding of
selected issues)
44
Interviews and Focus Groups
• Selecting Cases for Qualitative Methods
• Principles and goals of purposive sampling not the
same as for probability sampling
• To add deeper understanding of “information-rich”
cases
• Depends on goals of evaluation and important analytic categories: e.g., can select to achieve regional coverage; to account for differences in school, program, or classroom structures; to include only "successful" cases; or to capture typical or extreme cases
45
Interviews and Focus Groups
• Designing In-depth Discussion Guides and Focus Group Moderator Guides
• Concise; logical development of themes or topics (may not correspond to the order of evaluation questions)
• Questions should be open-ended (not yes/no), not leading ("Don't you think that...?"), and should lend themselves to thoughtful exploration
• Framed broadly enough to encourage discussion, but not so broadly that the respondent has no idea what is being sought ("Tell me about yourself")
46
Interviews and Focus Groups
• Common Misconceptions about Qualitative Data
Collection
• It’s easy
• It’s “purely subjective”
• Anyone with good people skills can be a good
qualitative interviewer/focus group moderator
47
Interviews and Focus Groups
• Qualities Needed in Qualitative Data Collectors
• Not the same as those for interviewers administering
close-ended surveys
• Should have knowledge of subject matter
• Active role
• Ability to guide dialogue, draw out respondents, and probe on important issues without leading or steering
• Adaptable to context without ceding control
48
Interviews and Focus Groups
• Analysis of Qualitative Data
• The value of collecting qualitative data can be seriously impaired without someone who has the experience and expertise to make systematic sense of it
• Qualities needed in the analysis of qualitative (word-based, thematic) data are not necessarily the same as those important in the collection of these data
• Best when combined and integrated well with survey findings (in a mixed-method study)
49
Interviews and Focus Groups
• What Value Does Qualitative Data Add to a
Formative Evaluation?
• Can identify important and/or unexpected themes, issues, or areas of concern that can be acted upon in "real time," enabling course correction
• Better understanding of the "world view" and issue-framing of specific subgroups of interest
• Allows for a more fluid process and evolving understanding in guiding both the evaluation and the program or project that is the subject of the evaluation
50
Summary
• Provided a structure for conceptualizing the use of
outcome data to improve program performance
• Presented ideas about how to effectively collect
accurate survey data
• Urged consideration of the use of sampling, and
explained some of the factors to contemplate in
deciding whether to use sampling
• Described uses of and techniques for employing
interviews and focus groups
51
Whom CIPP Can Help
CIPP provides technical assistance on formative
and summative evaluation to current OSEP
grantees funded through the following programs:
• Parent Information Centers
• Personnel Development
• Technical Assistance and Dissemination
• Technology and Media Services
52
CIPP Contact Information
 Center to Improve Project Performance (CIPP)
888-843-4101
[email protected]
 Tom Fiore, Westat
919-474-0349
[email protected]
 Patricia Gonzalez, OSEP Project Officer
202-245-7355
[email protected]
53