DESIGNING EVALUATION
INSTRUMENTS
Class X
AEE 577
• Upon completion of this lesson, students should be able to:
– List the step by step procedures for developing quality
evaluation instruments;
– Describe the errors that must be controlled in evaluation
instruments;
– Develop different forms of questions to record outcomes such
as change in knowledge, attitudes, skills, aspirations, and
behaviors;
– Write process evaluation questions;
– Describe reliability and validity;
– Identify double-barreled questions; and
– Develop an evaluation instrument.
How to Design Your Data
Collection Instrument?
• Where to begin?
Begin with the information
needs of key stakeholders
• Information needs for program improvement
• Information needs for accountability
Designing Instruments
Step 1: Identify the type of data and
information you need to collect.
• Focus on the information needs of key
stakeholders.
• Clearly identify what data and information need to be collected for this purpose.
• Identify the major categories of information
that you need to collect.
• List subcategories of information under major
categories.
Designing Instruments
Step 2: Develop the “Sketch” of
Your Instrument.
• List the major items in your instrument to
structure it.
• Organize the structure of your instrument to
collect needed data.
• Organize subcategories under each major topic.
• Include the demographic data collection section
at the very end of the instrument.
Designing Instruments
Step 3: Identify Necessary Scales
and Questions.
• Determine the types of scales you need to
include in your instrument.
• Determine the types of questions you
need to ask.
Designing Instruments
Step 4: Be Consistent in Numbering
Answer Choices and Scales
• It is a good idea to use low numbers for lower manifestations of the variable being measured.
• Example:
  1. High school diploma
  2. Bachelor's degree
  3. Master's degree
  4. Doctorate
• By using a consistent pattern throughout the instrument, you can easily interpret results.
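For illustration, a minimal Python sketch (hypothetical codes and responses) of why a consistent low-to-high numbering scheme makes numeric summaries easy to interpret:

# Hypothetical codebook: low numbers always stand for lower levels of the variable.
education_codes = {
    1: "High school diploma",
    2: "Bachelor's degree",
    3: "Master's degree",
    4: "Doctorate",
}

# Illustrative responses coded with the scheme above.
responses = [1, 2, 2, 3, 4, 2]

# Because the coding is consistent, a higher mean always means more formal education.
mean_code = sum(responses) / len(responses)
print(f"Mean education code: {mean_code:.2f} (higher = more formal education)")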
Designing Instruments
Step 5: Writing Questions
• As a general rule, when writing questions, ask yourself, "Why am I asking this question?"
• Always keep your evaluation information needs in mind.
• Think about the answer before you write any question.
• There are two ways to write a question:
  – Open-ended
    Example: What methods do you use to educate farmers on sustainable agriculture?
  – Closed-ended
    Example: What methods do you use to educate farmers on sustainable agriculture?
      • Field days
      • Workshops
      • Seminars
      • Printed materials
      • Electronic materials
      • Other (please specify) ___________________
Designing Instruments
Writing Open-Ended Questions
• Things to remember when writing questions:
– Write questions clearly and concisely.
– Start with the least sensitive or non-threatening questions.
– Write questions by thinking about the reading level of the
target population.
– Avoid double negatives.
– Avoid double-barreled questions.
– Example: Are you satisfied with the place and time of the
program?
Designing Instruments
Writing Open-Ended Questions
– Open-ended questions are useful to explore a topic in depth.
– However, open-ended questions are difficult to:
• Respond to
• Analyze
– Therefore, limit the number of open-ended questions to the
needed minimum.
– When you need to ask a sensitive question, it is appropriate to use a closed-ended question with response categories for the sensitive information.
– Example: Asking income or age (ask "What is your age group?" and provide age categories instead of asking "How old are you?").
Designing Instruments
Writing Closed-Ended Questions
• When writing closed-ended questions:
– Make sure to include all possible response categories.
– If you have not included all possible answer categories, it is a
good idea to include a category called ‘Other’ and provide
instruction to specify what the respondent means under this
category.
– Make sure that your answer categories are mutually exclusive.
– Example: What is your age group?
• Less than 20 years
• 20-30 years
• 31-40 years
• 41-50 years
• Above 50 years
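For illustration, a minimal Python sketch (hypothetical function and ages) showing that the age categories above are mutually exclusive and exhaustive, so every respondent fits exactly one category:

def age_group(age: int) -> str:
    # Each age maps to exactly one category; the category boundaries do not overlap.
    if age < 20:
        return "Less than 20 years"
    elif age <= 30:
        return "20-30 years"
    elif age <= 40:
        return "31-40 years"
    elif age <= 50:
        return "41-50 years"
    else:
        return "Above 50 years"

for age in [18, 20, 30, 31, 50, 51]:
    print(age, "->", age_group(age))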
Designing Instruments
Writing Closed-Ended Questions
– Closed-ended questions are:
• Easy to analyze.
• Not exploratory in terms of uncovering new information.
Scale Development
• Develop scales if you need to include them in your instrument.
Guidelines For Scale
Development
• Scales are developed for measuring elusive
phenomena that cannot be observed directly.
Example: Attitudes, Aspirations.
• Therefore, scale development should be
based on the theories related to the
phenomenon to be measured.
• Thinking clearly about the content of a scale
requires thinking clearly about the construct
being measured.
Guidelines For Scale
Development
Generate an Item Pool
• The properties of a scale are determined
by the items that make it up.
• At this stage, you need to develop more
items than you plan to include in the final
scale.
Characteristics of Good Items
• Unambiguous.
• Avoid exceptionally lengthy items.
• Consider reading levels of the target respondents.
• Include positively and negatively worded items. The purpose of wording items both positively and negatively within the same scale is usually to avoid acquiescence, affirmation, or agreement bias.
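For illustration, a minimal Python sketch (hypothetical items and responses) of how negatively worded items are typically reverse-scored before the items are combined into one scale score, assuming a 1-5 agreement format:

negatively_worded = {"item_2", "item_4"}          # hypothetical negatively worded items
respondent = {"item_1": 4, "item_2": 2, "item_3": 5, "item_4": 1}  # 1-5 responses

def reverse_score(value, scale_max=5):
    # On a 1-5 scale, 1 becomes 5, 2 becomes 4, and so on.
    return scale_max + 1 - value

scored = {
    item: reverse_score(value) if item in negatively_worded else value
    for item, value in respondent.items()
}

scale_score = sum(scored.values()) / len(scored)   # mean of the (re)scored items
print(scored, scale_score)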
Guidelines For Scale
Development
Determine the Format for Measurement
• There are different formats
• Identify the format you would like to use with
your items.
• Determine how many response categories you
need to include in your format.
Guidelines For Scale
Development
Determine the Format for Measurement
• The number of response categories should be limited to
the respondents’ ability to discriminate meaningfully.
• Normally 5-7 response categories are adequate for
extension and education program evaluations.
• Example:
  1. Strongly disagree
  2. Disagree
  3. Neutral
  4. Agree
  5. Strongly agree
Guidelines For Scale
Development
Likert Scale
• Named after Rensis Likert.
• This is the most common format.
• The response options should be worded so as to
have roughly equal intervals with respect to
agreement. That is to say the difference in agreement
between any adjacent pair of responses should be
about the same as for any other adjacent pair of
response options.
• Common choices for a midpoint include "Neither agree nor disagree" and "Neutral."
Guidelines for Scale
Development
Likert Scale
Example for items in Likert format
1. Strongly Disagree
2. Disagree
3. Neutral
4. Agree
5. Strongly Agree
Guidelines For Scale
Development:
Semantic Differential Scaling
• There are several numbers between the
adjectives that constitute the response
options.
• Example: The quality of the training session
  Poor 1 2 3 4 5 6 7 Excellent
Example For a Scale
(Recording Anxiety)
                     Not at all   Somewhat   Moderately   Very much
1. I feel calm            1           2           3            4
2. I am tense             1           2           3            4
3. I feel upset           1           2           3            4
4. I am relaxed           1           2           3            4
5. I feel content         1           2           3            4
6. I am worried           1           2           3            4
Instrument Development
Step 6 Provide Necessary Instructions to
Complete the Survey
• Clear instructions are essential to facilitate the response process.
• Instructions should be clearly and politely
stated.
• Clear instructions increase your return rate as well as the accuracy of your data.
When You Develop a
Questionnaire:
• Keep it short, simple, and clear
• Include only the questions needed for your indicators
• Make it compatible with the reading level of the respondents
• When you use closed-ended questions make
sure to include all possible answer choices.
Instrument Development
Step 7 Format Your Instrument
• Appearance and editing of your
instrument are important determinants
of response rate.
• Therefore, format, structure, and edit
your instrument professionally.
Instrument Development
Step 8 Establish Validity and
Reliability of Your Instrument
• Reliability refers to the extent to which a measuring
instrument is consistent in measuring what it
measures.
– Test-retest method: Administer the instrument to a sample of subjects on two occasions and correlate the paired scores to establish reliability (a short sketch follows below).
• Validity refers to the extent to which an instrument
measures what it intends to measure.
– Use experts’ views to establish validity.
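For illustration, a minimal Python sketch (hypothetical scores) of the test-retest method: the same instrument is administered twice and the paired total scores are correlated.

from scipy.stats import pearsonr

time1 = [32, 45, 28, 39, 41, 35, 30]  # hypothetical total scores, first administration
time2 = [30, 47, 29, 38, 43, 33, 31]  # the same respondents, second administration

r, p = pearsonr(time1, time2)
print(f"Test-retest reliability (Pearson r) = {r:.2f}")
# As a rough rule of thumb, correlations of about 0.70 or higher are often treated as acceptable.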
APPLICATION OF STEPS
Determine Your Evaluation
Questions
• Identify the precise questions that need to be answered.
• Use the logic model to narrow the focus
of evaluation.
LOGIC MODEL
Measuring Program Impact
INPUTS: What resources does your program need to achieve its objectives?
  Staff, volunteers, time & money, materials, equipment, technology, partners

OUTPUTS
  Activities: What should you do in order to achieve program goals and objectives?
    Workshops, meetings, camps, demonstrations, publications, media, web site, projects, field days
  Participation: Who should participate, be involved, be reached?
    Number of target clients, their characteristics, their reactions

OUTCOMES - IMPACT
  LEARNING: What do you expect the participants will know, feel, or be able to do immediately after the program?
    Awareness, knowledge, attitudes, skills, aspirations
  ACTION: What do you expect that participants will do differently after the program?
    Behavior, practice, decisions, policies, social action
  IMPACT: What kind of impact can result if the participants behave or act differently?
    Social, economic, environmental
Possible Question Categories
• Process evaluation questions (These are mostly open-ended)
– Questions on client characteristics
• How do you describe your ethnicity?
– Questions on program delivery
• What are the strengths of this program?
• What are the weaknesses?
• Impact evaluation questions
– Questions on client satisfaction
• Did the target clients find the program useful?
– Outcomes
• Did the program participants change their KASA (knowledge, attitudes, skills, and aspirations)?
• Did the program participants change their practices?
– Impacts
• Did the participants save money or improve their health?
What Data Are Needed for
Program Improvement?
• Were participants satisfied with:
  – Information received
  – Instructors
  – Facilities
  – Quality of training
• What did they like/dislike about the training?
• Did the training meet their expectations?
• If not, why?
• Ideas for further improvement
• Look for data that you can use to fix weaknesses and build on strengths.
How to Collect Training
Improvement Data?
Please circle the appropriate number for your level of response.

How satisfied are you with:                        Not        Somewhat                  Very
                                                 Satisfied   Satisfied   Satisfied   Satisfied
The relevance of information to your needs?          1           2           3           4
Presentation quality of instructor(s)?               1           2           3           4
Subject matter knowledge of instructor(s)?            1           2           3           4
Training facilities?                                  1           2           3           4
The overall quality of the training workshop?         1           2           3           4
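For illustration, a minimal Python sketch (hypothetical data and column names) of how ratings like those above can be summarized to spot weaknesses to fix and strengths to build on:

import pandas as pd

# Each column is one item from the satisfaction scale; 1 = Not satisfied ... 4 = Very satisfied.
ratings = pd.DataFrame({
    "relevance":            [3, 4, 2, 3, 4],
    "presentation_quality": [2, 3, 2, 2, 3],
    "subject_knowledge":    [4, 4, 3, 4, 4],
    "facilities":           [3, 2, 2, 3, 2],
    "overall_quality":      [3, 3, 3, 4, 3],
})

item_means = ratings.mean().sort_values()
print(item_means)  # the lowest-rated items point to areas needing improvement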
How to Collect Training
Improvement Data?
• Did the training session meet your expectations?
  1. Yes
  2. No
• Would you recommend this training workshop to others?
  1. Yes
  2. No
• If not, why: ____________________________________
• What did you like the most about this training?
• What did you like the least about this training?
• How could this training be further improved?
Other Data
Demographics
• What is your gender?
____ Male
____ Female
• How do you identify yourself?
___African American
___American Indian/Alaskan
___Asian
___Hispanic/Latino
___Native Hawaiian/Pacific Islander
___White
___Other
What Data Are Needed for Program
Accountability?
• You need impact data to prove that your program achieved its objectives.
How to Document Perceived
Knowledge Change?
Example for Agriculture
How do you rate your knowledge about:

                                  BEFORE THIS WORKSHOP                     AFTER THIS WORKSHOP
                              Very                       Very          Very                       Very
                              Low   Low  Moderate  High  High          Low   Low  Moderate  High  High
Conservation tillage systems.   1    2      3       4     5             1    2      3       4     5
Crop rotations.                 1    2      3       4     5             1    2      3       4     5
Weed management under
conservation tillage.           1    2      3       4     5             1    2      3       4     5
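For illustration, a minimal Python sketch (hypothetical ratings) of how the before/after ratings above can be summarized to show perceived knowledge change by topic:

import pandas as pd

# 1 = Very low ... 5 = Very high; one row per respondent per topic.
data = pd.DataFrame({
    "topic":  ["tillage", "tillage", "rotation", "rotation"],
    "before": [2, 1, 3, 2],
    "after":  [4, 3, 4, 4],
})

summary = data.groupby("topic")[["before", "after"]].mean()
summary["perceived_gain"] = summary["after"] - summary["before"]
print(summary)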
How to Document Levels of
Aspirations?
• At the end of a successful training session,
participants will have a heightened level of
aspirations to apply what they learned.
• They are ready to "take charge" of what they learned.
• Participants are asked whether they intend to apply
what they learned.
• Example: As a result of this training, do you intend to
drink reduced fat milk? The answers to this question
would be:
  – No
  – Maybe
  – Yes
  – I'm already doing this
How to Document Aspirations?
Example for FCS
Please circle the number that best describes your answer.
As a result of this program, do you intend to:                     No   Maybe   Yes   Already doing this
1. Eat recommended servings from five food groups?                  1     2      3            4
2. Plan meals ahead of time?                                        1     2      3            4
3. Consume reduced or non-fat milk and dairy products?              1     2      3            4
Retrospective Pre and Post
Evaluations
• Advantages:
– Simple and easy way to collect data
• Disadvantages:
– Not appropriate for collecting data from very young audiences or low-literacy adult audiences, because they will not be able to compare the before and after situations retrospectively.
Pre and Post Evaluations
• Pre Evaluation is administered before your training session.
• Post Evaluation is administered at the end of your training session.
• We need to match pre and post evaluations for
comparison.
• Pre and Post Evaluation will document three impact
indicators:
– Change in Knowledge
– Change in Skills
– Levels of Aspirations
How to Document Change in
Knowledge?
• Ask the same set of questions before and after your educational session and compare the answers to document the knowledge gain from the program.
How to Document Change in
Knowledge?
Example for FCS
Please circle your answer to each of the following statements.
1. According to MyPyramid, the recommended single serving size of a raw, chopped vegetable is 1/2 cup.
   True   False   Don't Know
2. According to MyPyramid, the number of servings recommended daily from the Milk, Yogurt & Cheese group is 4 to 5 cups.
   True   False   Don't Know
3. Daily Values (DV) listed on the bottom of some food labels are the same values for all individuals.
   True   False   Don't Know
4. The amount that you need from each group of MyPyramid depends on how many calories you need.
   True   False   Don't Know
5. If you eat more food (calories) than your body needs, the extra calories get stored as body fat.
   True   False   Don't Know
6. Food high in saturated fat increases risk for heart disease.
   True   False   Don't Know
7. The Nutrition Facts label on foods tells you how many calories and nutrients are in one serving.
   True   False   Don't Know
8. Fruits and vegetables contain fiber which helps prevent constipation.
   True   False   Don't Know
9. Foods that contain protein are located in the meat, milk, and grain groups of the MyPyramid.
   True   False   Don't Know
10. Vitamin A found in many fruits and vegetables helps our bodies absorb iron.
   True   False   Don't Know
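For illustration, a minimal Python sketch (hypothetical answer key and responses) of how true/false items like those above can be scored before and after a program to document knowledge gain for one participant:

answer_key = {1: "True", 2: "False", 3: "False", 4: "True", 5: "True"}  # hypothetical key

pre_answers  = {1: "False", 2: "Don't Know", 3: "False", 4: "True", 5: "True"}
post_answers = {1: "True",  2: "False",      3: "False", 4: "True", 5: "True"}

def score(answers, key):
    # "Don't Know" and wrong answers both count as incorrect.
    return sum(1 for item, correct in key.items() if answers.get(item) == correct)

pre_score, post_score = score(pre_answers, answer_key), score(post_answers, answer_key)
print(f"Pre: {pre_score}/{len(answer_key)}  Post: {post_score}/{len(answer_key)}  Gain: {post_score - pre_score}")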
How to Write Knowledge Testing
Questions:
• Don’t use general knowledge questions.
• Don’t include attitudinal or perceptual statements.
– Example: Growers should practice conservation
tillage. __True __False ___Don’t Know
True and False Questions vs
Multiple Choice Questions
• True and False questions save your time
and respondents’ time.
• Easy to analyze.
• Help you keep your survey short.
How to Document Change in
Skills?
• Skill changes are measured indirectly by using participants' levels of confidence in carrying out the tasks learned from the program. Example: Participants' confidence in their ability to calibrate a sprayer.
How to Document Change in
Skills?
• We record their levels of confidence for
carrying out specific tasks before and after
the program on a Likert-type scale.
• Compare pre and post responses to
document changes in skills.
How to Document Change in
Skills?
Example for Agriculture:
How confident are you in your ability to:            Not         A little    Somewhat                  Very
                                                   confident    confident   confident   Confident   confident
1. Keep waste management records?                      1            2            3           4           5
2. Calibrate land application equipment?               1            2            3           4           5
3. Calculate nutrient removal levels?                  1            2            3           4           5
Pre and Post Evaluations
• Advantages:
– Appropriate for young audiences and low-literacy audiences.
• Disadvantages:
– If you want to compare pre and post evaluations
you must match pre and post evaluations for
each participant.
– This is somewhat challenging.
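For illustration, a minimal Python sketch (hypothetical IDs and scores) of matching each participant's pre and post evaluations before comparing them:

import pandas as pd

pre  = pd.DataFrame({"participant_id": [101, 102, 103], "pre_score":  [4, 6, 5]})
post = pd.DataFrame({"participant_id": [101, 103, 104], "post_score": [7, 8, 6]})

# Keep only participants who completed both the pre and the post evaluation.
matched = pre.merge(post, on="participant_id", how="inner")
matched["change"] = matched["post_score"] - matched["pre_score"]
print(matched)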
Change in Attitudes
• Difficult to measure
• Need to be very careful in
designing scales to measure
attitudes
• Not a practical indicator
• Pre/Post tests
CHECKING ATTITUDES
To what extent do you agree or disagree with each of the following statements?

Statement                                    Strongly                                        Strongly
                                             Disagree   Disagree   Undecided   Agree          Agree
a) Conservation tillage is profitable.           1          2          3         4              5
b) Conservation tillage is not practical.        1          2          3         4              5

(Need to include at least 10-15 items to achieve the desired level of validity and reliability.)
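For illustration, a minimal Python sketch (hypothetical responses, with negatively worded items assumed already reverse-scored) of one common internal-consistency check for a multi-item attitude scale, Cronbach's alpha:

import numpy as np

# Rows = respondents, columns = items; 1 = Strongly disagree ... 5 = Strongly agree.
items = np.array([
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 4, 4, 5, 4],
    [3, 3, 2, 3, 3],
    [4, 4, 5, 4, 4],
])

k = items.shape[1]
sum_item_variances = items.var(axis=0, ddof=1).sum()
total_score_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)
print(f"Cronbach's alpha = {alpha:.2f}")  # values around 0.70 or above are often treated as acceptable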
How to Document Behavior
Change?
• You need to understand the behavior
change process for designing evaluation
questions.
Understanding Behavior
Change Process
• Behavior change is a process.
• Prochaska and DiClemente developed a model to
explain the human behavior change process. This
model is called the Transtheoretical Model.
• According to the Transtheoretical Model, there are five stages in the behavior change process.
Prochaska and DiClemente’s
Stages of Change
Stage of Change                                   Characteristics

Pre-contemplation                                 Not currently considering this change:
(I'm not considering this)                        "Ignorance is bliss"

Contemplation                                     Ambivalent about the change:
(I'm considering this)                            "Sitting on the fence"

Preparation                                       Some experience with the change and trying
(I'm doing this sometimes)                        to change: "Testing the waters"

Action                                            Practicing the new behavior or practice
(I'm doing this most of the time)

Maintenance                                       Continued commitment to sustaining the new
(This is now a part of my life)                   behavior or practice

Prochaska, J. O., & DiClemente, C. C. (1994). The Transtheoretical Approach: Crossing Traditional Boundaries of Therapy. Malabar, FL: Krieger Publishing Company.
Evaluation Template
For each of the following practices, please circle the number that best describes your current behavior.

                                                I am not       I am          I am doing   I am doing     I am doing
Practices                                       considering    considering   this         this most of   this all of
                                                this           this          sometimes    the time       the time
1. Drinking fat free or reduced fat milk.           1              2             3             4              5
2. Doing exercise at least 30 minutes/day
   for five days.                                   1              2             3             4              5
3. Eating baked, broiled, or grilled foods
   rather than eating fried foods.                  1              2             3             4              5
How to Collect Impact Data
from Multi-Session Programs
• The “Benchmark Survey” is administered before the Extension program.
• The “End of Program Survey” is administered at the end of the Extension program.
• By comparing the benchmark and end-of-program surveys, you will be able to document changes in participants’ behaviors/practices and skills.
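For illustration, a minimal Python sketch (hypothetical stage-of-change codes, 1-5 as in the evaluation template above) comparing a benchmark survey with an end-of-program survey:

import pandas as pd

benchmark      = pd.Series([1, 2, 2, 3, 2, 1, 3, 4])  # matched participants at benchmark
end_of_program = pd.Series([3, 4, 4, 5, 3, 2, 4, 5])  # the same participants at the end

def share_acting(stages):
    # Share of participants doing the practice most or all of the time (codes 4-5).
    return (stages >= 4).mean()

print(f"Doing this most/all of the time: {share_acting(benchmark):.0%} -> {share_acting(end_of_program):.0%}")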