A Framework for Quality Indicators in Vocational Rehabilitation

Transcript

SuRGE-5’s User’s Guide:
Survey Design within VR
Performance Management
7th Summit Conference
Louisville, Kentucky
September 2014
Darlene A. G. Groomes, Summit Reading Group Facilitator
Fundamental Group
Karen Carroll, Andrew Clemons, Elaine De Smedt, HarrietAnn Litwin, Matthew Markve, Janice
McFall, Sukyeong Pi, Kellie Scott, Michael Shoemaker
Principal Group
Daniel Frye, Elisabeth Furber, Shelley Hendren, Russ Thelin, Ed Tos
SuRGE: Summit Reading Groups for
Excellence
• 5th Group: Fundamental and Principal Introductions
• Learning Community of Summit Group
• Professional Development and Training: Ten months and three
combined calls
• Represented General, Combined, and Blind VR Programs; TACE;
Institutes of Higher Education; Disability Law & Advocacy
• Represented seven states across the nation
• Read Internet, Mail, and Mixed-Mode Surveys: The Tailored Design
Method by Dillman, Smyth, and Christian (2009)
• Session PowerPoint, User Guide, Interactive Element, and Before/After
Survey will be available at: http://vocational-rehab.com/summitreading-groups/project-development/
Vocational Rehabilitation Performance Management: A
User’s Guide to Survey Design
Principal Group Comments
• Provides VR agency leaders and staff with a resource for improving performance through enhanced surveying methods
• Reviews survey design issues
• Explains different modes used to increase response rates
• Focuses on respondents’ needs
• Goes beyond the book to include technology considerations and address accessibility issues for persons with disabilities
• Aims to yield survey data that are complete, valuable, and reliable
Internet, Mail, and Mixed-Mode
Surveys
• When studying VR systems, surveys have often
been used as a means to collect data.
• Such data has been used for various purposes
• identify patterns
• evaluate performance
• identify areas for improvement
• determine the future needs of individual VR
agencies and the overall VR system.
Survey Evolution
• Significant changes have occurred over the
years with respect to how surveys are
conducted and what modes or types of
surveys are used.
• It is crucial for those conducting surveys to be
aware of the various survey modalities and
what they have to offer.
• Shift in recent years
“Mixed-Mode” vs. “Single-Mode” surveys
“Mixed-Mode” vs. “Single-Mode” Surveys
• No longer assume a one-size-fits-all approach
• Limitations in what a single mode survey may offer, such as
response rate
• Response rate for telephone interviews has decreased
• decline of land lines
• use of Caller ID to screen out phone calls
• cell phone users not wanting minutes charged to their cell
phone bill
• Mixed-mode
• Increased response rates
• Allows individuals to respond to a particular type of survey
based on disability related needs
Specific Types of Survey Modes
• In-Person Interview
The individual is interviewed in person at their home or at another physical location.
• Mail Survey
The respondent is asked to send their response back in a self-addressed stamped envelope.
• Telephone Survey
Can be automated or conducted by live interviewers.
Specific Types of Survey Modes continued
• Computer Assisted Personal Interviewing (CAPI)
Responses are entered into a computer program on a laptop or other small computing device.
• Computer Assisted Self-Interviewing (CASI)
The respondent enters their own responses into a computer or handheld device.
• Email Surveys and Internet Surveys
Sent directly to a respondent’s email address or hosted on a website.
• Fax
Despite the decreased use of fax machines, a fax option remains useful for those who do not feel comfortable responding electronically.
Encouraging Participation
Provide Information about the Survey
• Surveyor clearly identifies who they are, the survey’s purpose,
how the survey will be used, provides contact information, and
offers assistance
• Build a positive relationship – rapport
Ask for Help or Advice
• Demonstrates the value of thoughts and opinions
• Improves survey design and accessibility
• Solicit input or advice from experienced surveyors inside and
outside of VR
Thank Respondents
Appreciate respondents’ time and efforts:
• Thank participants throughout the survey process
• Be considerate of participants’ time
• Follow-up after completion
Support Group Mission and Values
• Clearly state the agency’s mission and values
• Align the survey with mission and values
• Inform respondents how the survey advances agency goals
• Solicit respondents’ input
Make the Questionnaire Interesting
• Consider how wording and context affect interest in the survey
• Explore formatting options which enhance aesthetic appeal
• Accessibility considerations
Use Plain Language
• Avoid acronyms and agency-specific lingo
• Be brief, objective, clear, and precise
• Avoid ambiguous or redundant answer choices
• Consider how respondents may misinterpret questions and answer choices
Sequence Questions Logically
• Test survey to ensure logical flow
• Group questions by content area – avoid revisiting topic areas
• Use numerical or alphabetical order, when possible
Create an Incentive or Reward System
• Offer incentives whenever possible – respondents are offering
time and effort
• If financial resources prohibit offering incentives, share how
respondents’ efforts advance the organization’s mission
Decreasing Barriers
Make Responding Convenient
• Simple - opening an envelope, clicking a link
• Keep surveys concise – respect time
• Test survey across devices, platforms, and populations
Use Respectful and Comprehensible Language
• Be considerate of respondents’:
• Education levels
• Content-area knowledge
• Target a 4th grade reading level (a rough check is sketched after this list)
• Carefully review assumptions and cultural biases
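A rough, unofficial way to check the reading-level guideline above is to compute a Flesch-Kincaid grade estimate for draft question wording. The Python sketch below is a minimal illustration; the function name, the vowel-group syllable heuristic, and the sample question are assumptions for demonstration only.

import re

def estimate_grade_level(text):
    # Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    # Crude syllable heuristic: count vowel groups, at least one per word.
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

draft = "Did we explain your plan to you?"
print(round(estimate_grade_level(draft), 1))  # plain wording scores a low grade level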
Create a Short Questionnaire
• Avoid unnecessary, wordy, or overly detailed questions
• Keep instructions clear and concise
• Completion time: 5-10 minutes
Create a Simple Questionnaire
• Minimize the number of answer choices per item
• Provide simple, essential directions
• Make answering easy (e.g. checkboxes)
Minimize Personal Information Requests
• Maintain confidentiality and anonymity
• Clearly describe procedures for protecting personal information,
if necessary
Pilot Test All Surveys
• Surveys benefit from continuous improvement
• Gather input from colleagues, experts, respondents, and
stakeholders before roll-out
• Improve validity and utility through small group field tests
Visual Design
“There are three responses to a piece of design – yes, no, and WOW! Wow is the one to aim for.”
Milton Glaser
Design Elements
• Words: Need to convey information and unambiguous meaning
• Numbers: sources of order and sequence
• Symbols: used to convey meaning or
relationships that must be understandable to
the respondent
• Graphics: shapes and visual additions that
provide simple or complex information and
meaning to the respondent
Design Properties
• Size: Affects impact and
meaning/IMPORTANCE
• Font: Needs to be legible
• Color and Shading: Determines how elements
stand out and relate to each other
• Location: Impacts legibility and conveys how elements are related to each other (or are not)
Grouping Principles
• Symmetry: Regularity and balance
• Proximity: Elements near each other are perceived as related
• Similarity: Visually similar elements are grouped together
• Connectedness: Connected elements are grouped together
• Common Region: Elements in a closed region are grouped together
• Continuity: The perception that one element leads to the next
• Closure: Incomplete figures tend to be perceived as complete wholes
• Common Fate: Elements perceived as moving in the same direction are grouped together
Visual
Guidelines/Recommendations
1. Use bold text for questions and non-bold text for responses.
2. Use space to delineate groupings
3. Make responses visually neutral
4. Emphasize important elements in the question
(e.g. underlining important phrases)
5. Use a design element in a consistent manner
throughout the survey.
6. Make visual and verbal messages consistent
Example of the last point
• Inconsistent: asks for the best (single) answer, but appears to have two sets of answers, calling for two responses.
vs.
• Consistent: asks for the best (single) answer from a single list of responses.
More Visual
Guidelines/Recommendations
7. Put special instructions in the question rather
than placing them in some other location.
8. Use an alternate visual cue to identify
occasional or special instructions (e.g. italics).
9. Organize the information to be brief and to
the point.
10. Choose line spacing, font and text size to
ensure legibility of the text.
Choosing Your Words
“Words mean more than what is set down on paper. It takes
the human voice to infuse them with shades of deeper
meaning.”
Maya Angelou, I Know Why the Caged Bird Sings
Two Kinds of Questions
• Open-ended questions:
specific answer choices are not
provided
• Closed-ended questions:
specific answer choices are provided
Use Open-Ended Questions:
1. To elicit a numerical response
Guidance: label the unit of response (e.g. days, hours, boxes, etc.); a small validation sketch follows this slide.
2. Request for a list:
List your three favorite cereals
Cereal 1
Cereal 2
Cereal 3
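When the unit of response is labeled, the answer can also be validated automatically. The Python sketch below is a hypothetical validation helper (its name and behavior are assumptions, not part of the guide): it accepts a whole number, strips a known unit label, and flags anything else for follow-up.

def parse_numeric_response(raw, unit="days"):
    # Strip the labeled unit, then require a whole number; otherwise flag for follow-up.
    cleaned = raw.strip().lower().replace(unit, "").strip()
    if not cleaned.isdigit():
        return None
    return int(cleaned), unit

print(parse_numeric_response("14 days"))    # (14, 'days')
print(parse_numeric_response("two weeks"))  # None -> ask the respondent to clarify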
Use Open-Ended Questions
continued
3. When you want detailed, descriptive
information.
Guideline 1: Use judiciously as narrative
responses require individual coding.
Guideline 2: Allow sufficient space for
the respondent to give a full answer.
Indicate if the text box will permit
scrolling.
Use Closed-Ended Questions
1. In a neutral manner:
Rather than “do you agree?” state “do you
agree or disagree?”
2. When using a Nominal Scale (asking to choose
from one or more responses):
Guideline: when asking individuals to rank a series of items, you are more likely to get an accurate response if you individually pair each of the items (e.g. A vs. B; B vs. C; A vs. C); a small scoring sketch follows.
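To illustrate the paired-comparison guideline, the Python sketch below generates every pairing of three items and tallies one respondent's choices into a ranking. The items and choices are hypothetical, and this is only one simple way to score paired comparisons.

from itertools import combinations
from collections import Counter

items = ["A", "B", "C"]
pairs = list(combinations(items, 2))  # ask about A vs. B, A vs. C, B vs. C

# Hypothetical answers from one respondent: the preferred item in each pair.
choices = {("A", "B"): "A", ("A", "C"): "C", ("B", "C"): "C"}

wins = Counter(choices.values())                     # how often each item was chosen
ranking = sorted(items, key=lambda i: wins[i], reverse=True)
print(ranking)  # ['C', 'A', 'B']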
Nominal Scales
continued
• Responses increase when the individual is asked to state yes or no to each item (rather than checking “all that apply”); see the sketch after the example below.
Indicate if you have visited each of these restaurants in the past 6 months:

                YES   NO
Pizza Hut       [ ]   [ ]
Taco Bell       [ ]   [ ]
McDonald’s      [ ]   [ ]
Burger King     [ ]   [ ]
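The Python sketch below contrasts how the two formats record the same answers; the data structures are hypothetical. With “check all that apply,” an unchecked item is ambiguous, while a forced yes/no records an explicit answer for every item.

# Check-all-that-apply: only checked items are captured, so an unchecked
# restaurant could mean "no" or simply "skipped."
check_all = {"Pizza Hut", "McDonald's"}

# Forced-choice yes/no: every item carries an explicit answer.
forced_choice = {
    "Pizza Hut": "yes",
    "Taco Bell": "no",
    "McDonald's": "yes",
    "Burger King": "no",
}

visited = [name for name, answer in forced_choice.items() if answer == "yes"]
print(visited)  # ['Pizza Hut', "McDonald's"]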
Use Closed-Ended Questions
3. Ordinal Scales measure gradation, either starting at “zero” (unipolar) or with a neutral option in the center (bipolar). A numeric-coding sketch follows this slide.

Unipolar: Excellent, Very Good, Good, Fair, Poor, Very Poor
Bipolar: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree
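For analysis, ordinal responses are usually converted to numbers. The Python sketch below shows one common coding as an assumption (the guide does not prescribe specific codes): 1-6 for the unipolar scale and -2 to +2 for the bipolar scale, centered on Neutral.

# Assumed numeric codings; any consistent scheme works.
UNIPOLAR = {"Excellent": 6, "Very Good": 5, "Good": 4,
            "Fair": 3, "Poor": 2, "Very Poor": 1}
BIPOLAR = {"Strongly Agree": 2, "Agree": 1, "Neutral": 0,
           "Disagree": -1, "Strongly Disagree": -2}

responses = ["Agree", "Strongly Agree", "Neutral", "Disagree"]  # hypothetical answers
codes = [BIPOLAR[r] for r in responses]
print(sum(codes) / len(codes))  # 0.5, i.e. slightly toward agreement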
More about Bipolar Scales
• Because respondents relate to the middle value as “average,” the visual midpoint of the scale should fall on the true middle category (Neutral).

Do this: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree, with Don’t Know and No Answer set apart from the scale.

Not this: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree, Don’t Know, No Answer presented as one continuous list, which shifts the visual midpoint away from Neutral.
Technology Considerations
Know your Audience
• Match your survey type to those being surveyed
• Advantage of using mixed mode surveys when there
may be limited access or comfort level with a web
based or electronic format
• Consider how an electronic based survey may have
an impact on confidentiality and response rates
Not all Internet Browsers are Alike
• Older computers may not be equipped to run up-to-date browsers
• Different operating systems are at play (e.g., Windows vs. Apple)
• Consider how surveys may be answered on different devices (e.g., desktop systems, iPhones/iPads, Android smartphones, and tablets)
The Use of Internet-Based Survey Sites
• Survey Gizmo
http://www.surveygizmo.com/
• Survey Monkey
https://www.surveymonkey.com/
• Qualtrics
http://www.qualtrics.com/
Disability Accommodations
Back to Basics: Guidelines for Creating Surveys
• The Rehabilitation Act of 1973 as amended,
Section 508 (U.S. Code, Title 29, Section 794d)
• Surveys given by a VR agency should allow for
individuals with disabilities to have equal
access through the use of reasonable
accommodations and/or alternative formats.
Common Guidelines
• Use systems compatible with screen readers
• Allow users to modify font size, color, etc., and provide text equivalents for every non-text element (see the sketch after this list)
• Create alternative tags for audio features to
allow those with hearing difficulties to read
what is being conveyed in the audio format
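One small, automatable piece of the text-equivalent guideline is checking a web survey page for images that lack alt text. The Python sketch below uses the third-party beautifulsoup4 package; the function name and sample HTML are illustrative assumptions, and a real accessibility review covers far more than alt text.

from bs4 import BeautifulSoup  # third-party package: beautifulsoup4

def images_missing_alt(html):
    # Return the <img> tags that have no alt attribute (or an empty one).
    soup = BeautifulSoup(html, "html.parser")
    return [img for img in soup.find_all("img") if not img.get("alt")]

page = '<form><img src="logo.png"><img src="scale.png" alt="5-point rating scale"></form>'
print(len(images_missing_alt(page)))  # 1 image still needs a text equivalent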
Common Guidelines
continued
• Use systems compatible with speech
recognition software
• Enable page navigation without the use of a mouse, and allow keyboard alternatives for mouse commands
• Allow additional response time for survey questions when needed
Common Alternative Formats or
Accommodations
• Surveys constructed in Braille format
• Telephone relay systems
• The use of readers or interpreters
• Examples of Section 508 Compliance in Internet
surveys:
http://help.surveymonkey.com/articles/en_US/kb/Areyour-surveys-508-compliant-and-accessible
Mixed-Mode Surveys
• The use of several different modes in a survey
procedure
• Examples of several different survey modes
would include phone, on-line, mail and point
of contact surveys
• The content of the survey changes minimally across modes so that data from the several modes used can be analyzed together (a pooling sketch follows this slide)
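Because the content stays the same across modes, responses collected by different modes can be pooled for analysis while still recording which mode produced each response. The Python sketch below uses the third-party pandas package with made-up data; the column names are assumptions for illustration.

import pandas as pd  # third-party package

# Hypothetical response sets, one per mode, to identical questions.
mail = pd.DataFrame({"respondent_id": [1, 2], "satisfaction": [4, 5]})
web = pd.DataFrame({"respondent_id": [3, 4], "satisfaction": [3, 4]})
mail["mode"] = "mail"
web["mode"] = "web"

combined = pd.concat([mail, web], ignore_index=True)    # pooled dataset
print(combined.groupby("mode")["satisfaction"].mean())  # check for mode effects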
Potential Advantages to Using Mixed-Mode Surveys
1. Reduced Cost
2. Improved Timelines
3. Reduced “Coverage” Error
Potential Advantages
continued
4. Improved Response Rate and Reduction
of Non-Response Error
5. Reduced Measurement Error
Types of Mixed-Mode Surveys

Type 1: Use one mode to contact respondents and to encourage response by a different mode.
Motivation: Improve response rates; reduce coverage and nonresponse error.
Limitations: Increased implementation costs.

Type 2: Use a second mode to collect responses from the same respondents for specific questions within a survey.
Motivation: Reduce measurement error; reduce social desirability bias for sensitive questions.
Limitations: Increased design costs; increased nonresponse if the respondent must respond by the other mode at a later time.

Type 3: Use alternative modes for different respondents in the same survey period.
Motivation: Improve response rates; reduce coverage and nonresponse error; reduce survey costs.
Limitations: Increased design costs; measurement error from mode differences that may be confounded with differences among subgroups.

Type 4: Use a different mode to survey the same respondents in a later data collection.
Motivation: Different modes become available to survey respondents; reduce survey costs.
Limitations: Increased design costs; measurement error from mode differences that impacts the ability to measure change over time.
Customer Satisfaction Specifics and
Delivery Methods
Customer Satisfaction Specifics
• Discuss 10 guidelines for improving customer
satisfaction surveys by reducing errors in key areas:
• Sampling methods
• Nonresponse
• Measurement/interpretation
Delivery Methods
• Describe 3 survey methods which enhance accuracy
• In-person appeals with follow-up
• Customer diaries
• Group administration
Customer Satisfaction Specifics
• Creating valid customer satisfaction surveys can be
a challenge; often the individuals designing the
surveys are those being evaluated.
Sampling Methods
1. Randomly sample from the population instead of surveying the entire population, minimizing the number of individuals who are surveyed repeatedly or unnecessarily (a sampling sketch follows this slide).
2. Develop procedures to ensure that onsite sampling is
carefully executed and is not affected by personal
preference.
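A minimal illustration of guideline 1: drawing a random sample rather than surveying everyone. The Python sketch below uses made-up case identifiers; the sample size and seed are arbitrary assumptions.

import random

population = [f"case-{n}" for n in range(1, 501)]  # hypothetical 500 customers served

random.seed(7)                             # fixed seed only to make the example repeatable
sample = random.sample(population, k=50)   # survey 50 of them, drawn at random
print(len(sample), sample[:3])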
Customer Satisfaction Specifics
Nonresponse
3. Actively use follow-up reminders to reduce non-response
error.
4. Provide all respondents with similar amounts and types of
encouragement.
Measurement/Interpretation
5. Avoid encouraging higher ratings when delivering the
survey request.
6. Obtain responses when customers are best able to provide
them.
Customer Satisfaction Specifics
Measurement/Interpretation
7. Choose measurement devices which are credible to
respondents and surveyors.
8. Avoid choosing measurement devices primarily because
of their potential to improve response rates.
9. Ensure scales are balanced and fully describe the
measurement procedure when reporting survey results.
10. Evaluate the impact of using both aural and visual
modes.
Delivery Methods
• Delivery methods enhance or undermine
the validity of survey results
• How you deliver affects what you
receive
• Selecting the appropriate delivery method
reduces sampling, nonresponse, and
measurement errors
• Delivery methods must match desired
feedback
Delivery Methods
In-Person Appeals with Follow-Up Procedures
• “Foot-in-the-door technique”
• Increases validity by decreasing nonresponse error
Best practice implementation in stages:
1. Initial contact and in-person appeal
2. Follow-up contacts with:
a) Reminders and/or thank you messages
b) The survey itself
• Example: US National Park Service
Delivery Methods
Customer Diaries
• Self-reporting behavior during a determined timeframe
• Tracks behavior/use in addition to (or in lieu of)
satisfaction
• Procedural Timeline for Diary Surveys (a date-calculation sketch follows this slide):
a) Initial contact explains purpose, timelines, etc.
b) Advance contact prior to mailing the diary (e.g. 2-4 weeks)
c) Mailing: diary, review of procedures, and gratitude/incentives
d) Reminders to the customer prior to, during, and after the diary period
e) Final contact expresses gratitude, offers incentives, and/or confirms receipt of the completed diary
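The staged timeline above can be turned into concrete contact dates. The Python sketch below is a hypothetical schedule generator; the function name, offsets, and example mailing date are assumptions chosen to roughly match the stages listed.

from datetime import date, timedelta

def diary_contact_schedule(mailing_date, diary_days=7, advance_weeks=3):
    # Rough mapping of the stages above onto calendar dates.
    return {
        "initial contact": mailing_date - timedelta(weeks=advance_weeks + 1),
        "advance contact": mailing_date - timedelta(weeks=advance_weeks),
        "diary mailing": mailing_date,
        "mid-period reminder": mailing_date + timedelta(days=diary_days // 2),
        "final contact": mailing_date + timedelta(days=diary_days + 7),
    }

for step, when in diary_contact_schedule(date(2014, 9, 1)).items():
    print(step, when)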
Delivery Methods
Group Administration
• Ideal when timely feedback is necessary –
when customers’ memories must be fresh
• Guided group interviews can promote
discussion - paper surveys allow anonymity
• Mixed-modes (e.g. paper, discussion,
audio/video recordings) create robust data
• Skilled facilitation is a key priority
For More Information
Darlene Groomes
248-370-4237
[email protected]
Summit Group Website
www.vocational-rehab.com
Fundamental Group
Karen Carroll, Andrew Clemons, Elaine De Smedt, HarrietAnn Litwin,
Matthew Markve, Janice McFall, Sukyeong Pi, Kellie Scott, Michael
Shoemaker
Principal Group
Daniel Frye, Elisabeth Furber, Shelley Hendren, Russ Thelin, Ed Tos
Thank you!