Web Usage in a Business Panel Survey
ICES-III, Montreal Canada
June 21, 2007
David Marker and Janice Machado,
Westat (U.S)
[email protected]
Overview
 Terrorism Risk Insurance Program (TRIP) Surveys were
conducted by Westat for the U.S. Department of the
Treasury
 Data needed to report to the U.S. Congress on:
 The effectiveness of TRIP
 Capacity to offer terrorism coverage after TRIP sunsets, including availability & affordability
 Terrorism risk insurance premiums
11/6/2015
2
Overview (contd.)
 Collected panel survey data on property, casualty, and
workers’ compensation insurance from national samples
of 3 types of organizations
 Re-insurers (suppliers of insurance to insurers)
 Insurers (suppliers of insurance)
 50-page questionnaire
 1 month of calendar time to complete
 Insureds/policy holders (purchasers of insurance)
 25-page questionnaire
 1 to 4 hours to complete
Overview (contd.)
 Surveys conducted in 3 waves
 Nov ’03-Feb ’04
 Oct-Dec ’04
 Feb-Mar ’05
 Several versions of the instrument for Waves 2 and 3
 Data collected via multiple modes:
 Web
 Mail
 Facsimile
Topics
 On-line vs. back-end logic checks
 Handling multiple respondents per instrument
 Relative use of hard copy vs. web
 Getting complete information from partial completes
 Analyzing data quality issues during data collection
On-line versus Back-End Checks
 Surveys were returned both via the web and in hard copy
 Returned hard copy surveys were entered into the web instrument by Westat staff
 This ensured that both web and hard copy surveys went through the same edit checks
 However, the Westat staffer, not being the respondent, could not change a response that failed an on-line edit
On-line versus Back-End Checks (contd.)
 If we could think of an edit prior to the start of data
collection, we added it to the on-line web program.
 If we missed an edit, we added it during post-data
collection (back-end) processing
 We added to on-line edits in subsequent waves based on
our experience in earlier waves
On-line versus Back-End Checks (contd.)
 Mostly soft edits were programmed for items
 Respondents were prompted when they entered a
response that failed an edit
 If the response entered on the 2nd attempt failed a soft
edit, the web program simply accepted the respondent’s
2nd entry provided it fell within the hard edit
 Hard edits were strictly enforced: for example, revenue allocated across regions had to add to 100%, or the entry was not accepted
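As a rough sketch of this two-tier flow (the ranges, tolerance, and function names here are illustrative assumptions, not from the actual Westat instrument):

```python
# Illustrative sketch of the soft/hard edit flow described above.
# Ranges, tolerance, and names are assumptions, not the Westat code.

def check_regional_allocation(percentages):
    """Hard edit: revenue allocated across regions must add to 100%."""
    return abs(sum(percentages) - 100.0) < 0.01

def apply_soft_edit(value, soft_range, hard_range):
    """Soft-edit flow: a value outside the soft range triggers a prompt,
    but the 2nd entry is accepted as long as it falls within the hard edit.
    Returns (accepted, prompted)."""
    soft_lo, soft_hi = soft_range
    hard_lo, hard_hi = hard_range
    if not (hard_lo <= value <= hard_hi):
        return (False, True)   # hard edit failure: never accepted
    if soft_lo <= value <= soft_hi:
        return (True, False)   # passes cleanly, no prompt
    return (True, True)        # prompted, then accepted on re-entry
```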
Types of On-line Edits
 Edits included:
 Range edits which specified an upper and lower value
for each item
 Skip edits that jumped the respondent to the
appropriate next question based on the response
provided to an earlier question
 Logic edits that prompted the respondent when a
subsequent response contradicted an earlier response
Final On-Line Edit
 Edits prior to permitting the respondent to exit the survey
 The program checked to see if there were valid, non-missing responses to all “critical” questions in each survey
 If missing or invalid, the questions and responses
were displayed and one final attempt was made to
obtain or correct the information recorded
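The exit check can be sketched as follows, assuming responses are stored keyed by question ID; the missing-value codes and the "critical" list are hypothetical:

```python
# Hypothetical don't-know / refusal codes; the real instrument's codes
# are not given in the presentation.
MISSING_CODES = {None, "", "DK", "REF"}

def incomplete_critical_items(responses, critical_ids):
    """Return the critical questions that still lack a valid, non-missing
    response, so they can be displayed for one final correction attempt
    before the respondent exits the survey."""
    return [qid for qid in critical_ids
            if responses.get(qid) in MISSING_CODES]
```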
Back-End Edits
 These were mainly limited to logic edits found to be
necessary post-data collection
 The edits reviewed responses provided to dependent questions
 If the responses were inconsistent, one or more
responses were set to missing or the organization was
contacted and a new value for the item was obtained
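A minimal sketch of such a back-end logic edit, using a hypothetical coverage/premium pair of dependent questions:

```python
def backend_logic_edit(parent, child):
    """Back-end logic edit (illustrative): if a respondent said the
    organization bought no coverage (parent == 'No') but then reported a
    positive premium (child > 0), the responses are inconsistent.
    Returns the child value, or None (set to missing) on inconsistency,
    so the case can be flagged for follow-up with the organization."""
    if parent == "No" and child and child > 0:
        return None
    return child
```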
Handling Multiple Respondents
 Questionnaires asked for organizational, financial, and
insurance data
 Often one person in the organization could not respond
to all topics
 Allowed multiple respondents to complete the survey
 Recorded names and contact information
 Also recorded one person who could respond to any
follow-up questions
Handling Multiple Respondents (contd.)
 Provided the Web User ID and Password to just one
senior contact in each organization
 Responsible for dissemination (even multiple
locations)
 Ensuring completion and internal QC
 Program allowed a new respondent to jump to the
section they wanted to complete using pre-programmed
tabs
 Program required some sections/questions to be
completed before others
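The section gating described above can be sketched as a prerequisite check; the section names and dependencies here are invented for illustration:

```python
# Illustrative prerequisite map: which sections must be completed before
# each section becomes available (not the actual TRIP instrument layout).
SECTION_PREREQS = {
    "A": set(),
    "B": {"A"},
    "C": {"A"},
    "D": {"B", "C"},
}

def available_tabs(completed):
    """Tabs a new respondent may jump to: sections not yet done whose
    prerequisite sections have all been completed."""
    done = set(completed)
    return sorted(s for s, pre in SECTION_PREREQS.items()
                  if s not in done and pre <= done)
```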
Web vs. Hardcopy Responses
Group        Web     Not Web   %Web
Insured W2   2,051   1,853     53%
Insured W3     510     452     53%
Insurer W2      96      80     55%
Insurer W3      72      38     65%
Note: “Web” includes multiple-mode responses; “Not web” includes mail and fax.
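The %Web column is just web returns divided by total returns; recomputing it from the counts in the table:

```python
# (web, not_web) returns per group, taken from the table above.
counts = {
    "Insured W2": (2051, 1853),
    "Insured W3": (510, 452),
    "Insurer W2": (96, 80),
    "Insurer W3": (72, 38),
}

# Percent of returns that came in via the web, rounded to whole percents.
pct_web = {group: round(100 * web / (web + other))
           for group, (web, other) in counts.items()}
```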
Obtaining Missing Information
 Web surveys permitted respondents to indicate they did
not know or did not want to provide a response to any
question
 Client identified a list of questions that needed a response for the survey to be considered complete
 E-mail and phone communications used to obtain
missing information
Obtaining Missing Information (contd.)
 Respondents were allowed to re-access their completed
survey and enter missing information
 Westat analysts also updated the web survey with retrieved missing information, which activated any on-line web edits
 Back-end edits were then run on the data
 Increased the number of completed surveys
Real-Time Data Quality
 Review item nonresponse from early cases
 Identify very problematic questions
 Reminder email to answer all subparts
 Reduced item nonresponse and need for data retrieval and imputation
Item Response Rates
B2. Please indicate if the statements below describe the organization. Is the organization selected a …

                                                     YES   NO
a. Subsidiary of another U.S. firm?                   1     2
b. Subsidiary of a foreign-owned firm?                1     2
c. Headquarters of your organization in the U.S.?     1     2
d. Something else? (Specify) ___________              1     2
Effect of Email Clarification
Sent to one-fourth of the sample for whom we had email addresses
“When answering question B2, please answer 1 or 2 for
EACH of the four subparts of this question.”
Table 1. Percent of respondents who needed coders to clean up question B2.

        Before email (n=2,269)   After e-mail (n=986)
B2a            10.4%                    6.6%
B2b            11.7%                    6.3%
B2c             2.9%                    1.0%
B2d            34.7%                   23.9%
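One way to summarize Table 1 is the relative drop in the cleaning rate for each subpart:

```python
# % of responses needing coder cleaning (before, after), from Table 1.
rates = {
    "B2a": (10.4, 6.6),
    "B2b": (11.7, 6.3),
    "B2c": (2.9, 1.0),
    "B2d": (34.7, 23.9),
}

# Relative reduction in the cleaning rate, as a whole percent.
relative_drop = {q: round(100 * (before - after) / before)
                 for q, (before, after) in rates.items()}
```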
Updating Skip Patterns
 Respondents edited earlier responses
 Need to update skip patterns in real time to identify newly required items
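Recomputing skips after an edit can be sketched as re-deriving the required-item set from the current responses; the rule format and question IDs here are assumptions:

```python
# Illustrative skip rules: an item is required only when its gate question
# currently has one of the listed answers.
SKIP_RULES = {
    "Q5": ("Q4", {"Yes"}),   # Q5 asked only if Q4 == "Yes"
    "Q6": ("Q4", {"Yes"}),
}

def required_items(responses):
    """Recompute the required-item set in real time, so items that become
    newly required after a respondent edits an earlier answer are caught."""
    required = set()
    for item, (gate, answers) in SKIP_RULES.items():
        if responses.get(gate) in answers:
            required.add(item)
    return required
```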
Prefill Items with Information
from Prior Waves
 Each survey was conducted in 3 waves
 Surveys collected extensive numeric data (financial,
insurance, etc.)
 Respondents changed between waves
 To reduce errors in how values were reported between waves and by different respondents, subsequent waves of the survey showed responses that were provided in earlier waves
Prefill Items with Information
from Prior Waves (contd.)
 Prefills were done only on select items where there was an increased probability of error; for example, one respondent providing numbers in ’000s and another in millions
 Showing a previous response helped improve the quality
of data collected on the subsequent round
 Also allowed respondents to fill in earlier blank cells in
the matrix
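A sketch of the kind of check prefills enable: flagging a new entry that differs from the prior-wave value by a large multiplicative factor, the typical symptom of a ’000s-vs-millions mix-up (the threshold is illustrative):

```python
def units_mismatch(prior_value, new_value, factor=100):
    """Flag entries that differ from the prior wave's value by a large
    multiplicative factor -- the typical symptom of one respondent
    reporting in $000s and another in millions. The factor of 100 is an
    illustrative threshold, not from the actual survey."""
    if not prior_value or not new_value:
        return False   # blank cells can't be compared
    ratio = new_value / prior_value
    return ratio >= factor or ratio <= 1 / factor
```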
Conclusions
 Web provides opportunities and dangers
 Eases burden on respondents, especially in large organizations
 Allows for real-time data quality improvements
 Control over the data is lost after they are initially received from the respondent