Transcript Slide 1

Collecting Community-Level Survey Data:
Lessons Learned from Trial & Error
Liz Lilliott, Ph.D.
National Prevention Network Meeting
September 2009
Introduction
• Brief background on the SPF SIG
in NM
• Community Survey – years 1-4
• Lessons learned
Logic Model
• Substance-Related Consequences: high rate of alcohol-related motor vehicle crashes and fatalities (special emphasis on underage youth)
• Substance Use: underage alcohol use, binge drinking, drinking & driving
• Intervening Variables: easy RETAIL access to alcohol for underage youth, easy social access to alcohol, low enforcement of alcohol laws, low perceived risk of alcohol use and of drinking and driving, social norms
Intervening Variables vs.
Contributing Factors
• We consider intervening variables (IVs) to be a
broad category of predictors or correlates, in
statistical terms similar to a factor that is made up
of multiple associated measures.
• The IV comprises potentially multiple contributing factors (CFs) that explain why that IV is important to address.
• While each community may address retail access to alcohol for youth, the contributing factors explaining why it is a problem in a given community may differ, meaning their prevention strategies may also differ.
For Example
Intervening Variable: Easy RETAIL access to alcohol for underage youth
Contributing Factors → Strategies:
• Alcohol retailers do not consistently check IDs → Responsible retailer training
• Underage youth ask strangers to buy them alcohol and they comply → Shoulder taps
• There is no enforcement of laws prohibiting selling alcohol to minors or providing alcohol to minors → Greater law enforcement efforts to enforce laws; greater pressure on judicial officers to enforce consequences
Other Examples of CFs being
addressed:
• Sales of alcohol to intoxicated adults
• Minors obtaining alcohol from friends, family, etc.
• Underage parties
• Support of law enforcement efforts to reduce DWI and
enforcement of aiding & abetting laws
• Norm that underage drinking is a “rite of passage”
• Low perceived risk of being caught providing alcohol to a minor, or of being caught, arrested, etc., for DWI
• Lack of judicial follow through on DWI arrests
Sources of IV Data on CFs
• Archival data such as:
• Court records
• Arrest data
• Citation data
• Data on sobriety checkpoints conducted
• BRFSS
• NSDUH
• YRRS (YRBSS)
• Primary data collection:
• Community Questionnaire
Why do we need the community
survey if we already have archival data?
• Of the BRFSS, the NSDUH, & the YRBSS, only the
BRFSS is conducted every year
• The lag time to access these data is considerable (typically 12 months or more)
• These data are not sufficient to measure change at
the community level. In NM some communities are
counties, but many are smaller than counties, such
as tribal lands, towns, or even neighborhoods in a
city
• Do not include measures of all contributing factors
Goals of conducting a community survey
• To be able to say something definitive about change in CFs & consumption measures at the state level and at the community level, and, if at all possible, to attribute that change to the prevention interventions implemented.
• Therefore, our additional goals were to have sample sizes large enough at the community & state levels to measure change, and to have the samples be representative of the communities.
The Community Questionnaire
• Same survey used in all 15 SPF SIG communities &
non-SPF SIG communities for comparison
• Includes measures of those contributing factors for
which we do not have archival data at the
community level
• Includes the National Outcome Measures (NOMs)
required by CSAP including measures from the
BRFSS & NSDUH & YRBSS
SPF SIG Community Questionnaire
Sources of IV Data
• Social access for minors
• Where did they get alcohol
• Where did they drink alcohol
• Attendance at keg parties
• Perception of risk
• How likely police are to:
 • Break up parties where teens are drinking
 • Catch, arrest, or convict you for drinking and driving
• Norms
• Support for law enforcement efforts
• Exposure to media messages about efforts
• How harmful is drinking too much
Some ideals (assumptions) we had
for the survey process going in:
• the sample would be large enough at the community level to be used by communities for needs assessment & evaluation purposes as well as at the state level
• the sample would be random & representative of the communities
• the comparison communities would be matched to the SPF SIG communities for a stronger design
• we’d get good baseline data
• we’d use the same survey method every time
because…
• we’d be successful the first time
What actually happened
The first attempt:
• 2006-2007 (Interventions began in 2006-2007)
• This was to be our baseline data collection
• Survey targeting 18 to 25 year olds in SPF SIG & non-SPF SIG communities (not matched)
• Phone interviews using random digit dialing (RDD)
• 398 questionnaires were completed
• Cost: $60K
• Average age: 20.9
The first attempt:
The pluses:
• No burden of cost or time on communities
• No burden on the evaluators
The drawbacks:
• Not a representative sample
• Not a large enough sample to be useful to
communities or to the state
• Cost: $151 per completed survey (see the quick check below)
• Method not appropriate for the target age group or for the communities’ cultural characteristics
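As a quick check of the cost figure above, a minimal arithmetic sketch using only the numbers reported on these slides ($60K total, 398 completed questionnaires):

```python
# Quick check of the per-survey cost for the first (RDD phone) attempt.
# Figures come from the slides above: $60K total cost, 398 completed questionnaires.
total_cost_usd = 60_000
completed_surveys = 398

cost_per_completed = total_cost_usd / completed_surveys
print(f"Cost per completed survey: ${cost_per_completed:,.2f}")  # ~$150.75, i.e. about $151
```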
What actually happened
The second attempt:
• 2007-2008
• No money to conduct a phone survey
• Had to get communities involved
• With the State Epi Workgroup, we redesigned the survey, changing some questions and making it fit a written format
• Survey targeting 18 to 65 year olds in SPF SIG & non-SPF SIG communities (not matched)
• Tried to get a more random/representative sample by recruiting at MVD offices in communities for an on-line survey or a phone survey
• Had one open-ended question
• Average age for SPF SIG: 36.2 years; n = 2954
The Recruitment Process
• Received permission & support from the Director of
MVD to recruit at state run MVD offices
• Letter sent to MVD office supervisors asking them to
cooperate with local prevention folks to recruit
• Trained preventionists to train the MVD staff on how to recruit
• Asked clients to complete a card indicating that they wished to be contacted by email or phone to complete the survey; clients provided a first name and an email address or phone number. These cards were sent to PIRE on a weekly basis.
• Invitation emails were sent, phone calls were made
• Reminders were sent
• Incentives for MVD staff, incentives at the MVD,
incentives for completing the survey
The second attempt:
The pluses:
• We increased our overall sample size considerably
• Improved our representativeness in those communities where it
actually worked.
• Local communities partnering with MVDs created prevention
allies
• Gave communities an appreciation (understanding) of data
gathering and what’s involved
• Responses to the open ended questions were powerful
The drawbacks:
• Not a representative sample in most communities
• Not a large enough sample in most communities to be useful
• Method not feasible for communities without MVD offices; not all MVD offices were willing to participate
• Very labor & time intensive and complicated; if one link broke, the whole process broke down
What actually happened
The third attempt:
• 2008-2009
• Had to get communities involved but had to make it
simpler if we were to survive
• Survey targeting 18 & over in SPF SIG & non-SPF
SIG communities (not matched)
• Placed greater emphasis on face-to-face surveying
• Recommended recruitment strategies to increase
representativeness and decrease bias of the sample
but knew this was unlikely
• Eliminated phone survey completely
• Internet survey recruitment card provided a direct link
to the survey
• Average age for SPF SIG: 39.2; n = 7011
The Recruitment Process
• We asked the programs to place themselves in 1 of 5 groups based on how successful they were the year before
• Recommended locations for them to recruit
• As part of the planning process, programs created
community specific data collection protocols for
completing paper &/or internet surveys
• Provided a target # of completed surveys for each
community
• Provided detailed training & documentation for
communities of data collection protocols, roles,
responsibilities, etc.
The third attempt:
The pluses:
• We increased our overall sample size considerably
• Improved our within community sample sizes
• Local communities partnering with local businesses &
stakeholders strengthened prevention allies
• Communities were successful & empowered, making future data collection more sustainable
• Good cooperation between entities (state, evaluators,
prevention providers)
• More culturally appropriate
The drawbacks:
• Still time consuming & labor intensive for communities &
evaluators, but better results
• Can be expensive for programs, especially in staff hours & travel
• Sacrificed representativeness for larger sample sizes
The fourth attempt:
• Will take place February – March, 2010
• Keep everything the same as last year
• This fall we will revisit local level data collection protocols
and communities will revise as needed
• We will re-train everyone again on recruitment protocols
• We will spend more time working with comparison
communities in particular and monitoring their progress
• Try to get MVD & electric company to recruit through their
correspondence
The many lessons learned
Planning:
• Easily ¾ of your effort will go into planning, training, & monitoring the data collection process
• It is critical to have a global plan (state level) as well as local plans (community level) for how data collection will take place
• Acts of God will happen, but you can try to plan for some problems; consider weather, school schedules, holidays, etc. that may affect your data and/or data collection
• Keep the plans as simple as possible & eliminate
bureaucracy when you can
• Get permissions & approvals early!
The many lessons learned
Planning:
• Use/build connections & collaborate whenever possible
• Find volunteers to help; just make sure they are well trained
• Community-level buy-in is critical; do whatever it takes to get it (e.g., in Native American communities)
• It can be difficult for local staff to understand the
importance of data collection; create that big picture for
them
• Find your extroverts to help with recruiting (responsible
ones)
• Be as culturally sensitive as you can be without completely
compromising the process
• At the local level, try not to have staff dual task. Staff
responsible for data collection need to focus on just that.
The many lessons learned
Planning:
• Provide a community specific target or goal for completed
surveys
• Create an incentive or reward system to keep staff motivated. This can be as simple as a chart that tracks progress toward the goal (see the sketch after this list)
• Establish roles & responsibilities for those at the state level
& local level at the very beginning
• Define resources used to finance data collection
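To illustrate the kind of target-and-progress tracking described above, here is a minimal sketch; the community names, counts, and the 50% threshold are hypothetical, not actual SPF SIG figures:

```python
# Hypothetical progress tracker: completed surveys vs. community-specific targets.
# All names and numbers below are illustrative only.
targets = {"Community A": 400, "Community B": 250, "Community C": 600}
completed = {"Community A": 180, "Community B": 240, "Community C": 75}

for community, target in targets.items():
    done = completed.get(community, 0)
    pct = 100 * done / target
    # The 50% cutoff is arbitrary; tie it to your data collection timeline.
    status = "on track" if pct >= 50 else "needs follow-up"
    print(f"{community}: {done}/{target} completed ({pct:.0f}%) - {status}")
```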
The many lessons learned
Data Collection:
• Monitor progress toward reaching community target goals
& state target goals
• Don’t delay in beginning data collection; it will always take longer than you think
• Follow the plan, but if it’s not working, revise it so it does
and keep that revision for next time
• Don’t use people who haven’t been directly trained, or underage youth, unless someone is overseeing them directly; they need to be knowledgeable about the process & the survey itself
• Always have a consent form/explanation document to
provide to participants
The many lessons learned
Data Collection:
• It’s hard to overcome our biases when approaching people to participate, but we absolutely must; give recruiters strategies for recruiting participants so the sample is more representative.
• Incentives should be culturally appropriate and not
coercive
• Often local establishments will donate small incentives if
asked.
• Protect anonymity of respondents
The many lessons learned
Data processing & distribution
• Data entry folks should be trained ahead of time, but there are still likely to be data entry errors, so cleaning the data is very important (see the sketch at the end of this slide).
• The main incentive for a community to participate in data
collection is to get data that will be useful in planning
community level interventions. Therefore, getting the data
to communities is very important.
• You can do this in several ways. One is to provide the data to them; this is fine if there is someone locally who can analyze the data and present it.
• Alternatively, you can create presentations, or provide slides, graphs, & interpretation for them to use in presentations to their stakeholders or when writing grants, reports, etc. Make it user friendly.
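As a minimal sketch of the kind of cleaning pass mentioned above; the file name, column names ("respondent_id", "age"), and valid age range are assumptions for illustration, not the actual questionnaire layout:

```python
import csv

# Hypothetical cleaning pass over keyed-in survey records:
# drops duplicate respondent IDs and ages outside a plausible range.
def clean_records(path):
    seen_ids = set()
    clean = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rid = row.get("respondent_id", "").strip()
            if not rid or rid in seen_ids:   # missing or duplicate ID -> likely entry error
                continue
            seen_ids.add(rid)
            try:
                age = int(row["age"])
            except (KeyError, ValueError):   # non-numeric or missing age
                continue
            if not 18 <= age <= 110:         # survey targets adults 18 & over
                continue
            clean.append(row)
    return clean

# Example (hypothetical file name): records = clean_records("community_survey.csv")
```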
The BIG lessons learned
• Our success has grown as we’ve become more culturally
competent and worked with communities; therefore, keep
a good balance between flexibility and direction.
• You can’t please everyone, but try to be accommodating when you can.
• Which goals are most important if you have to sacrifice
something?
• In the spirit of community based participatory research,
community involvement in the planning process from the
beginning is important. It may take longer, but it means
there’s ownership of the process and a desire for it to be
successful.
The BIG lessons learned
• Transparency of how decisions are made is important.
Ideally decisions are not top down.
• Do not underestimate the importance of piloting the
survey & the data collection process.
• Help communities to understand how to use the data for
needs assessment, planning and implementation and
not just evaluation.
• Use data in media or social marketing campaigns
• To encourage law enforcement to increase
enforcement
• To create buy-in for prevention efforts from local
authorities
• For use in Local Epi Workgroups
• www.nmprevention.com
• Under New Mexico SPF SIG - Project Documents
• Contact Information:
Liz Lilliott, Ph.D.
PIRE
[email protected]
505-765-2330
Martha Waller, Ph.D.
PIRE
[email protected]