Transcript Slide 1

POPULATION RESEARCH SEMINAR SERIES

Sponsored by the Statistics and Survey Methods Core of the U54 Partnership

Administering multisite studies: Pitfalls and pointers

Lee Hargraves, PhD
Center for Survey Research, UMass Boston

POPULATION RESEARCH SEMINAR SERIES

Sponsored by the Statistics and Survey Methods Core of the U54 Partnership

Questions? Comments?

Type them in or ask over your webcam/microphone, or send an email to

[email protected]

Overview

• The advantages of conducting a multi-site study, focusing on behavioral interventions
• Key components of administering the study with an eye on ensuring fidelity
• Some "lessons learned" and guidance to address the issue of "if I only knew then what I know now"

Overview

1. How analysis of a multisite trial sparked interest
2. A foray into conducting a randomized experiment in community health centers
   • 12 health centers, 6 intervention/6 control
3. Building on these experiences, trying it again
   • 2 health centers, randomizing patients
4. Brief introduction to a model of treatment fidelity
5. Q & A

First Exposure to Multisite Studies

• 8 adult and 8 pediatric primary care practices
• Hospitals or health centers
• Internal medicine, family medicine, nurse practitioners
• Cycles of quality assurance feedback
• Measured adherence to guidelines for multiple common conditions
  – Otitis media, low hematocrit, CA screening, and others

Findings: More than "Did performance improve, yes or no?"

• Multiple sites allow for examining change in varying conditions and settings
• After QA intervention, full-time practitioners improved more than part-time practitioners
• Female practitioners were more likely to perform cancer screening among women than male practitioners
• Individual practitioner performance varies from patient to patient and from guideline to guideline
• Practice leaders who participated in QA influenced performance among others in the practice
• Practitioners in larger health centers performed better and improved more than those in smaller centers

The Pros and Cons of Conducting Interventions in Multiple Sites

• The advantages of conducting a multi-site study
  – Increase in sample size, e.g., sometimes it's hard to find people with specific characteristics at one place
  – Organizational size varies
  – Diversity (in both people doing the work and the participants)
  – Location
• The disadvantages of conducting a multi-site study
  – Organizational size varies
  – Diversity (in both people doing the work and the participants)
  – Location

Why Do Sites and Organizations Participate?

• Not for the money
• Generate new knowledge
• Satisfy requirements, e.g., accreditation for clinics
• Recognition
• Guide development of new models, cutting-edge research

EXAMPLE OF A MULTISITE STUDY: USING COMMUNITY HEALTH WORKERS TO REDUCE DISPARITIES IN DIABETES CARE

Funding from Finding Answers: Disparities Research for Change, An Initiative of the Robert Wood Johnson Foundation

Overview of the Project

An evaluation of the effects of enhanced training of community health workers (CHWs) to reduce disparities in care in a low-income, racially and ethnically diverse population served by 12 community health centers (CHCs)

Collaborative Partners

• Community Health Centers in Massachusetts
• Massachusetts League of Community Health Centers
• Central Massachusetts Area Health Education Center & the Outreach Worker Training Institute
• University of Massachusetts Medical School

Community Health Center Patients

• Nearly 40% are insured through Medicaid
• Approximately 40% are low-income uninsured and underinsured
• More than 60% belong to an ethnic, racial, or linguistic minority group
• Fewer than 20% are insured by Medicare or commercial insurance

An Intervention with Community Health Workers

• Curriculum developed for health worker training related to the Care Model and diabetes self-management
• Reviewed and customized for divergent cultural, linguistic, and ethnic groups
• Training sessions totaling 45 hours

Overview

• Random selection among health centers, paired for size and patient characteristics (see the randomization sketch below)
• Selected and trained Community Health Workers, employed by 6 health centers
• Supervision of CHWs
• Structured workbook
• Data collection
  – CHW activities
  – Ongoing data collection using patient registries
  – Qualitative interviews and focus groups
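
As an illustration of the pairing step above, here is a minimal sketch of pair-matched randomization in Python. The center names and sizes are hypothetical, and only a single matching variable is used; the actual study also paired on patient characteristics, which are not detailed on the slide.

```python
import random

# Hypothetical centers with a single matching variable (annual patient volume).
centers = [
    ("CHC-A", 12000), ("CHC-B", 11500), ("CHC-C", 8000), ("CHC-D", 7800),
    ("CHC-E", 15000), ("CHC-F", 14200), ("CHC-G", 5000), ("CHC-H", 5300),
    ("CHC-I", 9900), ("CHC-J", 10200), ("CHC-K", 6400), ("CHC-L", 6100),
]

random.seed(2024)  # fix the seed so the allocation is reproducible

# Sort by size, then treat adjacent centers as matched pairs.
ordered = sorted(centers, key=lambda c: c[1])
pairs = [ordered[i:i + 2] for i in range(0, len(ordered), 2)]

# Within each pair, randomly assign one center to intervention and one to control.
for a, b in pairs:
    intervention, control = random.sample([a, b], 2)
    print(f"pair {a[0]}/{b[0]}: intervention={intervention[0]}, control={control[0]}")
```

With 12 centers, this yields the 6 intervention/6 control split described earlier.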

CHW Study Design & Measurement

• Compare sites with trained CHWs pre/post intervention to control community health centers without CHWs (see the sketch after this list)
• Measures included clinical data (HbA1c, BP) and encounter data (patients working on goals, # of visits)
• Assess qualitative data via focus groups and intensive interviews in all CHCs
  – Focus groups with CHWs
  – Focus groups with supervisors
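
To make the pre/post-versus-control comparison concrete, here is a minimal sketch of a difference-in-differences style summary of HbA1c using pandas. The column names and values are made up for illustration; the slide does not describe the study's actual analysis.

```python
import pandas as pd

# Hypothetical patient-level data: HbA1c measured before and after the intervention period.
df = pd.DataFrame({
    "center":     ["CHC-A", "CHC-A", "CHC-C", "CHC-C", "CHC-B", "CHC-B", "CHC-D", "CHC-D"],
    "arm":        ["intervention"] * 4 + ["control"] * 4,
    "hba1c_pre":  [9.1, 8.7, 9.4, 8.9, 9.0, 8.8, 9.3, 9.2],
    "hba1c_post": [8.2, 8.1, 8.8, 8.5, 8.9, 8.7, 9.1, 9.0],
})

# Pre-to-post change per patient, then the mean change in each arm.
df["change"] = df["hba1c_post"] - df["hba1c_pre"]
by_arm = df.groupby("arm")["change"].mean()

print(by_arm)
print("difference-in-differences estimate:", by_arm["intervention"] - by_arm["control"])
```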

Training CHWs

• Developed a Community Health Worker curriculum (Outreach Workers Training Institute, Worcester)
• Developed a curriculum to train supervisors of Community Health Workers
• Convened a CHW training advisory group to review materials
• Quarterly training sessions, using case studies

Multiple Data Sources

• PECS, the Patient Electronic Care System
  – Disease management registry
  – It was free
  – Used to enter, store, and report clinical data
  – Did not test how to export to a data file for analysis
• CHW Encounter Forms
• Patient Survey
• Community Health Center Monthly Narrative Reports
• Site Visits

Using Data Collection Tools to Guide Structured Interventions

Lemay, C. A., Ferguson, W. J., & Hargraves, J. L. (2012). Community health worker encounter forms: a tool to guide and document patient visits and worker performance. American Journal of Public Health, 102(7), e70-e75.

Unexpected Events

• A health center needed to drop out
• Health workers were often pulled in other directions
• Monitoring was more necessary than planned
• The data collection system was not built for research

Lessons Learned

• PECS was not designed for research
  – Disease management registry
  – Systems on site (e.g., EHRs) are not designed for research
• Training was key; continuing education was needed
• CHWs were part-time on the project
  – Valuable assets of health centers
  – Needed to monitor activity on the study
• Focus groups: inadequate preparation of health centers, i.e., many didn't know the study was there
• Encounter data were essential

The Takeaway

• Employing CHC staff was necessary to get buy-in, yet hard to manage
• Need to do a better job of working with sites, i.e., meet with staff, work with management
• CHWs need more extensive training in how to work with patients
• Structured encounters were the bare minimum
  – Planned activities were helpful, but specifics are needed for ongoing visits
• Test your data entry beyond "getting a report"
• Need better planning for unexpected outcomes

Why Not Try Another Multisite Study?

Community Health Workers Using Patient Stories to Support Hypertension Management

Funding from the National Institute on Minority Health and Health Disparities (NIMHD)

http://cheir.org/projects/projects-topic-area/community-health-workers

CHWs, Trained in Motivational Interviewing Using Storytelling

• Two health centers
• Enhanced training in motivational interviewing
• Delivery of support tools (videos of patients telling how they manage hypertension)
• Better data collection strategies

Motivational Interviewing Skills and Techniques

An empathic, gentle, and skillful style of counseling that helps practitioners have productive conversations with individuals with co-occurring and other disorders.

Essential characteristics of motivational interviewing include:

• Expressing empathy through reflective listening.
• Noting discrepancies between current and desired behavior.
• Avoiding argumentation and rolling with resistance.
• Encouraging the consumer's belief that he or she has the ability to change.
• Communicating respect for and acceptance of people and their feelings.
• Establishing a nonjudgmental, collaborative relationship.
• Being a supportive and knowledgeable consultant.
• Complimenting rather than denigrating.
• Listening rather than telling.
• Gently persuading, with the understanding that change is up to the person.
• Providing support throughout the process.

Source: http://www.samhsa.gov/co-occurring/topics/training/skills.aspx

Applying Lessons Learned from Finding Answers CHW Study

• Manageable number of sites
  – Allows for more contact with staff, including providers
• More training in working with patients, specifically in motivational interviewing (earlier training was inadequate)
• Build in monitoring and supervision
• Use better technology for data collection and management

Motivational Interviewing, Key Intervention Component

• Cannot be taught in a couple of hours, e.g., from a website
• Is more difficult for some who have extensive training in other techniques
• Needs to be learned through doing
• Practitioners need feedback, coaching
• How do you know if it's done right?

Extensive Training in MI

More Intense Training in Motivational Interviewing

• Ensuring fidelity in Motivational Interviewing
• Training, extensive with role modeling, teaching, observing, and feedback
• Evaluation using "standardized patients"
• Monitoring, recording, and observing patient visits
  – Scoring when health workers employ techniques consistent with MI
  – Coaching to avoid telling patients what they should do…

Tools to Standardize Both Delivery and Collection of Data

• REDCap: Research Electronic Data Capture
• Lives on the internet, via a collaborative
• Available wherever there's WiFi
• Design forms for data collection
• Randomize participants
• Implement surveys
• Scheduling function
• Export data in files that make sense (see the export sketch below)
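
As one concrete example of exporting data "in files that make sense," the sketch below pulls records from a REDCap project through its web API using the standard export-records payload. The URL and token are placeholders that a REDCap administrator would supply; this is a minimal sketch, not the project's actual workflow.

```python
import requests

# Placeholder project URL and API token for your institution's REDCap instance.
REDCAP_URL = "https://redcap.example.edu/api/"
API_TOKEN = "YOUR_PROJECT_API_TOKEN"

payload = {
    "token": API_TOKEN,
    "content": "record",  # export records from the project
    "format": "json",     # "csv" also works for spreadsheet-style analysis
    "type": "flat",       # one row per record
}

response = requests.post(REDCAP_URL, data=payload, timeout=30)
response.raise_for_status()

records = response.json()
print(f"Exported {len(records)} records")
```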

Online Videos and Training Materials

How REDCap is Used to…

• Randomize patients for immediate or delayed entry
• REDCap and patient support tools on tablet PCs in community health centers (video on Vimeo)
• Streamlined Electronic Medical Record abstraction
• Online surveys to assess health behaviors, literacy, trust, and demographic information, pre/post
• And, provide a timetable for repeat contact with patients
• And there's more…provide a log of user activities

Guiding Evaluation of Interventions

“Researchers and practitioners are typically most interested in the outcomes or results of their study or program. If they have thoughtfully designed the intervention, it is likely that theory has guided the effort. For example, changes in colorectal cancer screening behavior are more likely to occur when there are subsequent changes in knowledge, attitudes, perceived susceptibility to colon cancer, and reduced barriers to obtaining screening”. Source: Glasgow R.E. & Linnan L.A. “Evaluation of Theory-Based Interventions” in Glanz, K., Rimer, B. K., & Viswanath, K. (Eds.). (2008). Health behavior and health education: theory, research, and practice. John Wiley & Sons, 487-508.

Resource to Guide Evaluation

www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide

What is Being Evaluated?

Source: Kellogg, W. K. (2004). Logic model development guide. Michigan: WK Kellogg Foundation.

Example of a Logic Model to Guide Design, Plan Evaluation

Source: Glasgow R.E. & Linnan L.A. "Evaluation of Theory-Based Interventions," Ch. 21 in Glanz, K., Rimer, B. K., & Viswanath, K. (Eds.). (2008). Health behavior and health education: theory, research, and practice. John Wiley & Sons.

Lessons Learned

• Use logic models to plan evaluation
• Build data collection into the fabric of the study
  – You may not know what you need to measure (e.g., hours, time, etc.). Therefore, within reason, try to measure comprehensively
• Use enrollment charts (e.g., targeted enrollment/week and actual enrollment); see the sketch below
• Encounter forms are nice, but a cloud-based system is better, e.g., REDCap
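
A minimal sketch of the enrollment chart idea above: plot targeted versus actual cumulative enrollment by week with matplotlib. The weekly targets and counts are made-up numbers for illustration only.

```python
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
target_per_week = 10                                          # hypothetical recruitment target
actual_per_week = [8, 6, 11, 9, 7, 10, 12, 5, 9, 8, 10, 7]    # hypothetical actual counts

# Cumulative totals make shortfalls easier to spot than weekly counts alone.
cum_target = [target_per_week * w for w in weeks]
cum_actual = [sum(actual_per_week[:w]) for w in weeks]

plt.plot(weeks, cum_target, linestyle="--", label="Targeted enrollment")
plt.plot(weeks, cum_actual, marker="o", label="Actual enrollment")
plt.xlabel("Week")
plt.ylabel("Cumulative participants")
plt.title("Targeted vs. actual enrollment")
plt.legend()
plt.savefig("enrollment_chart.png")
```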

Lessons Learned

• Oversight and monitoring, at every step
• Periodic timesheets to record activities
• Keep logs of work effort; enthusiasm for the study waxes and wanes
  – Periodic conference calls
  – Training seminars for staff, especially those delivering interventions
• Ensuring fidelity at each step
• Count on personnel turnover

On Fidelity

Large literature, e.g.:

Bellg, A. J., Borrelli, B., Resnick, B., Hecht, J., Minicucci, D. S., Ory, M., et al. (2004). Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychology, 23(5), 443.

Urging fidelity assessment in the …

1. Design of study
2. Training
3. Delivery of treatment
4. Receipt of treatment
5. Enactment of skills

How to Infuse Fidelity

Spillane et al. recommend many varied treatment fidelity procedures to improve the reliability and validity of research, including:

• Use of manuals and standardized guidelines for intervention providers.
• Researcher observations through video/audiotaped observation of the intervention (e.g., a consultation).
• Self-report evaluation forms/checklists completed after consultations by those delivering the intervention.
• Quality assurance checks completed after the consultations by the researcher.
• Researcher field notes regarding the study intervention (e.g., requests for additional assistance with intervention implementation and records of all phone and face-to-face contact between interventionists and researchers).
• Case conferences that provide a forum for discussion of the intervention.
• Questionnaire surveys/focus groups/interviews with intervention providers and recipients.
• Monitoring and testing study participants' acquisition of treatment skills.

Source: Spillane, V., Byrne, M. C., Byrne, M., et al. (2007). Monitoring treatment fidelity in a randomized controlled trial of a complex intervention. Journal of Advanced Nursing, 60(3), 343-352.

Finally, Why Multisite?

It Gets Closer to Effectiveness Research

[Figure: continuum of research settings, from the laboratory to "the real world"]

Source: From Greenwald and Cullen (1985) as summarized in Glasgow, R. E., & Linnan, L. A. (2008). Evaluation of theory-based interventions. Health Behavior and Health Education: Theory, Research, and Practice. San Francisco, CA: Jossey-Bass, 487-508.

Model of Phases of Research

Source: Greenwald, P., & Cullen, J. W. (1985). The new emphasis in cancer control. Journal of the National Cancer Institute, 74(3), 543-551.

Good luck

• Others willing to share experiences
• Q & A

POPULATION RESEARCH SEMINAR SERIES

Sponsored by the Statistics and Survey Methods Core of the U54 Partnership

Questions? Comments?

Type them in or ask over your webcam/microphone, or send an email to

[email protected]