
Building Professional Networks to Support Implementation of Evidence-Based Mental Health Services
Alicia Bunger, Ohio State University
Byron Powell, Washington University in St. Louis
Rochelle Hanson, Medical University of South Carolina
Nathan Doogan, Ohio State University
Yiwen Cao, Ohio State University
Jerry Dunn, University of Missouri-St. Louis
Funding: NIMH (R25 MH080916-01A2; T32 MH019117; F31 MH098478), VA (QUERI)
Purpose
Examine change in professional advice-seeking patterns among mental health clinicians participating in a learning collaborative for implementation.
Learning Collaborative Models
• IHI's Breakthrough Series Learning Collaborative
• Quality Improvement
• Teams from multiple agencies
• Emphasizes shared learning
• Stimulating interactions within & across organizational teams
• Are they effective? How?
  • Mixed evidence (Schouten et al., 2008)
  • "Black box" (Mittman, 2004)
Preparatory Work
• Expert Panel
• Commitment
Active Learning
• In-Person Learning Sessions (3)
• Plan-Do-Study-Act (PDSA) cycles
Supports
• Team Calls
• Web Support
• Quality Improvement Techniques
(IHI, 2003; Nadeem et al., 2013)
Social Networks and Implementation
• LCs may support implementation by building social networks within and across participating agency teams.
• Networks are conduits for technical information and social support.
3 Ways LCs May Build Networks (figure): the LC creates opportunities for interaction between the clinician and three types of advice sources, each offering distinct resources:
• Content Experts – technical info (knowledge/skill), via Learning Sessions and Consultation Calls
• Peers at Home Agency – intra-organizational support, via Learning Sessions and PDSAs
• LC Peers at other agencies – interorganizational support, new ideas, and referrals, via Learning Sessions and Group Calls
Do learning collaboratives "rewire" social networks in a way that supports implementation?
AIMS:
1. Assess change in the composition of clinicians' professional advice networks over the duration of a learning collaborative.
2. Examine how changes in clinician advice-seeking patterns alter the structure of the regional network.
Study Setting
• $2 million regional initiative to implement TF-CBT, funded through a county-based tax levy
• 32 children's behavioral health agencies
• Community-based trainers, certified by NCTSN as TF-CBT therapists
• Enhanced Learning Collaborative Model (the Preparatory Work / Active Learning / Supports structure outlined above)
Enhanced Features
• Coaching Calls
• On-Site Visits
• Local Trainers
• Rostering
Method
Sample
• 132 participants from 32 agencies (with pre & post data)
• 90% of Learning Collaborative completers
• Discipline: Social Work 53%, Counseling 28%, Psychology 12%
• Role: Sr. Leader 10%, Supervisor 22%, Clinician 68%
• Experience: >5 yrs in field 65%, <1 yr in job 43%
Data Collection
• Surveys administered in-person during the 1st & 3rd learning sessions (est. 10 months apart)
Measures
• Column A – Names (ranked): "Who do you turn to for professional advice about youth with trauma histories? Please list their name and organization in the order you would contact them."
• Column B – Communication: "In the past 6 months, how frequently have you communicated or been in contact with this person via in-person contact, telephone, or email? (Circle the most accurate number from the answer scale below for each person.)"
  Answer scale: 1 = Not once; 2 = 1-2 times; 3 = About once/month; 4 = About every 2 weeks; 5 = About once/week; 6 = About daily; 7 = Many times daily
• Respondents nominated up to 5 sources of professional advice
• 422 unique individuals were nominated across both waves of data collection
Analysis
1. Compare composition of professional advice networks
  • Clinician ego-network at LS1 and LS3
  • Calculate and compare exposure (% of ego-net) to each alter type – content experts, peers at the home agency, LC peers at other agencies, private practice, and other – using a paired-samples t-test in Stata 13 (Valente, 2010); a rough R sketch of this step follows this list
2. Compare network structure
  • Visualize
  • Network descriptives (R: sna, igraph)
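The exposure comparison in step 1 is simple to reproduce in principle; the sketch below shows one way it could be done in R. This is a minimal sketch, not the study's Stata code: the file name and column names (clinician_id, wave, alter_type) are assumptions, and only the faculty-expert category is illustrated.

    # Minimal sketch (assumed data layout): exposure = proportion of a clinician's
    # nominations that fall into a given alter type, compared across waves with a
    # paired-samples t-test. File and column names are hypothetical.
    library(dplyr)

    nominations <- read.csv("nominations.csv")   # one row per nomination

    exposure <- nominations %>%
      group_by(clinician_id, wave) %>%
      summarise(expert_exposure = mean(alter_type == "expert"), .groups = "drop")

    # Vectors align by clinician because summarise() sorts by clinician_id, then wave
    ls1 <- exposure$expert_exposure[exposure$wave == "LS1"]
    ls3 <- exposure$expert_exposure[exposure$wave == "LS3"]
    t.test(ls3, ls1, paired = TRUE)              # the study ran this step in Stata 13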
Ego-Net: Size of Professional Advice Networks
Figure (bar chart): mean number of nominations was 3.9 at LS1 and 3.6 at LS3 (out of a possible 5); *t(131) = 2.06, p < .05.
Ego-Net: Composition of Professional Advice Networks
Figure (stacked bars of mean exposure, as a proportion of the ego-net, at LS1 and LS3):
• Experts: 0.05 (LS1) → 0.20 (LS3); **t(131) = 6.60, p < .001
• Peers-Home Agency: 0.72 (LS1) → 0.66 (LS3)
• Peers-Other Agencies: 0.03 (LS1) → 0.05 (LS3)
• Private Practice: 0.06 (LS1) → 0.03 (LS3); **t(131) = -3.24, p < .001
• Other: 0.15 (LS1) → 0.07 (LS3); **t(131) = -3.41, p < .001
Whole Network Structure – LS1 (sociogram)
• Node = person (N = 422)
• Isolate = person with no ties (N = 74)
• Line = nomination/tie (N = 2,487)
• Diamond = faculty expert (N = 5)
(The figure also labels network components.)
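The LS1 sociogram itself cannot be recovered from the transcript, but a figure of this kind could be drawn with igraph in R roughly as follows. This is a sketch under assumptions: the input files and the is_expert flag are hypothetical, and faculty experts are marked by color here rather than the diamond shape used on the slide.

    # Minimal plotting sketch (hypothetical inputs): advice network sociogram
    library(igraph)

    edges  <- read.csv("advice_edges_ls1.csv")   # hypothetical: ego, alter
    people <- read.csv("people.csv")             # hypothetical: name, is_expert (includes isolates)

    g <- graph_from_data_frame(edges, directed = TRUE, vertices = people)

    # Mark the faculty experts (shown as diamonds on the original slide)
    V(g)$color <- ifelse(V(g)$is_expert == 1, "gold", "steelblue")

    plot(g,
         layout = layout_with_fr(g),
         vertex.size = 3,
         vertex.label = NA,
         edge.arrow.size = 0.2)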
Compare Network Structure
(Figure: sociograms of the whole network at Learning Session 1 and Learning Session 3)

                              Learning Session 1   Learning Session 3
N                             422                  422
Isolates                      74                   177
Density                       0.014                0.013
Centralization (in-degree)    0.097                0.182
Clustering (weighted)         0.293                0.356
Reciprocity                   0.164                0.227
Weighted Reciprocity          0.188                0.246
Agency Homophily              89.08                77.72
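As a rough guide to how the descriptives in this table could be computed, the sketch below uses igraph in R (the slides cite both the sna and igraph packages). The input files are hypothetical, the weighted versions of clustering and reciprocity are not reproduced, and "agency homophily" is approximated here as the percentage of ties that stay within the same agency, which may not match the exact measure used in the study.

    # Minimal sketch (hypothetical inputs): whole-network descriptives with igraph
    library(igraph)

    edges  <- read.csv("advice_edges_ls1.csv")   # hypothetical: ego, alter
    people <- read.csv("people.csv")             # hypothetical: name, agency (includes isolates)

    g <- graph_from_data_frame(edges, directed = TRUE, vertices = people)

    sum(degree(g, mode = "all") == 0)            # isolates
    edge_density(g)                              # density
    centr_degree(g, mode = "in")$centralization  # in-degree centralization
    transitivity(g, type = "global")             # (unweighted) clustering
    reciprocity(g)                               # (unweighted) reciprocity

    # Approximate agency homophily: % of ties between people at the same agency
    el <- ends(g, E(g), names = FALSE)           # matrix of vertex indices per tie
    mean(V(g)$agency[el[, 1]] == V(g)$agency[el[, 2]]) * 100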
Limitations
• Generalizability
  • 1 region
• No comparison/control group
  • Was the LC responsible for the network change?
  • Which elements of the LC?
• Measurement validity
  • Self-report measures
• Drop-out/missing data
  • Some participated in only one wave of data collection
  • Drop-out
  • Opt-out
  • Snow-out (winter weather during one LS)
Summary of Findings
Clinician-Level
• Clinicians rely on colleagues at their home agency
• Exposure to faculty experts increased
• Slight reduction in exposure to external sources of advice (perhaps because of coaching + consultation)
Whole Network
• Increased centralization around faculty experts
• Increased reciprocity
Implications
• For Learning Collaborative Organizers
  • Provide additional opportunities for participants to network across organizational boundaries.
• For Policy Makers and Administrators
  • Benefits of local experts/knowledge leaders for scale-up initiatives.
  • Potential for sustainment?
  • Integration of the local service delivery system (in terms of advice sharing)
  • Small changes at the individual clinician level can translate to big changes at the systems level.
Future Research Questions
• Why do professional advice ties change?
  • LC components: LS? Coaching? LS + Coaching?
  • Network dynamics? Readiness for implementation? Supportive climate?
• Do professional advice networks have a role in implementation success?
  • What is the relationship between ego-net composition, position in the network, etc., and implementation fidelity? Treatment outcomes?
• Why do some clinicians/organizations remain disconnected?
  • Initial readiness?
  • Innovation-values fit?
Contact information
Alicia Bunger
[email protected]
References
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health, 38(1), 4-23.
Damschroder, L. J., Aron, D. C., Keith, R. E., et al. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50.
IHI (2003). The Breakthrough Series: IHI's Collaborative Model for Achieving Breakthrough Improvement. Cambridge, MA. Retrieved from http://www.ihi.org/IHI/Results/WhitePapers/
Mittman, B. S. (2004). Creating the evidence base for quality improvement collaboratives. Annals of Internal Medicine, 140(11), 897-901.
Nadeem, E., Olin, S. S., Hill, L. C., Hoagwood, K. E., & Horwitz, S. M. (2013). Understanding the components of quality improvement collaboratives: A systematic literature review. The Milbank Quarterly, 91(2), 354-394.
Powell, B. J., McMillen, J. C., Proctor, E. K., et al. (2011). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69(2), 123-157.
Schouten, L. M. T., Hulscher, M. E. J., van Everdingen, J. J. E., et al. (2008). Evidence for the impact of quality improvement collaboratives: Systematic review. BMJ, 336(7659), 1491-1494.
Valente, T. W. (2010). Social Networks and Health: Models, Methods, and Applications. Oxford University Press.
Acknowledgements
• Missouri Academy of Child Trauma Studies (MoACTS) at
the Child Advocacy Center of Greater St. Louis (UMSL).
• NIMH
• Postdoctoral Traineeship (T32 MH019117) sponsored by UNC-CH & Duke
(Bunger)
• Predoctoral traineeship (F31 MH098478) (Powell)
• NIMH/VA
• Implementation Research Institute (R25 MH080916-01A2) (WUSTL)
(Bunger & Hanson)
• Doris Duke Charitable Foundation
• Fellowship for the Promotion of Child Well-Being (Powell)