
Assessing Organizational Context for Implementation Research

Gregory A. Aarons, Ph.D.

University of California, San Diego, Department of Psychiatry. Presented at the Administration for Children and Families Conference: Improving Implementation Research Methods for Behavioral and Social Science. Silver Spring, MD, September 20-21, 2010

Overarching Questions

What aspects of organizational context may be implicated in implementation effectiveness?
How do we better understand the process and outcomes of implementation efforts?
Examples:

– Quantitative
– Qualitative
– Mixed-Methods

Organizational Context of Social Services

Source: Glisson & Schoenwald (2005)

Levels of Change

Four Levels of Change for Assessing Performance Improvement, with assumptions about change at each level:

– Larger System/Environment: reimbursement, legal, and regulatory policies are key
– Organization: structure and strategy are key
– Group/Team: cooperation, coordination, and shared knowledge are key
– Individual: knowledge, skill, and expertise are key

Source: Shortell (2004), Medical Care Research and Review

Intervention Strategies

Evidence Based Practices

Conceptual Model of Implementation Research

Implementation Strategies

Implementation strategies operate across multiple levels: systems environment, organization, group/learning, supervision, providers, and consumers. Implementation outcomes: feasibility, fidelity, reach, acceptability.

Outcomes

– Implementation outcomes: sustainability, uptake, costs
– Service outcomes (*IOM Standards of Care): efficiency, safety, effectiveness, equity, patient-centeredness, timeliness
– Client outcomes: symptoms, functioning, satisfaction, quality of life

Implementation Research Methods

Source: Proctor, Landsverk, Aarons, Chambers, Glisson, Mittman (2009)

Conceptual Model of Implementation Phases and Context Levels

Source: Aarons, Hurlburt, & Horwitz (In Review)

Organizational Culture and Climate

Organizational culture and climate are related but distinct constructs (Aarons & Sawitzky, 2006; Denison, 1996; Schneider, Ehrhart, & Macey, 2010). One review identified 32 definitions of organizational climate and 54 definitions of organizational culture, with some overlap between the two constructs (Verbeke, Volgering, & Hessels, 1998).

Organizational Culture

– “…shared basic assumptions… considered valid and, the correct way to perceive, think, and feel in” an organization (Schein, 2004)
– Behavioral norms and expectations for the ways in which people behave in an organization (Cooke & Rousseau, 1988): “the way things are done here…”
– Organizations with cultures that are more supportive of employees and more adaptable to changes in the environment are more effective (Kotter & Heskett, 1992; Wilderom, Glunk, & Maslowski, 2000)

Competing Values Framework

Source: Kalliath, Bluedorn, & Gillespie (1998), Educational and Psychological Measurement

Organizational Culture Inventory

Source: Human Synergistics, 1987-2007

Organizational Climate

– Employees’ perceptions of the policies, practices, and procedures, and of the kinds of behaviors that get rewarded, supported, and expected in a setting (Schneider, 1990)
– Employees’ perceptions of the psychological impact of the work environment on their own well-being and functioning in the organization (Glisson et al., 2007; Glisson & James, 2002)
– Focused or strategic climates, e.g., diversity climate, service climate, safety climate (Kuenzi & Schminke, 2009; Schneider et al., 2010)

Organizational Social Context

Fig. 1 Confirmatory factor analysis (CFA) of organizational social context (OSC)

Source: Glisson et al. 2007

Quantitative Analyses

Relationship of Organizational Culture and Climate with Attitudes Toward Adopting EBP

Source: Aarons & Sawitzky (2006)

Organizational Climate Partial Mediation of the Effect of Organizational Culture on Work Attitudes and Turnover

Source: Aarons & Sawitzky (2006)

[Path diagram; coefficients shown as teams implementing SafeCare / teams providing services as usual]
– Leader Member Exchange: .84*** / .91***
– Transformational Leadership: .74** / -.11
– (unlabeled path): -.09 / .89***
– Team Climate for Innovation → Attitudes Toward Adopting EBP: .29** / .22

Figure 1. Multigroup Clustered Path Analysis: Association of Transformational Leadership and Leader Member Exchange with Team Climate for Innovation, and Team Climate for Innovation with Staff Attitudes Toward Innovation Adoption During Innovation Implementation compared to Services as Usual. Note: N=140; Teams Implementing SafeCare (n=85) / Teams Providing Services as Usual (n=55); χ²(4)=1.105, p=.894; CFI=1.000, TLI=1.037, RMSEA=0.000, SRMR=0.013; *p<.05, **p<.01, ***p<.001

Source: Aarons & Sommerfeld (In Review)

[Path diagram among Agency Type, Organizational Support for EBP, Attitudes Toward EBP, and EBP Use; coefficients: 0.18*, 0.11**, 0.15*, 0.16*, 0.09 ns]

Figure 3. Path model of partial mediation effects of agency type on organizational support for evidence-based practice and attitudes toward evidence-based practice; effect of organizational support for evidence-based practice on provider attitudes toward evidence-based practice; and effect of organizational support for evidence-based practice on provider use of evidence-based practice. N = 170; AIC = 2437.127, SBIC = 2436.638; *p<0.05, **p<0.01 (one-tailed)

Source: Aarons, Sommerfeld, & Walrath-Greene (2009)
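The slide above reports a partial-mediation path model. As a generic illustration only (synthetic data and hypothetical variable names; not the study's actual data or estimation procedure), the a-path, b-path, and direct c'-path of a simple mediation model can be estimated with two least-squares regressions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 170  # same sample size as the slide; the data themselves are synthetic

# Synthetic variables: x = agency type (binary), m = organizational
# support for EBP (mediator), y = attitudes toward EBP (outcome).
x = rng.integers(0, 2, size=n).astype(float)
m = 0.5 * x + rng.normal(size=n)             # a-path built into the data
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # b-path and direct c'-path

def ols(X, y):
    """Least-squares slopes, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]  # drop the intercept

a = ols(x.reshape(-1, 1), m)[0]               # X -> M
b, c_prime = ols(np.column_stack([m, x]), y)  # M -> Y and direct X -> Y
indirect = a * b                              # mediated (indirect) effect

print(f"a={a:.2f}  b={b:.2f}  c'={c_prime:.2f}  indirect={indirect:.2f}")
```

Partial mediation corresponds to both the indirect effect (a*b) and the direct effect (c') being nonzero; in practice these paths are estimated jointly in path-analysis software with fit indices such as those reported on the slide.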

Qualitative Analyses

Implementation of a Computerized HIV Clinical Reminder (PIs: Goetz, Asch; VA QUERI)

– Screening rates for HIV in the VA are very low
– Question: what type of service improvement can improve HIV testing with appropriate individuals?
– Approach: develop software to identify high-risk veterans and remind providers to recommend an HIV test
– Software assesses the EMR and initiates a reminder regardless of service (no wrong door)
– Ethnographic process evaluation, part of QUERI (Quality Enhancement Research Initiative)

Source: Sobo, Bowman, Aarons, Asch, & Gifford (2008), Human Organization

Implementation of a Computerized HIV Clinical Reminder

Source: Sobo, Bowman, Aarons, Asch, & Gifford (2008), Human Organization

Results: Implementation of a Computerized HIV Clinical Reminder

– Key organizational sub-culture differences in stakeholder agendas (e.g., physician, nursing, information technology, laboratory) affected implementation
– Emergence of strategic communication processes that, despite their immediate utility, sometimes undermined progress and threatened long-term relationships: focus on the local, information reconfigurations, partiality

Source: Sobo, Bowman, Aarons, Asch, & Gifford (2008), Human Organization

Mixed-Methods

– Combines qualitative and quantitative approaches
– Collaboration between quantitative and qualitative researchers during the study design phase
– Open acknowledgement of the philosophical approaches brought to the study by various team members
– Shared willingness to negotiate emerging problems
– Should include mixing of design, analyses, and results

Source: Willging et al. (2007)

Mixed-Methods Research Offers Several Advantages over Single-Method Approaches

– Combine qualitative and quantitative approaches into the research methodology of a single study or multi-phased study
– Simultaneously answer confirmatory and exploratory questions, and therefore verify and generate theory in the same study (Teddlie & Tashakkori, 2003)

Mixed-Methods Study of Statewide EBP Implementation (NIMH; PI: Aarons)

– Implementation of SafeCare® in Oklahoma's statewide children's services system
– Combines exploratory and confirmatory approaches (mixed methods) with equal quantitative and qualitative components
– Longitudinal at the organization/team level
– Requires collaboration and ongoing relationship building and maintenance

Implementation Outcomes Effect of EBP Implementation on Staff Retention

Annualized Turnover by Condition

                        Consultation: Yes   Consultation: No
SafeCare®: Yes                14.9%              41.5%
SafeCare®: No                 33.4%              37.6%

Figure 1. Kaplan-Meier Survival Function Estimates (Retention Probability) by Study Condition. Note: SC/M = participating in SafeCare and fidelity monitoring; SC/Non = participating in SafeCare, but not fidelity monitoring; SAU/M = services as usual and receiving fidelity monitoring; and SAU/Non = services as usual and not receiving fidelity monitoring. N=153.

Source: Aarons, Sommerfeld, et al. (2009), Journal of Consulting and Clinical Psychology
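The retention curves above are Kaplan-Meier survival function estimates. As a minimal sketch of how the product-limit estimator behind such curves works (the time-to-turnover data below are invented for illustration, not the study's data):

```python
def kaplan_meier(observations):
    """Kaplan-Meier product-limit estimator.

    `observations` is a list of (time, event) pairs, where event=1 means
    the staff member left (turnover) and event=0 means censored (still
    retained when observation ended). Returns (time, survival) steps.
    """
    observations = sorted(observations)   # order by event/censoring time
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    i = 0
    while i < len(observations):
        t = observations[i][0]
        events = censored = 0
        # tally everything tied at time t
        while i < len(observations) and observations[i][0] == t:
            if observations[i][1] == 1:
                events += 1
            else:
                censored += 1
            i += 1
        if events > 0:
            survival *= 1 - events / n_at_risk  # product-limit update
            curve.append((t, survival))
        n_at_risk -= events + censored
    return curve

# Illustrative data: months until turnover (event=1) or censoring (event=0)
data = [(3, 1), (5, 0), (7, 1), (8, 0), (12, 0), (12, 1), (15, 0)]
for t, s in kaplan_meier(data):
    print(f"t={t:>2} months  S(t)={s:.3f}")
```

The curve steps down only at observed turnover times, and censored staff simply leave the risk set, which is what makes the estimator suitable for retention data where many staff are still employed at study's end.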

CCM perspective on EBP implementation and turnover

THEME: Having to learn new skills, and dissatisfaction with SC or with being monitored, may have led some of the older CCMs to quit their jobs.

QUOTE: “And the CCM’s that I see having a problem adapting; actually the ones that have the trouble adapting were excellent case managers, but they have a style that’s pretty free flowing and they just aren’t adjusting as well as you would like to see them.”

THEME: Learning skills like SC were motivations to stay with current employers.

QUOTE: “I mean if they don’t all succeed and I never, ever am going to expect that they all succeed, because you have those that are not just to do it and work at it successfully. But when you see the percentage of them that do succeed is so much higher than those that don’t, it really makes it worth it. And that’s the whole goal for me with my families is for them to succeed.”

Agency/program director perspective on EBP implementation and turnover

THEME: Implementation of EBPs helps to recruit and retain new staff.

QUOTE: “…its like any kind of change within, you know, staff. There’s gonna be some resistance. But I feel like, well, as evidenced from our turnover. We have very little turnover here. And, you know, if they weren’t happy, they wouldn’t stay.”

THEME: Learning new skills like SC might inspire CCMs to seek higher paying jobs elsewhere.

QUOTE: “It is helping recruit and retain good staff to recognize that, wow, [agency] is the place where you can go and get trained in the latest evidence-based practices and have good support, good supervision, and that sort of is part of our goal is to be recognized for that”

Contact

e-mail [email protected]

web: http://psychiatry.ucsd.edu/faculty/gaarons.html